
Monday, April 01, 2024

AI Will Pick Nobel Prize Winners in Physics

Please read the article carefully before you freak out. Hint: look at the date.

Zz.

Friday, March 29, 2024

My Favorite Web Application - Part 8

Previous posts:

My favorite web applications - Part 1

My favorite web applications - Part 2

My favorite web applications - Part 3

My favorite web applications - Part 4

My favorite web applications - Part 5

My favorite web application - Part 6

My favorite web application - Part 7

This is another one of my favorite web applications because it has the ability to assign random values to the various parameters in the problem.

This is a simulation of a motional emf in the form of a rail gun. It actually is a straight-forward application of magnetic force acting on a straight current. One may also solve this using Faraday's law, but it is not as straight-forward to solve because the magnetic flux (or rather, the area) does not change uniformly since the rod is accelerating.

What I also like about this simulation is that one can tie it in with what the students learned in Physics 1, i.e. they may verify their answer using kinematics, since we know the rod's mass and that it starts from rest. Knowing how far it travels and a good estimate of the travel time gives us the acceleration, and thus the force acting on the rod. This should match the magnetic force.
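For anyone who wants a quick numerical feel for this cross-check, here is a minimal sketch in Python. All of the numbers are made-up placeholders; the simulation assigns its own random values, so treat these purely as an illustration of the comparison.

```python
# Minimal sketch of the kinematics cross-check described above.
# All numbers are made-up placeholders; the simulation assigns its own random values.
B = 0.50      # magnetic field (T), assumed
I = 4.0       # current through the rod (A), assumed
L = 0.20      # rod length / rail separation (m), assumed
m = 0.050     # rod mass (kg), assumed

# Magnetic force on a straight current-carrying rod
F_magnetic = B * I * L

# Kinematics check: the rod starts from rest and travels a distance d in time t
d = 1.5       # distance traveled (m), read off the simulation
t = 0.61      # travel time (s), estimated from the simulation

a = 2 * d / t**2          # from d = (1/2) a t^2 with v0 = 0
F_kinematic = m * a

print(f"F from BIL:        {F_magnetic:.3f} N")
print(f"F from kinematics: {F_kinematic:.3f} N")
```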

Zz.

Wednesday, April 05, 2023

ChatGPT and Projectile Motion, Part 2

If you missed my previous physics questions to ChatGPT, check them out here and here.

In this particular case, I asked ChatGPT a rather common question, and ended up having an argument with it. My questions are in red while ChatGPT's responses are in blue.

Three projectiles of the same mass are launched from the top of a cliff with the same initial speed. The first projectile is launched at an angle of 20 degrees above the horizontal. The second is launched horizontally. The third is launched at an angle 20 degrees below the horizontal. Which projectile hits the ground with the largest speed?


All three projectiles are launched with the same initial speed, which means they all have the same initial kinetic energy. The only difference is the angle at which they are launched.


The maximum range of a projectile launched at a given speed is achieved when it is launched at an angle of 45 degrees above the horizontal. This means that the first and third projectiles will have the same range, but the second projectile, launched horizontally, will have a shorter range.
However, the question asks which projectile hits the ground with the largest speed, not which has the greatest range. To answer this question, we need to consider the vertical component of each projectile's velocity at impact.


The first projectile is launched at an angle of 20 degrees above the horizontal, so it will have a significant vertical component of velocity at impact, causing it to hit the ground with a larger speed than the third projectile, which is launched at an angle of 20 degrees below the horizontal and will have a smaller vertical component of velocity at impact.


The second projectile, launched horizontally, will have no vertical component of velocity at impact, so it will hit the ground with the smallest speed of the three projectiles.


Therefore, the first projectile, launched at an angle of 20 degrees above the horizontal, will hit the ground with the largest speed.

It started off so well, realizing that all three have the same KE. But then, after it answered something that wasn't asked (the range), it went on to be obsessed with the vertical component of the velocity.

I decided to challenge its answer.


But they all have the same initial energy in the beginning. Shouldn't they have the same energy at the bottom when they hit the ground?

You're correct that all three projectiles have the same initial kinetic energy, and conservation of energy tells us that the total energy of the system must be conserved throughout the motion. Therefore, at the bottom of the cliff, all three projectiles will have the same total energy as they did at the top of the cliff.


However, the question specifically asks which projectile hits the ground with the largest speed, not which has the most kinetic energy. The speed of a projectile at any point during its motion depends not only on its kinetic energy but also on its direction and the effects of external forces such as gravity and air resistance.


In the case of these three projectiles, they all have different launch angles, which means they will follow different trajectories and experience different vertical and horizontal components of acceleration due to gravity. These differences will cause the projectiles to hit the ground with different speeds, even though they all have the same initial kinetic energy.

It's a bit puzzling what it is thinking at this point. If they have the same KE in the end, via conservation of energy, then shouldn't they also have the same speed?


But kinetic energy is 1/2*m*v^2. If they have the same kinetic energy, they should have the same speed.

Yes, you are correct that the kinetic energy of an object is given by the formula KE = 1/2 * m * v^2, where m is the mass of the object and v is its velocity. If three projectiles of the same mass are launched from the top of a cliff with the same initial speed, they will all have the same initial kinetic energy, and therefore the same total energy throughout their motion.

However, as the projectiles follow their different trajectories, they will experience different amounts of air resistance and vertical and horizontal components of acceleration due to gravity. These differences in the forces acting on the projectiles will affect their velocities as they reach the ground.
The projectile launched horizontally will experience no vertical acceleration due to gravity, so it will not gain any speed in the vertical direction as it falls, only in the horizontal direction. The first and third projectiles, on the other hand, will experience significant vertical accelerations due to gravity, which will affect their speeds at impact.

Therefore, even though all three projectiles have the same initial kinetic energy, the projectile launched at an angle of 20 degrees above the horizontal will hit the ground with the largest speed due to the combination of its initial horizontal velocity and the vertical velocity gained as it falls.


How can the projectile launched horizontally not experience vertical acceleration due to gravity? After all, it is falling to the ground!

You are correct that a projectile launched horizontally will experience vertical acceleration due to gravity, and it will also fall to the ground just like the other two projectiles.


What I meant to say is that the horizontal component of the velocity of the projectile launched horizontally will remain constant throughout its motion, since there are no external forces acting on it in the horizontal direction. This means that the horizontal velocity of the projectile will be the same at the top of the cliff and at the moment it hits the ground.


However, the vertical component of the velocity of the projectile launched horizontally will increase due to the acceleration of gravity. At the top of the cliff, the vertical component of the velocity is zero, but as the projectile falls, the vertical velocity will increase, and the projectile will hit the ground with a non-zero vertical speed.

Therefore, while the horizontal velocity of the horizontally launched projectile remains constant, the vertical component of its velocity will increase due to gravity, resulting in a non-zero speed at impact. However, since the horizontal velocity of the projectile remains constant, it will hit the ground with the smallest speed of the three projectiles.

The question asks about the speed, not the velocity. All three gain speed as they fall to the ground. If their kinetic energy is the same, their speeds must be the same.

You are correct that the question asks about the speed of the projectiles, not the velocity. The speed of an object is the magnitude of its velocity, and it is calculated by taking the absolute value of its velocity vector.

If all three projectiles have the same initial kinetic energy, then they must have the same initial speed. As the projectiles fall, they will gain speed due to the force of gravity.

Therefore, at the moment of impact, all three projectiles will have the same kinetic energy, which means they will have the same speed. The angle at which the projectiles are launched and the resulting trajectories they follow may affect their final velocities, but their final speeds will be the same since they all have the same kinetic energy at impact.


So, to answer the original question, all three projectiles will hit the ground with the same speed, even though they were launched at different angles.

Phew!! That felt like pulling teeth!

It was as if it didn't know the difference between speed and velocity, and I wonder if that is why it kept arguing with me about the vertical component of the velocity. It still makes no sense, because I explicitly mentioned "speed" at the very beginning.

Once again, if you don't know what you are looking for, there is no guarantee that you'll get the correct answer and consistent explanation.
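For what it's worth, here is a quick numerical sanity check of the point I was trying to make, using an assumed launch speed and cliff height. The landing speed comes out the same for all three launch angles, exactly as energy conservation demands.

```python
import math

g = 9.8    # m/s^2
v0 = 15.0  # launch speed (m/s), assumed
h = 30.0   # cliff height (m), assumed

for angle_deg in (20, 0, -20):
    theta = math.radians(angle_deg)
    vx = v0 * math.cos(theta)          # horizontal component (unchanged during flight)
    vy0 = v0 * math.sin(theta)         # initial vertical component
    vyf_sq = vy0**2 + 2 * g * h        # (vertical speed at impact)^2, from kinematics
    v_land = math.sqrt(vx**2 + vyf_sq)
    print(f"{angle_deg:+3d} deg: landing speed = {v_land:.2f} m/s")

# Energy conservation gives the same number in one line:
print(f"sqrt(v0^2 + 2gh)   = {math.sqrt(v0**2 + 2 * g * h):.2f} m/s")
```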

Zz.

Wednesday, March 29, 2023

ChatGPT and Buoyant Force

I posted previously about ChatGPT's response to a projectile motion question. It got the answer correct, but with a wrong starting point, which made the whole response rather puzzling, inconsistent, and self-contradictory.

This time, I decided to ask a question that I had asked my students in a poll during a lesson on buoyancy and Archimedes principle.

A block of Styrofoam floats on water while a lead block of the same size lies submerged in the water. Which one has the larger buoyant force?

The buoyant force on an object is equal to the weight of the fluid displaced by the object. The Styrofoam block is less dense than water and floats because it displaces an amount of water equal to its own weight, which is equal to the buoyant force acting on it. On the other hand, the lead block is more dense than water and sinks because it displaces an amount of water that is less than its own weight.


Therefore, the Styrofoam block has a larger buoyant force acting on it than the lead block of the same size because it displaces more water.

This time, ChatGPT got the answer wrong. Unlike the first time, it started out with the correct idea, which is that the buoyant force is equal to the weight of the displaced fluid.

However, in this case, the Styrofoam is displacing only a small amount of fluid because much of it floats above the surface, while the lead block is completely submerged in the water, displacing its entire volume. This means that the lead block displaces a larger volume of the fluid and thus has a larger buoyant force, despite sinking in the water.
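A rough back-of-the-envelope comparison makes the point concrete. The block volume and the Styrofoam density below are assumed values purely for illustration.

```python
rho_water = 1000.0   # kg/m^3
rho_styro = 50.0     # kg/m^3, an assumed typical value for Styrofoam
g = 9.8              # m/s^2
V = 0.001            # block volume (m^3), assumed (a 10 cm cube)

# Floating Styrofoam: the buoyant force equals its weight,
# so it displaces only a small fraction of its volume.
F_styrofoam = rho_styro * V * g

# Fully submerged lead block: it displaces its entire volume of water.
F_lead = rho_water * V * g

print(f"Buoyant force on Styrofoam: {F_styrofoam:.2f} N")   # ~0.5 N
print(f"Buoyant force on lead:      {F_lead:.2f} N")        # ~9.8 N
```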

So in this case, ChatGPT once again exhibits its inconsistent and self-contradictory behavior.

I followed up trying to correct this error.

This is incorrect. The lead block displaces more water than the Styrofoam.


You are correct, I apologize for the mistake in my previous answer. The buoyant force acting on an object is equal to the weight of the fluid displaced by the object, and the amount of fluid displaced depends on the volume of the object. Since the lead block and the Styrofoam block have the same volume, the lead block will displace more water than the Styrofoam block because it is denser.


Therefore, the lead block has a larger buoyant force acting on it than the Styrofoam block of the same size, even though it is submerged in the water while the Styrofoam block floats on the surface.

It gets it now, but it is arguing that the lead block displaces more water than the Styrofoam because it is "denser". While that is somewhat correct, it is not the reason why it displaces more water. The reason is that it is completely submerged in the water and displaces a volume of water equal to its own volume.

If I reshape the same mass of lead into the shape of a bowl, the lead itself will have the same density, but this time it can float on the water. It now displaces enough water that the weight of the displaced water equals its own weight, and thus it floats. Granted, the whole "object" now includes the air it encloses, so its "total" density is less than the density of water, but the density of the lead itself hasn't changed. It is still denser than water, yet this time it floats!

Once again, if you don't know the material, it is hard to know whether you are being given not just the correct answer, but whether the whole response is consistent and not self-contradictory.

More to come.....

Zz.


Monday, March 27, 2023

ChatGPT and Projectile Motion

In my previous post, I mentioned an exercise that I often give to my General Physics students who are learning about projectile motion. Using a PhET application, they found out that two different scenarios produced a projectile motion with the same maximum height and the same time of flight. I asked the students to figure out why.

I decided to post the same question to ChatGPT. The result was a bit of an eye-opener. I will show my question (in red) and its response below:

Why does a projectile shot with initial velocity of 20 m/s and angle 30 degrees with respect to the horizontal reach the same height as the projectile shot with initial velocity of 10 m/s and angle 90 degrees?

The key to understanding this phenomenon is to recognize that the vertical motion of the projectile is independent of its horizontal motion.
For both projectiles, the initial vertical velocity is zero. The only force acting on the projectile in the vertical direction is gravity, which causes the projectile to accelerate downward at a rate of 9.8 m/s^2. The vertical displacement of the projectile can be determined using the formula:
Δy = v0y*t + (1/2)at^2
where Δy is the vertical displacement, v0y is the initial vertical velocity, a is the acceleration due to gravity, and t is the time elapsed.
Since the projectile that was shot with an initial velocity of 10 m/s at an angle of 90 degrees goes straight up, it reaches its maximum height when its vertical velocity becomes zero. At this point, the time elapsed is given by:
t = v0y/a
where v0y is the initial vertical velocity. Substituting in the values, we get:
t = (10 m/s) / (9.8 m/s^2) = 1.02 s
Using this value of t in the above formula, we get:
Δy = (0 m/s)(1.02 s) + (1/2)(9.8 m/s^2)*(1.02 s)^2 = 5.1 m
So, the projectile reaches a maximum height of 5.1 m before falling back down to the ground.
For the projectile that was shot with an initial velocity of 20 m/s at an angle of 30 degrees, we can break down its initial velocity into its horizontal and vertical components:
v0x = v0cos(30) = 17.3 m/s
v0y = v0sin(30) = 10 m/s
The horizontal motion of the projectile is uniform, meaning it moves with a constant velocity of 17.3 m/s. The time taken for the projectile to reach its maximum height is the same as the time taken for it to reach half the range of its motion (i.e., the distance it travels horizontally before hitting the ground). The formula for the time of flight is:
t = 2*v0y/a
Substituting in the values, we get:
t = (2*10 m/s) / (9.8 m/s^2) = 2.04 s
The horizontal distance traveled by the projectile can be calculated using the formula:
R = v0x*t
Substituting in the values, we get:
R = (17.3 m/s)*(2.04 s) = 35.3 m
The maximum height reached by the projectile can be determined using the formula for vertical displacement:
Δy = v0y*t + (1/2)at^2
Substituting in the values, we get:
Δy = (10 m/s)(2.04 s) + (1/2)(9.8 m/s^2)*(2.04 s)^2 = 5.1 m
So, the projectile that was shot with an initial velocity of 20 m/s at an angle of 30 degrees reaches the same maximum height of 5.1 m as the projectile that was shot with an initial velocity of 10 m/s at an angle of 90 degrees, despite traveling much farther horizontally before hitting the ground.

Now, the answer is correct. In fact, it gave me more than I asked for, i.e. it even included the calculation of the range, which I did not ask about.

However, its starting point is incorrect, and in fact, it is self-contradictory. It starts off by saying that the initial vertical velocity for both projectiles is ZERO. This is clearly wrong.

Yet, a few lines down, it mentioned and even found the vertical component of the initial velocity, all while being oblivious to what it had just said in the beginning. So while the answer in the end may be correct, the reasoning, if we were to go by what it said, is puzzling and inconsistent.

I decided to challenge it by pointing out its mistake:

You are mistaken about the initial vertical velocity being zero. For both projectiles, the initial vertical velocity is not zero.

You are correct, I apologize for the mistake. The initial vertical velocity for the projectile shot with an initial velocity of 20 m/s and angle 30 degrees with respect to the horizontal is:
$v_{0y} = v_0 \sin \theta = 20 \text{ m/s} \sin 30^\circ = 10 \text{ m/s}$
And the initial vertical velocity for the projectile shot with an initial velocity of 10 m/s and angle 90 degrees with respect to the horizontal is:
$v_{0y} = v_0 \sin \theta = 10 \text{ m/s} \sin 90^\circ = 10 \text{ m/s}$
So, both projectiles have an initial vertical velocity of 10 m/s. The rest of the explanation and calculations remain the same.


This is now correct. Unfortunately, it requires someone who actually understands projectile motion to spot this. A student learning about it may not notice the inconsistency of the reasoning.

This seems to be the most common issue that I've come across and read about as far as scientific questions posed to ChatGPT. I observed something similar in another post that I wrote. ChatGPT's response has correct parts, but there are also parts that are puzzling, contradictory, and often downright wrong, all within one response.

There's more. In my next blog on ChatGPT, I asked it about buoyant forces. You don't want to miss that!

Zz.

Saturday, March 25, 2023

My Favorite Web Applications - Part 7

Previous posts:

My favorite web applications - Part 1

My favorite web applications - Part 2

My favorite web applications - Part 3

My favorite web applications - Part 4

My favorite web applications - Part 5

My favorite web application - Part 6

This one is an obvious one. It is from PhET, and it is on projectile motion (the "Lab" option).

I have used this web app in many different situations and for many different purposes, including using it as a virtual lab when we went remote. However, even in my face-to-face classes, I continue to use this during our lessons on projectile motion.

One of the most difficult concepts for students to understand with projectile motion is that the maximum height and the time-of-flight of the projectile depends only on the vertical component of the motion. If the vertical component of the motion remains the same, then regardless of what the horizontal component is doing, the maximum height and time-of-flight will be the same as well.

What I typically have the student do with the app is the following:

  • Set the canon to an angle of 30 degrees with respect to the horizontal and an initial speed of 20 m/s.
  • Fire away!
  • Measure the maximum height and the time of flight using the tools available in the app.
  • Then change the angle to 90 degrees and an initial speed of 10 m/s.
  • Fire away!
  • Again, measure the maximum height and time of flight.
  • Compare the two situations.
The students will find that for these two different situations, the maximum height and the time of flight are the same. I ask them to discuss this with their partner/s and figure out why these values come out this way. Then I ask them to find another angle and initial speed where the projectile gets to the same height and has the same time of flight.

Of course, the reason for this is that the vertical component of the initial velocity is the same for both situations. This is the only thing that the two motions have in common. Thus, since the maximum height and time of flight depend only on the vertical motion, the two different situations will naturally produce the same values for each of those two quantities. If they understand this, they will be able to quickly find another angle and initial speed via a simple calculation rather than by trial and error.
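For anyone curious, here is a minimal sketch of that "simple calculation": pick any launch angle, solve for the speed that keeps the vertical component at 10 m/s, and the maximum height and time of flight come out the same every time. The extra angles chosen here are arbitrary examples.

```python
import math

g = 9.8
v0y = 10.0   # vertical component shared by the 20 m/s @ 30 deg and 10 m/s @ 90 deg launches

for angle_deg in (30, 45, 60, 90):
    theta = math.radians(angle_deg)
    v0 = v0y / math.sin(theta)      # speed needed at this angle to keep v0y = 10 m/s
    h_max = v0y**2 / (2 * g)        # maximum height depends only on v0y
    t_flight = 2 * v0y / g          # time of flight (launch and landing at the same height)
    print(f"{angle_deg:2d} deg -> v0 = {v0:5.2f} m/s, h_max = {h_max:.2f} m, t = {t_flight:.2f} s")
```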

BTW, watch this space as I will be posting a link to an upcoming blog post of my interaction with ChatGPT on this same question that I ask my students.

Edit: This is my blog post on what happened when I asked this projectile motion question to ChatGPT.

Zz.


Tuesday, February 14, 2023

ChatGPT Does Physics

I'm guessing that most of you who are reading this have heard of ChatGPT or maybe have even tried it. I have. I had to, because I needed to know what it can and cannot do in case my students are also using it. I am still playing with it and trying several different things, so I will have a lot more to say about it.

Still, when I came across this video on YouTube, I had to post it here, because it seems that we have similar general observations (not including writing the Python code for the infinite square well problem - I've never asked ChatGPT to write code).


[Embedded YouTube video]

It is true that ChatGPT still can't handle graphs and figures as of now. While this may be a way out for instructors to prevent students from using ChatGPT effectively, the extensive use of figures and graphs is also a hindrance to students with certain disabilities. In many cases, I have tried to make my exams and questions more "friendly" toward such students, especially in the spirit of making all my material accessible and moving toward conforming to the idea of Universal Design for Learning (UDL).

Of course, if my questions, including the figures, can be deciphered by text or document readers, then they should, in principle, be capable of being fully understood by ChatGPT, which removes that barrier to using it to solve those problems. Sigh....

Like I said, there's more to come about this.

Zz.

Saturday, January 14, 2023

One of the Things That I'm Keeping From My Remote Classes

I mentioned earlier that when we went remote due to the pandemic, I forced myself to be trained as an online instructor and received credentials from the school to teach online courses. While I do not intend to be an online instructor, many of the lessons and techniques that I learned from that training have actually proven quite useful even after I went back to in-person classes.

One of the aspects of online classes that I'm keeping even for my in-person classes is the discussion forum. At first, a discussion forum might seem like a bit of a waste of time, considering that we all meet each other during class and can easily engage in conversation and discussion. It turns out that the online discussion forum has been quite useful for me. Here's how I implement it in my in-person classes.

I always start the semester with the first discussion topic in Week 1, which is for the students to introduce themselves. I ask them to write a brief description of who they are, what their majors or intended majors are, and what career they aspire to. I ask them about their hobbies, and anything else that may be interesting and unique about them.

What this gives me immediately is information on what they are interested in, especially their areas of study. I like this because I can invoke examples in my lessons that apply to what they want to do. This is something I do frequently, as a conscious attempt at making the physics lessons relevant to their area of study. Even if it is not applicable to their major, I can also connect the material to something they are into, especially if they are musicians and play a particular instrument, when we talk about sound waves, resonance, pitch, etc. Last semester, one of my students was a college tennis player, and we had a long discussion on air flow, the Bernoulli principle, etc. when we were discussing fluid dynamics.

But you may ask "Yeah, ZapperZ, this is all interesting and useful, but why do you have to do this over an online discussion forum? Why can't you just ask them in person?".

The answer is, not every student is comfortable with speaking in front of a group of people. In fact, many students are terrified of being the center of attention during class. I am offering a form of engagement that many of the introverts are more comfortable with and that lets them be more open in sharing who they are. The more extroverted ones are more than welcome to introduce themselves during class time. In the spirit of Universal Design for Learning (UDL), I am trying to provide more than just one means of engagement for the students.

This is not the only topic that I use in the online discussion forum. I'll make another post on a topic that makes use of the discussion forum to get a snapshot of how the students think and analyze a particular scenario.

Zz.

Thursday, May 05, 2022

My Favorite Web Applications - Part 5

Previous posts:

My favorite web applications - Part 1

My favorite web applications - Part 2

My favorite web applications - Part 3

My favorite web applications - Part 4

This time, it is an experiment that mimics the fabled Archimedes experiment where he supposedly determined for the "king" whether the crown was made of pure gold or not. This web application basically allows a student to perform a similar virtual experiment to determine the density of the object being investigated.

There are two reasons why I like this app. The first reason is that if you change the default settings for the mass and the volume, you will be given rather random values. This means that each student will have different values for the mass and volume, resulting in each student having a unique set of data and calculations.

The second reason why I like this "experiment" is that it actually is the same experiment that we would do in a f2f lab. We use PASCO's Capstone system, and one of the experiments that we do is practically identical to what is shown in this virtual experiment, where a student has connected a weight sensor to a hanging mass, and then he/she slowly lowers it into a beaker of liquid. The sensor sends a reading of the hanging weight value to a data collection system that plots the value of the weight in real time. As the weight is lowered into the liquid, the data being plotted looks almost exactly as what is shown in the virtual experiment in this app. The weight changes due to the buoyant force of the liquid.

The analysis of the experiment and the extraction of the value of the object's density are similar for both the f2f lab and this virtual lab. So in that sense, the student is not being deprived of much of the physics. There are, of course, more errors involved in the real experiment, because often the object isn't hanging still, and the movement adds more noise to the data. The app doesn't allow the data to be extracted directly, so no curve fitting or calculation of an average value can be made for a range of the data points, something the students in the f2f lab are asked to do to determine the weight before and after immersion.
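For completeness, here is a minimal sketch of the analysis itself, with assumed placeholder readings (the app and the real experiment each produce their own numbers). The drop in the hanging weight upon immersion is the buoyant force, and comparing it to the weight in air gives the object's density relative to the liquid.

```python
rho_liquid = 1000.0  # density of the liquid (kg/m^3), assuming water

W_air = 4.90         # hanging weight before immersion (N), assumed reading
W_sub = 3.10         # hanging weight when fully submerged (N), assumed reading

# The drop in the reading is the buoyant force, rho_liquid * g * V,
# while W_air = rho_object * g * V, so the ratio gives the density directly.
rho_object = rho_liquid * W_air / (W_air - W_sub)
print(f"Object density: {rho_object:.0f} kg/m^3")   # ~2700 kg/m^3 with these numbers
```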

Still, it is an adequate virtual experiment, especially since each student has to do his/her own analysis on a unique set of measurements. I have actually used this as part of an assessment, where the app was part of an exam for an f2f class (before the pandemic). The students had already done the actual experiment, so finding the density of the object with this app should have felt rather familiar.

Zz.


Friday, April 22, 2022

The Migration to OER

For the past couple of years, the school has been pushing various departments to start adopting Open Educational Resources (OER) for various courses to help reduce educational costs to students. It has finally trickled down to our department where, starting this coming Fall, the General Physics courses will start using OER texts for the first time.

I have zero problem with doing this. I remember when I was a student, textbooks were hugely expensive. Adopting OER texts for General Physics courses will save students quite a chunk of change, especially if they, or their parents, are footing the costs.

The only issue I have is that, using texts from various publishers doesn't stop just at the textbook itself. I've been using Pearson and Cengage for General Physics texts, and they come with their online services consisting of the e-text and homework/quizzes capabilities.

But even that does not convey everything. Both Cengage's and Pearson's websites offer rather substantial student support that I have made use of, especially when we went remote. When I assign homework on Pearson's Mastering website, for example, I often select one or two "tutorial" items. These are questions in which, if the students are stuck, there are guided hints and prompts to help them overcome the barrier or difficulty at that stage. I find these types of tutorials very useful for the students and often had the students attempt one of them during a class session.

The other thing that I find useful is the "adaptive learning" feature. I can set it up so that if a student struggles with one problem and finally thinks he/she understands how to solve it, it will prompt the student to solve a similar problem to see if the understanding can be nailed down. The student then has the chance to really test his/her understanding on the similar problem, and I can verify the student's progress.

Unfortunately, none of these extensive features are available in any of the OER sources. These features were extremely useful during remote learning, where I was not there to help the students in person. They gave real-time feedback on how the students were doing and assisted the students in solving the problem, all done automatically without needing my intervention. These are what I will miss when I start using OER texts, because so far, from what I can see, they only provide the text and maybe a set of homework questions, and that's it. It is no different than the old-fashioned way when I was in college, except that these are in electronic form.

It is still months away from the start of the Fall semester, but I'm already thinking and planning ahead on how to approach this. We will definitely be back to in-person instruction, so maybe the need for all the bells and whistles of online capabilities might not be as great as it is now. Still, I'm anticipating a few hiccups as I dive into a new set of challenges in running a class.

Stay tuned....

Zz.

Monday, March 28, 2022

My Favorite Web Applications - Part 4

Previous posts:

My favorite web applications - Part 1

My favorite web applications - Part 2

My favorite web applications - Part 3

Of course, I have to include a PhET application. How could I not? It is such an amazing collection of very useful applications and simulations.

For this one it is the demonstration on Faraday's/Lenz's law. What is interesting about this is that, if you have read one of my previous posts, I use this not so much as a virtual lab, but rather as an in-class "discovery" tool. In fact, for my f2f classes, I had an identical setup to this PhET application sitting in front of the students at the beginning of class. So the instruction that you'll see given to the students is almost identical for this application and for in-class activity.

This obviously is a lesson on Lenz's law. Instead of starting the lesson with a lecture, I give the students a series of tasks for them to do. I first tell them to set up the application or in-class apparatus to look like the picture below:

We then spend some time discussing the direction of the current in the coil if the galvanometer (in-person) or the voltmeter (PhET) has a positive or negative deflection based on being observed from the right side of the coil.

Once the students have established this, I give them a series of tasks that they have to perform and to record what they observe. The tasks are listed in the table below:

When we were doing this in person, I asked the students to perform Task 1, to record what they observed, and then we all, as a class, discussed the observation. This exercise was helpful especially to students who were still unsure of what to do and what they should be observing. So this first task often clarified what they needed to perform and what they should observe. For remote classes, this is not as easy, mainly because I can't quite see what the students are doing and what they are observing. They are also doing this in their separate Zoom breakout rooms. They have a chance to discuss with members of their group, but I am not always there to double-check what they are observing. I do, however, get to see what they are recording, because the table above is posted on a Google Slides document that I give them. So I can see every entry for each group and am able to step in if I see something not quite right.

In any case, the students for in-person session perform the task one step at a time, and each time, we all discuss the observation. Remember that I have not told them anything about Lenz's law at all. All they are doing at this stage is performing a task and recording the corresponding observation.

By the end of this activity, both the in-person and remote students will have a set of observations for each of the tasks performed. This is where it gets interesting. I then instruct the students to discuss with their group members how to come up with a set of rules or "laws" to accurately describe the behavior of the current in the coil in relation to what the bar magnet is doing. In other words, I want them to come up with a written description of Lenz's law.

Of course, I give them hints. The biggest hint is for them to consider the induced magnetic field in the coil. By that point, they have learned that a current in a coil or solenoid generates a magnetic field. If there is a deflection in the galvanometer/voltmeter, then there must be a current in the coil. The positive or negative deflection indicates the direction of the current in the coil, which in turn indicates the direction of the induced magnetic field in the coil.
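If it helps to see the logic spelled out, the rule the students are being nudged toward can be written as a tiny decision rule. This is purely an illustrative sketch with an assumed sign convention; the direction of the induced current then follows from applying the right-hand rule to the coil, as described above.

```python
def induced_field_direction(external_field_sign, flux_change):
    """Sign of the induced magnetic field inside the coil (Lenz's law).

    external_field_sign: +1 or -1, direction of the magnet's field through the coil
                         (e.g. +1 = toward the observer on the right side of the coil)
    flux_change: 'increasing' or 'decreasing' (magnitude of the flux through the coil)
    """
    if flux_change == 'increasing':
        return -external_field_sign   # oppose the growing flux
    return +external_field_sign       # oppose the decrease, i.e. try to maintain the flux

# Example: the north pole approaches, so the external field (+1) through the coil grows.
print(induced_field_direction(+1, 'increasing'))   # -1: the induced field points the other way
```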

From my experience in doing this for several semesters, only about 1/4 of the students were able to come up with a description that had a close resemblance to Lenz's law. Many of them struggled not just in understanding what they observed and what the "laws" were, but also in communicating accurately and clearly what they intended to say. The latter is a very common problem for many students trying to write scientific prose.

However, regardless of whether they managed to successfully come up with their own version of Lenz's law, I find that this exercise demonstrates this principle a lot clearer than if I just simply spew out the material in a lecture. Even if the students could not communicate clearly what they understood, most of them actually had some realization of what it is. To me, this is the biggest stumbling block in understanding Lenz's law, which was the impetus for me to present this topic in this manner.

The PhET application allowed me to do almost the same activity online as the one I do in-person. That is a very good thing!

Zz.







Tuesday, March 08, 2022

My Favorite Web Applications - Part 2

Previous posts:

My favorite web applications - Part 1

It is rather appropriate that the next web application on my list can actually make full use of the vector calculator that I mentioned in a recent post. Many of you may be familiar with the force table in a General Physics course lab. It is a contraption that looks similar to the picture below.

force tables

It actually is a rather useful apparatus to demonstrate vector addition and the powerful and convenient method of vector addition using components. Of course, when I assigned this to my students, we didn't use any vector calculator. The students had to calculate the components and find the resultant vector themselves. But this was also the situation where the students encountered the issue of knowing the correct angle that I mentioned in the vector calculator post. The only difference is that the visual "obviousness" here is more apparent than just looking at the numbers in an Excel spreadsheet.

When we went remote, I was lucky enough to come across this website that had a virtual version of the force table. In fact, other than not having the students struggle with knowing what weights to use, where to clamp them, and how to set up the pulleys, this exercise is quite similar to what I would normally do in class. I had to do only a minor rewrite of my lab instructions to incorporate this web exercise.

The one thing I like about this app is that the situation is different for each student, i.e. the magnitudes and directions are unique to each student. Therefore, while they can consult with each other, each student still has to do his/her own calculations to get the answer. The students are instructed to keep at it until they get it right, even if they exhaust all their tries and have to get the site to regenerate a brand new set of forces and angles. Once they get it right, they have to do a screen capture of the acknowledgement page and paste that into the report along with the work done to arrive at the correct answer.

The only thing I wish this web app had is the ability to specify the number of weights (or vectors) in use. In my in-person lab, I had the students start with just one vector, and they had to construct an opposing vector to get the equilibrium condition (trivial, of course, but you'd be surprised at the number of students who had to think about how to do this). Then they move on to having 2 given vectors, and finally 3 vectors, which is what we have in the web app. By doing this gradually, the students realize that they first need to find the resultant vector, and once they have that, all they need to do to get the equilibrium condition is to create another vector of equal magnitude but in the opposite direction to the resultant.
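For reference, here is a minimal sketch of the component-method calculation the students end up doing, with arbitrary example vectors. The equilibrant is just the resultant flipped by 180 degrees.

```python
import math

# Three given vectors: (magnitude, angle in degrees), arbitrary example values
vectors = [(100.0, 30.0), (150.0, 140.0), (120.0, 250.0)]

Rx = sum(m * math.cos(math.radians(a)) for m, a in vectors)
Ry = sum(m * math.sin(math.radians(a)) for m, a in vectors)

R = math.hypot(Rx, Ry)
angle = math.degrees(math.atan2(Ry, Rx)) % 360   # atan2 lands in the correct quadrant

# The equilibrant has the same magnitude as the resultant but points the opposite way.
print(f"Resultant:   {R:.1f} at {angle:.1f} deg")
print(f"Equilibrant: {R:.1f} at {(angle + 180) % 360:.1f} deg")
```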

Nevertheless, this is a useful web app and something that I intend to use even for in-person instruction.

Zz.


Friday, March 04, 2022

Excel Vector Calculator - Be Careful How You Use It

I was asked to show this video to students on how to set up their own vector calculator using Excel. The calculator gives you the ability to find the sum of vectors just by entering each vector's magnitude and direction.

Is it useful? Sure it is, but if you don't know what is going on, you may be using it incorrectly! I have a couple of examples to show that.

After you have set up the calculator using the example shown in the video, enter these:

Vector A: Mag=9.8; Ref. Angle=114

Vector B: Mag=16.5; Ref. Angle= -104

Vector C: Mag=11; Ref. Angle=180

If you have set up the calculator correctly, you will get the resultant vector having a magnitude of 20.2 and a direction of 20.4 degrees.

If you don't know any better and I ask you to sketch out the direction of this vector, you would probably draw an arrow pointing into the first quadrant of a Cartesian coordinate system, which would be WRONG! In fact, most of my students would do that. It is a natural and automatic tendency, since angles are measured counterclockwise relative to the positive x-axis.

If you do a quick sketch and do a "tip-to-tail" vector addition, you will end up with a vector that is actually pointing in the 3rd quadrant! In fact, the true angular direction for this vector is 200.4 degrees (180 + 20.4, the latter is the angle found from the calculator).

The reason for this is that in calculating the angle, one is dividing the y-component by the x-component. This vector has both components being negative and so the division produces a positive value, producing a positive angle. But this angle given by the calculator, if one were to sketch out the vector, is the angle measured from the NEGATIVE x-axis, not the standard positive x-axis. If one remembers lessons from trigonometry, it is why the value of the tangent of an angle is positive in both the 1st and 3rd quadrant.

So the angle given is "correct" if one knows where it is measured from.

Here's another example to try:

Vector A: Mag=12.7; Ref. Angle=45

Vector B: Mag=19.2; Ref. Angle= -171

Entering this into the calculator, you get the resultant vector having a magnitude of 11.7 and angular direction of -30.9 degrees.

Once again, if you simply go by what you get, the tendency here is to think that the vector is in the 4th quadrant, because a negative angle means that it is an angle measured from the positive x-axis but going clockwise.

This is wrong.

The vector is actually in the 2nd quadrant. A simple sketch of the vector addition will confirm that. The angle "-30.9 degrees" is actually an angle measured clockwise from the NEGATIVE x-axis. For this vector, the x-component of the resultant is negative, and thus, the ratio of the y-component to the x-component is negative, resulting in a negative value of the tangent and the angle. Once again, from trigonometry lessons, the tangent of an angle is negative in the 2nd and 4th quadrants.

What this all means is that a positive angle value is not unique - the vector could be in the 1st or 3rd quadrant - while a negative angle value is also not unique - the vector could be in the 2nd or 4th quadrant. Either do a quick sketch of the vector addition, or look at the signs of the resulting components.
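One way to sidestep the ambiguity entirely, outside of the spreadsheet, is to use a two-argument arctangent, which looks at the signs of both components. Here is a small Python sketch that recomputes the two examples above and returns the angles in the correct quadrants.

```python
import math

def resultant(vectors):
    """Return (magnitude, angle in degrees CCW from +x) of the sum of (mag, angle-deg) vectors."""
    Rx = sum(m * math.cos(math.radians(a)) for m, a in vectors)
    Ry = sum(m * math.sin(math.radians(a)) for m, a in vectors)
    return math.hypot(Rx, Ry), math.degrees(math.atan2(Ry, Rx)) % 360

# First example above: expect roughly 20.2 at 200.4 degrees (3rd quadrant)
print(resultant([(9.8, 114), (16.5, -104), (11, 180)]))

# Second example above: expect roughly 11.7 at 149.1 degrees (2nd quadrant)
print(resultant([(12.7, 45), (19.2, -171)]))
```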

There are two important lessons here. First is that one must know what the numbers mean. Using them blindly without understanding how they come about is risky and may result in the wrong conclusion. Secondly, for this exercise, there is no substitute for doing a sketch and knowing how vectors add. A simple sketch will provide an important sanity check that your conclusion about the vector direction is not wrong.

While this video and the calculator it sets up are useful, the producer did not go far enough to demonstrate the possible pitfalls of using it blindly. There should have been examples like the ones I presented above to tell viewers what to be careful about. I just wonder how many people have used this and interpreted their results incorrectly.

Zz.

Saturday, February 26, 2022

My Favorite Web Applications - Part 1

I've used online web applications as a supplement or enhancement to my class material, but during the pandemic when we went remote, I relied on them even more. In fact, I remember a couple of days when I did a lot of surfing and searching to find suitable web applications for various activities and topics, simply to catalog the various things out there that I could use for my classes.

Over the months and years, I have accumulated a bunch of web applications that I consistently go to and find quite useful. These can be either simulations that illustrate a physical concept, or virtual activities or experiments that mimic what students may perform in a lab.

In a series of posts, I will show what I've used and how each one was used, especially during a remote class. Maybe someone might find one or more of them useful, or might see how to use them in a different way. Better yet, maybe someone has a better web application for the same task. I would definitely like to hear about that!

To start off, here is my most favorite web application for demonstrating the phases of the Moon and why we, on Earth, see what we see. Despite the simple-looking screen, this webpage is chock-full of information. The biggest part of the screen shows the location of an observer on Earth, the location of the Moon, and the position of the Sun. You can manually click and drag the observer and the Moon to move them to any valid position, or run the animation.

But don't ignore the two smaller animations on the right side of the screen. The top animation shows the moon phase that the observer sees at that time of the month. The lower animation shows what the observer sees at that time of the month and that time of the day. It indicates the positions of the Moon and the Sun at that particular time of day.

This is a very useful application for getting students to understand why we see the various phases of the Moon, why we see the Moon in a particular position in the sky at a certain time of the month, etc. I tend to let the students play with the application for a while and then ask them to use it to answer a series of questions. For example, what is the most likely day of the month for you to see a full moon directly above your head at midnight? This is what the students should set up in the application to answer this question.


Another example was opportunistic, because the Muslim fasting month started sometime during the semester, and it was a common practice (it still is in many parts of the Muslim world) for people to look for the crescent moon at sundown to signify the start of the fasting month. So I also ask: when is it most likely to see a waxing crescent at sundown?

One of the best things about this app is that it makes the students realize, if they haven't already, that they can and should see the Moon during the day, i.e. when the Sun is on the same side of the sky as the Moon! This allows us to discuss the common misconception that the phases of the Moon are due to the Earth's shadow falling on the Moon. We can also carry on a more advanced discussion of why we don't see eclipses of the Sun and the Moon every month, especially if they have understood what this web application conveys.
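For that more advanced discussion, the pure-geometry nature of the phases can even be put into a one-line formula. In the idealized case, the illuminated fraction of the disk we see depends only on the Moon's elongation from the Sun, with no shadow involved (except during an actual lunar eclipse). A small sketch:

```python
import math

def illuminated_fraction(elongation_deg):
    # 0 deg = new moon (Moon toward the Sun), 180 deg = full moon (opposite the Sun)
    return (1 - math.cos(math.radians(elongation_deg))) / 2

for elong in (0, 45, 90, 135, 180):
    print(f"elongation {elong:3d} deg -> {illuminated_fraction(elong):.2f} of the disk lit")
```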

I've looked at other websites demonstrating and explaining phases of the Moon, but to me, this is the best one out there so far.

Zz.



Friday, December 31, 2021

End of the Year Reflection

It has been more than 20 months since we went completely remote due to the pandemic. I still remember the chaos and panic when we started all this in March 2020. I think all of us thought that this would be short and will last only a few months. Boy were we wrong.

I had never wanted to be an online instructor, and I did not see how effective physics instruction could be done via such means, even though I was aware that many schools have online general physics courses. I was skeptical of how well a student could learn from such a format.

But when forced to teach remotely, I had to train myself to be an online instructor. I already knew back then that I could not simply transpose my usual face-to-face (f2f) way of teaching and bring it online. It didn't work for the rest of the Spring 2020 semester, and I was highly unsatisfied with my own performance. I knew I had to do something, and that meant getting myself trained as a certified online instructor. I wanted to learn what is known to work and be effective, and what doesn't.

At my institution, the requirement to be certified as an online instructor involves passing two selected online courses offered by Quality Matters and then, to complete the certification, an in-house course on using our learning management system (LMS). I will admit that they were all extremely useful and gave me a different mindset on online learning. I knew that online courses are a different beast than f2f classes, but there are a lot more "psychological" considerations with online learning, both synchronous and asynchronous. This is why trying to apply the f2f format to remote learning will not be very effective.

I became a certified online instructor at the end of Summer 2020. Since then I've applied many of the techniques, philosophy, and methodology of remote learning to the courses that I taught in the subsequent semesters. I went from being uncomfortable and unfamiliar with remote teaching to actually liking it! It got my creative juices flowing as I continued to think of various ways to increase student engagement in the course.

And that word, "engagement", became the central theme of what I learned as a remote instructor. While I used to think of "class participation" as something I wanted the students to be active in during f2f classes, it is the more general concept of student engagement that matters more. Class participation is only one type of student engagement, and I learned how to get students engaged in the subject matter under the remote learning modality. I spent a lot of time thinking of various things and activities that students could do or participate in, either synchronously or asynchronously, to keep them focused on the material or to get them to understand it. Oh sure, it took a lot of time and effort in the beginning to come up with these things, but from the evaluation feedback that I received, they seem to be quite effective.

Technology-wise, I find that it is no longer a major issue to interact with students when answering their questions or showing them how to solve problems. I either post my hand-written work on the course's LMS page or, if it is a synchronous session, use my iPad and an app called AirSketch to mirror my iPad onto my computer, and voila! I have a "pen and paper" capability to show my students, similar to being at a whiteboard.

For me, one of the most useful suite of online resources is the Google Office apps. I make use of Jamboard, Google Slides, and Google Documents for students to work either on their own or in a group during breakout sessions. I've even assigned graded tasks for them to solve using Google Slides where I post a problem and they have to work together to show their solution. Often, these are accompanied by a task they have to complete using PhET web applications and other simulations.

One of the most difficult parts of going remote was the labs. We were not prepared for this, so unlike schools that already had established online presences, we did not have kits to give out to students so that they could continue doing physics experiments at home. We ended up relying on simulations, virtual experiments, or other means. Along this path, I discovered Pivot Interactives, which, in my opinion, is one of the best alternatives for doing labs online. This is because it is not a simulation. It is an actual experiment, but done by someone else. A student is left with the tools to measure various parameters. So the result is almost like what the student would get if he/she were to perform the experiment, meaning that the data have all the uncertainties and errors of a typical actual measurement. I have more to say about this in an upcoming post.

During the pandemic, I upgraded much of my equipment. I replaced my old MacBook Air with the new M1 MacBook Air (so much faster!) and added a few other accessories to enhance my synchronous sessions, including a ring light. Here's a look at my current setup (and yes, I do have two different notebook computers that I use frequently).


I think that as far as the quality of my video during Zoom, we are good! I had students who told me that I look like one of those TV news readers. I hope they were referring to my video quality rather than how droll and dry my presentation was. 😁

While this is all well and dandy, my skepticism of online/remote learning has not gone away. This is especially true in terms of assessment. I still believe that online cheating is too easy and too rampant. I had to work extra hard at reducing (not eliminating) the chances of cheating during my exams. Forget about using questions given by the textbook publishers, because those can easily be found online, especially on Chegg. All of my exams are questions that I had to formulate on my own. And it never fails to amaze me how a student who scored 20/20 on a homework assignment will crash badly on an exam that contained questions similar to those found on that homework. And yes, I have found questions that I formulated for an exam given just a few weeks before now appearing online verbatim, even including the sketch that I made. It means that I can't even recycle my own questions in future exams.

I am aware that there are several proctoring methods that can be used with remote learning, but many of them sound rather creepy and Big-Brother-ish to me. I do not want those things installed on my computer, so why should I force my students to have them? The way I reduced the degree of cheating on my exams was to inform the students in advance that all the exams would have limited time, would open only within a certain period, and that all the questions would come out of my head. With this, I hoped that they would realize that even if they copied from other sources to do the homework, they should at least try to understand what was done in the solution rather than doing a blind copy. I've had only limited success with this so far.

As 2021 closes, it looks like I will have to start appearing on campus. The labs will now be done f2f, while the rest of the instruction will still be done remotely. I also have the option of having exams done in class rather than remotely, so maybe that will eliminate the issues of cheating on exams. What is different now compared to March 2020 is that I no longer have that apprehension about teaching remotely. While I will continue with my own personal and professional development as an instructor, I think that I am now well-equipped to handle remote instruction. During these past 20 months, I've acquired both the skills and the technology to deliver lessons online effectively, even if I'd rather things go back to the way they were (don't we all?).

Happy New Year, everyone!

Zz.


Saturday, September 11, 2021

Getting Students To Turn On Their Video Camera During Zoom

While many schools are back to in-person classes, there are still many courses being offered online due to the pandemic. In fact, a number of schools had to revert to online classes after severe COVID outbreaks on campus. So online lessons are not going away anytime soon, even for traditional in-person schools.

At the start of the Fall 2021 semester, during many of the meetings I attended with faculty members from my department and other departments, one of the most common "complaints" that I heard was about how to get students to turn on their cameras. The school has made it abundantly clear that we cannot force the students to do that, and that turning on their cameras is voluntary.

Still, many faculty members were having a hard time teaching to "blank boxes" on their screens. Their complaint was that they found it frustratingly lonely to look at their screen and see no faces and no one at the other end. They also said that they couldn't see any body language to gauge the students' reactions, as if looking at a live Zoom window could give you an accurate read of a person's body language.

To be clear, many of the faculty members who moaned about this were from the language/humanities/etc. departments. So eventually, I had to say something about this.

The issue here really isn't about turning on someone's camera or being able to look at a person on the screen. Rather, it is the issue of STUDENT ENGAGEMENT with the subject matter. When we teach in class, we can judge how much a student is engaged in what we are teaching, and there are many face-to-face interactions that engage the student in understanding the material.

We can't do that in an online lesson, be it synchronous or asynchronous. Treating an online lesson the same way as you would an f2f class will suck, as I've said many times. As instructors, we have to rethink EVERYTHING when we teach online, because the whole emotional/psychological situation is different.

I told my faculty colleagues from other departments that, if anything, *I* have a greater need to see my students, one that is directly relevant to the material being taught and not just to my psychological needs. I told them that when I teach the topic of magnetic fields, such as when a charged particle moves in a magnetic field, we find the direction of the force acting on the charged particle using the cross product, depicted by the so-called "right-hand rule". When I taught this in class, I could see how the students were lining up their right hands and how they "curled" their hands to finally look at their thumbs to show the direction of the force.
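For the record, the right-hand rule the students are practicing is just a mnemonic for the cross product in F = qv x B; a minimal sketch, with arbitrary numbers (only the directions matter), makes the correspondence explicit.

```python
import numpy as np

q = 1.0                          # positive charge (C), arbitrary
v = np.array([1.0, 0.0, 0.0])    # velocity along +x (m/s)
B = np.array([0.0, 1.0, 0.0])    # magnetic field along +y (T)

F = q * np.cross(v, B)           # F = q v x B
print(F)                         # [0. 0. 1.] -> force along +z, as the right hand predicts
```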

So here, there is a direct and academic need to be able to see what the students are doing, not simply something for my benefit. Thus, if anyone has a greater claim to wanting to see the students during a lesson, I argue that it would be me. Yet, I make no such requirement of the students. I told them that if they want me to verify that they are doing the right-hand rule correctly, they should consider turning on their cameras, and that was it. I don't bemoan the fact that I can't see my students, or that it somehow feels "lonely" or as if I'm talking to nobody. It isn't about me. It is about the students!
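As an aside, the direction that the right-hand rule gives is simply the direction of the cross product in the magnetic force, F = qv x B. Here is a minimal sketch in Python (the numbers are made up for illustration, not taken from any assignment) that students could use to check their right-hand-rule answer:

    import numpy as np

    q = 1.6e-19                        # charge of a proton, in coulombs
    v = np.array([2.0e5, 0.0, 0.0])    # velocity along +x, in m/s (made-up value)
    B = np.array([0.0, 0.5, 0.0])      # magnetic field along +y, in tesla (made-up value)

    F = q * np.cross(v, B)             # magnetic force F = q (v x B)
    print(F)                           # nonzero only in the +z component, matching the right-hand rule
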

I think that a lot of people do not realize the extra and unique challenges of teaching STEM subjects remotely/online, and this includes non-science administrators and faculty members. STEM faculty members should make their voices heard more often, and be involved in the relevant committees so that we don't get left out in course design, etc. Otherwise, a lot of things that they think will work, won't for the courses that we teach.

Zz.

Saturday, July 24, 2021

Keeping Online Teaching Techniques For In-Person Classes

Fall semester 2021 is looming, and many schools are going back to in-person, face-to-face classes, at least here in the US. The pandemic that forced us to teach remotely also forced many of us to hone our skills as instructors and adapt them to the online environment. I mentioned earlier how I decided to enroll in an online program to get certified by my institution as a qualified online instructor. I had no intention of becoming an online instructor, but I took those courses because, whether I liked it or not, and whether I wanted to or not, I was going to be delivering my lessons online, and I needed to learn the skills and methodology to do it well. And I did, for the next 3 semesters (including summer).

The thing is, my training as an online instructor may actually have improved my overall skill as an instructor, even for an in-person setting. There are many important lessons that I learned from that training program, and I fully intend to use many of them when I go back to in-person teaching. What prompted this post was a discussion I had with a few of my colleagues on whether we would use any of what we learned from the online training courses when we go back on-site. All of us unanimously said yes.

So here, I want to outline a few of the things that I had used in my online/remote classes that I will continue to use in my in-person classes.

  1. Remote office hours. God yes! I no longer see the need to have specific office hours at a specific location. Oh, I am still going to publish one office hour on-site, but the other office hour will be via Zoom, which students can drop in on remotely. Not only that, I'm going to make it very flexible for students to make an appointment to see me via Zoom at almost any time of the day if I'm available. Students are no longer restricted to seeing me only when they are on campus.
  2. More extensive use of the Learning Management System (LMS). Before the shutdown, I had been using the LMS to record grades, to post announcements and the course syllabus, to upload my lecture notes, and for the students to submit their lab documents, etc. Now, however, I will make even more use of the LMS after what I have learned from the online courses. I will continue to use the discussion feature, although not as extensively as I did for the online classes. I find that students who are shy or who do not actively participate in person often make more use of the discussion feature to participate in the class. This discussion feature could be an extension of the labs, where students get to give their opinion on the experience of a particular experiment, or on what they expected or didn't expect.
  3. Related to the LMS, I will keep the design and layout of the LMS page based on what I've learned from the online courses. This was the biggest cosmetic and design change that I have adopted. Having a weekly module that contains EVERYTHING that a student needs to do and finish that week makes so much more sense now. The surveys that I have done seem to indicate that this is highly successful and that students were able to navigate the LMS and find everything that they need easily.
  4. Before the pandemic shutdown, I was running a flipped classroom, so I was already making use of online lessons, videos, etc. I was also using clickers for polls and a form of peer instruction during my lessons. I adapted this for my online lessons, so this won't change when I go back to in-person classes. However, I think that I want to add a lot more video lessons that students can use as additional resources when they are out of class. It just means that even when they are not in class, they will be able to review the lessons of that week via other means and other sources.
  5. One of the most important aspects of online courses is student engagement. It is crucial because it is one of the most difficult things to accomplish in online classes, and yet it has been shown to be an effective means to convey the lesson and get a student interested in it. This is not that big of a deal for in-person classes because, well, the students are there and you can engage them in many ways in class. However, I want to formalize this engagement even when it is easier to do during in-person classes. Before the pandemic, I was already running what I would call "Studio Physics" classes, where the students have frequent discussions with other students, and even perform demonstrations of certain physics phenomena themselves as part of the lesson. I want to increase this even more. While I had used web applications such as the ones from PhET, I want to use more of these so that they become ongoing "projects" for students to investigate outside of class hours. During the past year and a half, I found a lot more websites offering physics simulations, "labs", and other applications that I used in my online courses. Many of them would be very useful to assign to students even in an in-person class. Again, a lot of this could be part of their discussion task, where they discuss and report what they learned or discovered while using these web apps.
  6. Clearer, more extensive, and more detailed feedback. In an online course, student and teacher often do not interact live, or often at all, and feedback does not come as quickly as it does with in-person interaction. So feedback on something, especially on an assigned and graded task, must be as clear and complete as possible the first time. This includes the tone of the feedback; we were told to be mindful of how the feedback might be perceived by the student. While this may not be as crucial in an in-person environment, I still intend to make full use of the feedback feature on the LMS to give a detailed assessment of a student's work. I think that having it in writing makes it more concrete and gives the student a clearer and more permanent evaluation, especially if it involves something done online such as the discussion forum.

I'm glad that I got the training that I needed to be an online instructor, even though I have very little desire to be one. The skills and techniques that I learned were valuable, and in some ways, I hope they made me a better teacher both during this remote-learning phase and when I go back to in-person instruction. We shall see if this is the reality for in-person instruction soon enough.

Zz.

Monday, January 11, 2021

Physics Labs At Home

I've made several posts on various virtual experiments that may be done in conjunction with the standard physics courses. While many of these are adequate, nothing beats a physical experiment that requires actual observation and measurement in person.

This paper lists a few experiments that a student may be able to do at home using items found around the house. Since almost everyone has a smartphone, there are certainly many activities that can be done using such devices. I've asked my students to install a sky-viewing app on their smartphones to be able to track planets, stars, and other celestial bodies. We have also used various apps that make use of the accelerometer in the phone to measure acceleration. I also have an app called "Gauges" (iOS) that allows you to use your smartphone as an altimeter, speedometer, barometer, accelerometer (of course), and magnetometer, and to measure sound level and luminance. I am in the middle of designing a few "in class" (and now, "at home") activities using these capabilities.
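Many of these sensor apps can also export their readings, which opens up simple data-analysis exercises. As a rough illustration, here is a minimal Python sketch that averages the vertical acceleration from an exported CSV file; the file name and column names are assumptions for illustration, since every app formats its export differently:

    import csv

    # Assumed export format: columns "time_s" and "az" in m/s^2 (check your app's actual format).
    times, az = [], []
    with open("accelerometer.csv") as f:
        for row in csv.DictReader(f):
            times.append(float(row["time_s"]))
            az.append(float(row["az"]))

    # Average vertical acceleration over the recorded interval
    avg_az = sum(az) / len(az)
    print(f"average a_z = {avg_az:.2f} m/s^2 over {times[-1] - times[0]:.1f} s")

A phone sitting at rest should read about 9.8 m/s^2 on its vertical axis, which by itself makes for a quick sanity check of the sensor.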

While virtual experiments are fine for the present unusual situation, I still believe that they are not the same as actually making the measurements and physically performing the experiment. So I'm trying to find activities that a student may be able to do on his/her own, or in collaboration with another student if he/she does not have all the necessary equipment. I want to incorporate this as part of the lesson rather than as an actual "experiment", so that the students can see the phenomenon that they are studying or about to study.

Have you designed simple at-home physics experiments for your students?

Zz.

Sunday, August 23, 2020

Teaching In Fall 2020

At the start of a new semester, I'm amazed at how much I've learned over the summer about online and remote learning. Taking the workshops on how to deliver such classes was definitely worthwhile. My perspective on this modality has expanded, and I think I am significantly better equipped than when this whole mess started.

The biggest takeaway from everything that I've learned is that taking what you do in a face-to-face (f2f) classroom and importing it wholesale into a remote or online class does not work very well, even when delivered synchronously. f2f and online are two completely different beasts, and the mechanics, pedagogy, and "psychology" are very different. Based on what I have learned about best-practice methods, student engagement and interaction are significantly more important to the success of online courses. So there has to be a conscious effort to design the course so that the students have to engage with the material, with the instructor, and with other students. This seems to be a very common and central pedagogy in almost everything that I've read about online learning.

Luckily, the tools that we use can promote such engagement. It appears that many Learning Management Systems (LMS) have upgraded and added features to their software over the summer. I know that the one I'm using seems to have been loaded with a lot more capabilities than I remembered from earlier in the year. I've employed a bunch of tools in the LMS for the asynchronous part of my class, and I've planned many activities for our synchronous sessions via Zoom (breakout rooms, polls, etc.). This includes a few "live lab sessions" in my physics class where the emphasis will be more on simple observation, extensive analysis, qualitative explanations, and a few quantitative calculations. We'll see how that works out.

I think that taking the workshops over the summer, and being an online learner myself in those two workshops, gave me a more accurate perspective from the point of view of an online student. Just finding things can be an issue, and you have no one to ask immediately. I used to arrange my LMS page in terms of modules. There's a module for quizzes, a module for homework, a module for exams, a module for labs, a module for lecture notes, etc. So if you want to find the document for Lab 5, you go to the lab module and search for the Lab 5 document. If you want to take a quiz, you go to the quiz module and take the quiz that's relevant for that week.

While on paper this all looks fine, when you actually go through something like this, you realize that you have to jump to several different modules to find all the tasks that you need to do for that week. I noticed that this is not how things were presented in any of the online workshops that I took. Instead, they organized everything weekly. In Week 1 of the course, these are all the things you need to do. Then you go to Week 2, and these are all the things you need to do, etc. Each week, the lecture notes, the quiz, the homework, etc. are all there, in the same folder or module. You don't have to go jumping around to different modules to find all the things you need to do. This is such a simple thing to do, and yet it makes for a much more intuitive approach once you have gone through it as an online student.

In the end, I'm not as stressed out about my classes being completely online this semester as I was in Spring. I think I'm better prepared for it, and better equipped. And boy, I hope it shows in the students' performance this semester. Stay tuned....

Zz.

Friday, July 10, 2020

Simple Way To Help Your Instructor During Remote/Online Learning

Dear Student,

Many of you may continue having some form of online or remote learning this coming Fall 2020, and maybe even for the next few semesters. Even if you get back to face-to-face instruction, you may still need to communicate with your instructor electronically. So the advice that I'm going to give you will be especially applicable if you require remote assistance from your instructor.

In STEM subjects, especially math, physics, engineering, etc., face-to-face instruction has an inherent advantage over remote learning because of the simple ability to write and sketch. In physics especially, when we approach or discuss a concept or a problem, a sketch is often required to set up the situation. This is usually followed by writing down the relevant mathematical equations, and then grinding out the math to solve the problem. In a typical class situation, these are done on paper, which is the simplest and quickest way to do such a task.

Doing this during remote learning can be challenging. Most of you may be required to show your work, or to show what you have attempted if you need help from your instructor on a homework problem, or even during quizzes and exams. What most students end up doing is pulling out their smartphones, snapping a photo of the page with their work, and then e-mailing the image file to their instructors.

I have been on the receiving end of such submissions, and in at least half of the cases, it was very difficult to read and decipher the image that I received. Most students do not inspect the images for legibility. I often receive images that are dark, with poor contrast, and often with shadows that make some parts much harder to read than others. In addition, the angle at which the images were captured can also be rather odd, because most of these were not captured straight on.

Here, I'd like to make a suggestion on how you, as a student, can help your instructor by submitting a clearer and more legible image, using the same equipment that you already have. This is also to your advantage because, in the case of a quiz or an exam, if your instructor cannot read or understand what you wrote, you probably will not be given credit for that work.

OK, so here is what you should do. Install a scanner app on your smartphone or on the mobile device that you frequently use. There are many scanner apps available on the iOS and Android platforms. Many of these are free or have minimal cost. The next time you need to send a snapshot of your work, use the scanner app instead of the standard camera app. It makes a tremendous difference, and I'll prove it to you here.

I have an iPhone, and the scanner app that I have is called ClearScanner. The free version of this app has limited capability (no OCR), but it is still sufficient for what you will need it for (I have the full, paid version).

The first image is a page filled with handwritten work. I took it using the iPhone's standard camera app. I did not make any edits to the image quality, didn't do any cropping, etc., other than changing the image file size. This is what I got.


Now, I hate to say this, but I will. Most of the submissions that I received from my students were not even half as good as this. But let's go with this in any case. Already you can see that, as someone who has to figure out what has been written, reading this may not be impossible, but you are asking that person to do quite a bit more work. The low lighting and the crazy angle at which the image was taken make it a challenge to read this accurately.

Compare this to the image of the same page taken with the scanner app. I took it from the same angle, and under the same lighting conditions. The only difference is that the scanner app asked me to confirm the correct boundary of my document in the image; in this case, the boundary is the edge of the paper. Once I confirmed that, the app took over and produced this image:


The difference is night and day. Not only is the writing clearer here, but the crazy angle is also gone. The app corrects for the angle and presents it as if you scanned it on a flat-bed scanner. And all this with hardly any more effort than taking a regular photo.

I will show another example. In the photo below, another common "feature" of images that I receive can be seen, i.e. shadowing.


Once again, it doesn't look bad in this image, but the ones that I have received were a lot worse than this. I had to do my own image editing to be able to see clearly the writing that was in the shadow.

So how does it look using the scanner app? Again, I didn't do anything to the image other than confirm the boundary of the document. This is what I get:


It looks almost identical to the previously scanned page, with the shadow removed. I assure you that this is a different image than the first scanned image above.

This is such a simple thing to do, with hardly any additional steps or effort, and yet it produces such a remarkable difference in clarity. Which one, do you think, would your instructor prefer to receive?

Students in my class during the previous semester were told to install such an app if their devices had the capability. It made a tremendous difference in the quality of the documents that they submitted. Many of them also told me that the app was useful for capturing notes written on the whiteboard in class. So you may find that it is one of those handy and useful apps that you didn't realize you needed until you had it.

Zz.