There are many, many small misconceptions which are extremely widespread and just refuse to die. Here are some of them.
Almost everybody has heard this one, and almost everybody believes it without the slightest of doubts: Glass is a liquid with a really high viscosity. Because of this, when a piece of glass is subject to gravity for long enough, it will (very) slowly flow down.
Well, no. Whether glass is officially classified as a liquid is a matter of definition (AFAIK it isn't), but it certainly doesn't flow. There is plenty of evidence to the contrary.
People's response: "But glass panes on old church windows are thicker at the bottom."
Yes, that's the whole (and only) source of the misconception. Regardless of all the evidence to the contrary, the window panes are considered to be the ultimate proof that glass is a liquid and flows. (People also get into a difficult position when they are asked about glass artifacts which are thousands of years older than those glass panes and do not show any sign of flowing whatsoever.)
For some reason these people just can't understand the notion that a phenomenon such as this one might have some other explanation. That the "glass is a liquid" explanation may actually be wrong. Just because you can't think of any other explanation doesn't mean there is none.
Hundreds of years ago the glass industry was not as advanced as it is today, and the method by which glass panes were made left them with uneven thickness: one edge of the pane would be thicker than the opposite edge. When the panes were installed in windows, the thicker edge was usually placed at the bottom for stability.
Yeah, space is really cold. Absolutely freezing. Anything put into outer space (such as a human) will completely freeze in seconds.
Well, no, space is not cold. Vacuum has no temperature. In fact, vacuum is a rather good thermal insulator because it doesn't transmit heat by conduction or convection.
Most people get this one wrong. Even knowledgeable people who get all the other facts about space right (such as people not exploding when they are in vacuum) get this one wrong.
There are three basic mechanisms of heat transfer: conduction, convection and radiation. A piece of meat in a freezer will freeze in a matter of hours mainly because of conduction and convection; radiation plays only a minimal role.
In vacuum there is neither conduction nor convection. The only way to transfer heat is radiation, and contrary to what many people might think, objects don't radiate heat away very well (unless they are very hot).
If there are no heat sources nearby, a human body in vacuum will indeed freeze eventually by radiating its own heat away, but this happens very slowly. It will certainly not happen in a matter of seconds, as depicted in some movies.
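For the sake of illustration, here's a back-of-the-envelope estimate of just how slow this is, using the Stefan-Boltzmann law with some assumed round numbers (the surface area, emissivity and heat capacity are ballpark guesses, not measurements):

```python
# Back-of-the-envelope estimate of how fast an unprotected human body
# would cool in deep space by thermal radiation alone.  All numbers are
# rough assumptions for illustration, not measurements.

SIGMA = 5.67e-8               # Stefan-Boltzmann constant, W / (m^2 K^4)
emissivity = 0.95             # assumed emissivity of skin in the infrared
area = 1.8                    # assumed radiating surface area, m^2
t_body = 310.0                # body temperature, K (about 37 degrees C)
t_space = 3.0                 # effective background temperature, K
heat_capacity = 70.0 * 3500.0 # assumed 70 kg body at ~3500 J/(kg K)

# Net power radiated away (Stefan-Boltzmann law)
power = emissivity * SIGMA * area * (t_body**4 - t_space**4)
rate = power / heat_capacity  # initial cooling rate, K per second

print(f"radiated power:       {power:.0f} W")         # roughly 900 W
print(f"initial cooling rate: {rate * 60:.2f} K/min")  # roughly 0.2 K/min
# The rate drops further as the body cools (the power goes as T^4), so
# freezing solid takes many hours, not seconds.
```

Even this estimate is generous, since it ignores clothing, a spacesuit, sunlight and any residual metabolic heat; the point is simply that the timescale is hours, not seconds.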
Many sci-fi movies take place in the Solar System, where there's a huge radiator of heat nearby: the Sun. A human body in interplanetary space might in fact never freeze at all, because the Sun constantly keeps it warm, unless it is very far away from the Sun.
One of the biggest problems in designing spacecraft (even theoretical interstellar ones) is not how to keep them warm but, on the contrary, how to keep them cool. Engines, electronics and so on all generate heat, and there's nowhere for this heat to dissipate to. Vacuum is a good thermal insulator, so it's very hard to get rid of all that heat. Just putting a big thermal sink on the outer hull of the spacecraft would not be enough, because it would radiate the heat away too slowly.
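To get a feel for the problem, here's a rough sketch (again with assumed numbers: a 10 kW heat load and a radiator at roughly room temperature) of how much dedicated radiator area would be needed to dump waste heat by radiation alone:

```python
# Rough sizing of the radiator area needed to reject waste heat purely
# by radiation.  The heat load and radiator temperature are assumed,
# illustrative numbers.

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W / (m^2 K^4)
emissivity = 0.9         # assumed emissivity of the radiator surface
t_radiator = 290.0       # assumed radiator temperature, K
waste_heat = 10_000.0    # assumed waste heat to get rid of, W (10 kW)

flux = emissivity * SIGMA * t_radiator**4       # W radiated per m^2
area = waste_heat / flux

print(f"radiated flux: {flux:.0f} W/m^2")        # ~360 W/m^2
print(f"radiator area needed: {area:.0f} m^2")   # ~28 m^2
```

A few hundred watts per square metre is not much, which is why real spacecraft need large, dedicated radiator panels rather than just a heat sink bolted to the hull.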
One thing which might add to the confusion is the so-called cosmic microwave background radiation: the entire Universe is filled with black-body radiation corresponding to a temperature of approximately 3 kelvin. Many people get confused by this and believe it means that anything put into deep space will quickly freeze to 3 kelvin.
The background radiation has nothing to do with how fast something will freeze in space. In fact, quite ironically, the effect is the opposite: Background radiation adds to the temperature of the object (in other words, it's another source of heat), it doesn't freeze it. (Of course if it's the only source of heat, it will, naturally, not be enough to keep the object warm. It just means basically that the object has a 3 kelvin heat source around it.)
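A quick calculation (with the same assumed emissivity and surface area as above) shows just how little heat that 3 kelvin background actually delivers to a body-sized object:

```python
# How much heat the ~3 K cosmic microwave background actually delivers
# to a body-sized object (same assumed emissivity and area as above).

SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W / (m^2 K^4)
emissivity = 0.95
area = 1.8          # m^2
t_cmb = 2.7         # K

absorbed = emissivity * SIGMA * area * t_cmb**4
print(f"power absorbed from the background: {absorbed * 1e6:.1f} microwatts")
# -> about 5 microwatts: a (minuscule) source of heat, with no relevance
#    whatsoever to how fast anything cools.
```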
Still not convinced? Well, consider this:
The Spitzer Space Telescope is an infrared observatory that launched in 2003. Since it detects infrared light, its instruments need to be cooled, or else they will emit infrared radiation which would interfere with the images. To do this, liquid helium was used to cool these instruments to almost absolute zero.
Now, if space was really, really cold, and everything put into space would freeze in seconds, why would they need to use liquid helium to cool off the instruments? Wouldn't the coldness of space be enough?
No, because as said, vacuum is a good insulator and a very poor way of cooling anything, especially instruments which produce heat of their own. That's why the liquid helium was needed to cool them down.
(Unfortunately by now the liquid helium in the Spitzer has boiled away, so most of the detectors are useless due to the heat, except for a couple which are not affected so much by it and can run by simply being shaded from direct sunlight.)
In movies, a defibrillator is routinely used to restart a heart which has stopped completely. No. A defibrillator is used to stop a heart which is fibrillating (hence its name). It's not used if the heart has stopped completely (ie. isn't even fibrillating).
If the heart starts fibrillating, its muscle fibres basically get "out of sync" and start contracting erratically and randomly. This is a life-threatening situation which can kill a person in mere minutes (because blood flow stops).
What the defibrillator does is deliver a large electric shock to the heart muscle to stop it from doing that. Once all the fibres have been stopped at the same time, they can (hopefully) start beating in sync again. Usually regular CPR is performed after the defibrillation to get the heart beating again (something you seldom see in movies).
I really can't understand why the misconception that the light mill (Crookes radiometer) is driven by radiation pressure, ie. by photons pushing on the vanes, is so persistent regardless of all the evidence to the contrary.
The correct (and far simpler) explanation is that the sunlight heats the black panel more than the white panel, and consequently the gas near the black panel gets heated more as well, causing it to expand, rotating the mill.
It's that simple. Let it rest already, will you?
The experiment goes like this: Weigh an empty balloon, then fill it with air and weigh it again: it will be heavier. Thus it demonstrates the weight of air, right?
No, it doesn't. That's just as ridiculous as putting an empty balloon in water, weighing it (under water), then filling the balloon with water and weighing it again. In this case there will be no weight difference. Does that mean water doesn't have any weight?
What this experiment actually showcases is the compressibility of air: the air inside an inflated balloon is at a slightly higher pressure than the surrounding air, so the balloon contains a bit more air (by mass) than it displaces. Buoyancy cancels everything else, and only that small excess shows up on the scale.
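Here's a minimal sketch of the numbers involved, assuming a 5-litre balloon inflated to about 5% above atmospheric pressure (both numbers are just illustrative guesses):

```python
# Why an inflated balloon weighs (slightly) more on a scale: the air
# inside is compressed, so it is denser than the air the balloon
# displaces.  All numbers are illustrative assumptions.

R = 8.314             # gas constant, J / (mol K)
M_AIR = 0.02897       # molar mass of air, kg / mol
T = 293.0             # room temperature, K
P_OUT = 101_325.0     # ambient pressure, Pa
P_IN = 1.05 * P_OUT   # assumed ~5 % overpressure inside the balloon
V = 0.005             # assumed balloon volume, m^3 (5 litres)

def density(pressure):
    """Ideal-gas density of air at the given pressure."""
    return pressure * M_AIR / (R * T)

mass_inside = density(P_IN) * V      # air carried inside the balloon
mass_displaced = density(P_OUT) * V  # air pushed aside (buoyancy)

extra_grams = (mass_inside - mass_displaced) * 1000
print(f"extra weight on the scale: {extra_grams:.2f} g")   # ~0.3 g
# The scale only sees the *difference*: the ~6 g of air inside is almost
# entirely cancelled by buoyancy, just like the water-filled balloon
# weighed under water.
```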
In some parts of the world (seemingly especially in the United States)
there's a widespread misconception that the end of the compass which points
towards North is actually the minus (ie. South) end of a magnet,
while the end which points towards South is the plus (ie. North) end.
In other words, a compass only labels them "N" and "S" to designate the
direction they point, not according to the true polarity of the magnetic needle.
Of course this is completely wrong, as anyone can easily test with a
regular magnet which has its plus (N) and minus (S) ends marked: Make it
float on water and see which end points North. The plus end will
end up pointing North. That means that the magnetic pole near the geographical
North pole of the Earth is a minus (South) magnetic pole, and thus
attracts the plus (North) end of a magnet.
However, this entry in this list is not really about that misconception,
which should be rather clear to anybody. This entry is more a rant about
misleading terminology.
For example, read the
Wikipedia article
related to the subject. It constantly and confusingly uses a diverse set of
similar terms to describe magnetic poles. It uses and freely mixes together
terms like "North Magnetic Pole", "North Geomagnetic Pole" and "magnetic field
north pole", without clearly explaining what is meant with them.
For example, this kind of sentence is rather confusing: "As described later
in this article, the North Magnetic Pole is physically a magnetic field south
pole. The North Magnetic Pole should not be confused with the lesser known
North Geomagnetic Pole, described later in this article." (And no, the
explanation "later in this article" does not make it any clearer.)
So is it the magnetic north pole, or the magnetic south pole? What's the
difference between a "magnetic pole" and a "magnetic field pole"?
Why use such confusing terms? Even if the article is not technically
incorrect, it's really confusing and can only further misconceptions. It's not
surprising that everything on the discussion page for that article is related
precisely to the confusing terminology.
Why not keep it plain and simple: The end of the compass needle which points
towards North is the magnetic North of the magnet in the needle. The magnetic
pole close to the geographical North pole of the Earth is Earth's magnetic
South pole. Period. No more terms are needed. No need to confuse an already
confusing subject with tons of similar and ambiguous terms which randomly mix
the words "North" and "South".
Sentences like "the magnetic north pole is actually the magnetic
south pole" in particular should be avoided. They make absolutely no sense
and are self-contradictory. Just say "magnetic south pole", period. No need to
confuse things.
The subject of General Relativity (GR) is a whole lot more complicated than
what popular science depicts. Often people have a notion of GR which is too
simplified, and then have all kinds of misconceptions based on this limited
understanding. Curiously, these misunderstandings can happen even to
scientists themselves, and can sometimes be found in scientific publications.
Superluminal motion (or, more precisely, the distance between
two objects growing faster than the speed of light in vacuum, c) is
one of the most often misunderstood notions in GR.
The simplified, "popular physics" notion is that nothing can move faster
than c, period. Thus when people are told eg. about the notion of
the so-called cosmological horizon, they immediately reject the idea
because it goes against their understanding of GR.
The notion of the cosmological horizon is that the size of the
observable Universe is smaller than the whole Universe because,
due to the expansion of the Universe,
there are parts of it which are receding from us faster than
c. The parts which recede from us faster than
c cannot be observed in any way because nothing can reach us from
those parts. The maximum observable distance (ie. the part of the Universe
which recedes from us exactly at velocity c) is the absolute limit
for the observable Universe, and thus forms an impassable horizon for us.
There's no way we can detect anything past that point. If there's something
farther away than this distance from us, we cannot know about it.
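To put a rough number on where that horizon lies, here is a quick estimate using Hubble's law (v = H0 × d) with an assumed present-day Hubble constant of about 70 km/s per megaparsec; this is only a ballpark figure, since the true size of the observable Universe depends on the full expansion history:

```python
# Rough estimate of how far away something has to be before it recedes
# from us faster than light, using Hubble's law v = H0 * d.  The value
# of H0 is an assumed round number.

C = 2.998e5          # speed of light, km/s
H0 = 70.0            # assumed Hubble constant, km/s per megaparsec
LY_PER_MPC = 3.262e6 # light years in a megaparsec

d_mpc = C / H0                    # distance at which v reaches c
d_gly = d_mpc * LY_PER_MPC / 1e9  # the same, in billions of light years

print(f"recession reaches c at about {d_mpc:.0f} Mpc "
      f"(~{d_gly:.0f} billion light years)")
# -> roughly 4300 Mpc, about 14 billion light years.
```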
Most people immediately reject this possibility because it goes against
the notion that nothing can travel faster than c, that something
moving away from us faster than that would break the laws of physics.
Likewise people reject the idea that in the initial stages of the Big Bang
the Universe could have expanded at an exponential rate, way faster than
c. Again, they think that this is impossible according to GR because
nothing can travel faster than c.
However, that notion is an oversimplification of GR.
What GR tells us is that nothing can travel between two points
in space faster than c. In other words, it's impossible for anything
(such as an object or a particle) to go from point A to point B faster than
c. This is often formulated by saying that information cannot
be transferred between two points faster than c.
However, GR does not prohibit the distance between two points from
growing faster than c. On the contrary, GR actually predicts
this possibility. This may at first sound like the same thing, but there
is a subtle but extremely important difference between the two.
The distance between two points in space may grow faster than c,
but this does not break the GR rules because there is nothing
travelling between the two points at that speed. There is no
exchange of information (eg. in the form of particles) between the two points.
(In fact, if two points are receding from each other faster than c,
it becomes completely impossible to transfer anything between these two points.
Effectively there's an impassable horizon: From the perspective of the first
point the second point is behind an impassable horizon and thus no knowledge
about it can be retrieved in any way.)
This property of GR thus does not allow superluminal travel. We are still
bound to the upper limit of c. We cannot reach the stars faster than
light. However, it is possible for some star to recede from us faster than
c.
(The reason why it is possible for two objects to recede from each other
faster than c is that the geometry of space is not static.
The expansion of the Universe causes this geometry to constantly change. In
vernacular terms, we can say that "new space" is formed between the two
points which are receding from each other. In other words, the geometry of
the space between the two points is changing so that the distance between
the two points grows faster than c. As said, this is actually
something that GR predicts rather than forbids.)
There are actually situations other than the expansion of the Universe
where distances grow faster than c. For example, an
ergosphere is an oblate spheroidal region around a rotating black hole,
outside its event horizon, where objects can move faster than light (due to
an effect called frame-dragging). Again, this is something
predicted by GR, rather than prohibited by it. And again, this
phenomenon only affects distances, but cannot be used to transfer information
between two points faster than c.
Yet even with all these explanations the misconceptions persist. People just
refuse to understand that GR is a bit more complicated than how it is presented
in popular science (or one could even say, popular culture).
In some countries, such as the USA, rat extermination is a huge business.
Millions and millions of dollars are spent on it each year.
Sadly, this is mostly a scam: The rat extermination business rides on the
widespread belief that rats are a pestilence, that they spread diseases and
are dangerous.
This misconception comes mainly from the bubonic plague pandemic in the
Middle Ages, which was back then assumed to be caused by rats, and produced
a massive anti-rat hysteria for the centuries to come. While it's not
absolutely certain what exactly caused the bubonic plague, it's widely
accepted that it was indeed spread by rats, although indirectly: More
precisely, it was spread by fleas, and since rats were among the most widespread
warm-blooded animals around, they indirectly helped spread the disease by
carrying the fleas around.
Nowadays, however, you are much more likely to get flea bites from the fleas
on dogs and cats than from the ones on rats. And there is no pandemic being spread
by fleas anymore. Rats are completely innocent in this regard.
Also, contrary to popular belief, rats are less likely to carry rabies than
many other mammals. While being bitten by a rat should always be taken
seriously, this is not because it is specifically a rat, but because it's an
animal bite. Any animal bite, be it from a rat, a squirrel or even a dog, should
be taken equally seriously. (In fact, a dog bite can actually be more dangerous
than a rat bite, as dog saliva can contain certain micro-organisms which can
be deadly to humans.)
This is one of those phenomena where everyone just "knows" from popular
culture that rats are "dangerous", even though there is just no evidence.
When was the last time you heard about any epidemic spread by rats? Heck,
when was the last time you heard of anyone catching a disease from a rat?
While there certainly are individual cases, they are much rarer in the modern
world than diseases spread by other animals. You are much more likely to catch
a disease from a dog than from a rat (if nothing else, because people are
often a lot more exposed to dogs than to rats).
On the contrary, in many cases rats can actually help stop diseases
from spreading, by disposing of animal corpses and other garbage. In
other words, rats are probably more beneficial than harmful to
human societies: They keep our environment cleaner (of animal corpses and
other garbage) than it would be without them. Getting rid of rats
completely would actually be a disservice to humans.
There are way more dangerous spreaders of disease, such as some species
of fly.
Of course this doesn't mean that you should pet a wild rat you see in a
sewer, but the danger posed by rat infestations is greatly, greatly exaggerated
(and abused for money). Rats are mostly innocent and don't spread any more
diseases than other animals, nor are they especially dangerous when left alone.
I don't really understand where the urban legend that the human body is over
90% water originated, but it's surprisingly common (I have seen all kinds of
numbers quoted, including extremes like 99%). The real figure is about 50 to 60 percent.