Wednesday 29 April 2009

"The only source of knowledge is experience." Albert Einstein


My mother didn't raise a fool you know.

If you tell me there are fairies at the bottom of your garden, I'm going to remain sceptical.
I need to see that kind of thing for myself.

I never thought I'd like sprouts. I was a sucker and fell for the old "don't knock it till you've tried it". Thus confirming what I already thought I knew.

I have, on a few occasions, burned myself. Not intentionally, of course, but accidents happen. This has given me a pretty solid understanding of the relationship between heat, touch and pain. Enough of an understanding that I no longer need to touch it to know it.

Pain is a tricky one though. Sure, the root of knowledge may well be in experience, but I'm willing to go out on a limb and take someone's word for it every once in a while. I guess that all depends how intent you are on knowing things for yourself. Some folk are just that bit more die-hard I suppose.

Cue American entomologist Justin O. Schmidt...

The Schmidt Sting Pain Index or the Justin O. Schmidt Pain Index is a scale rating the relative pain caused by different Hymenopteran stings. (Hymenoptera are a large order of insects including bees, wasps and ants.) Bet you can tell where this is going, huh?

His original paper, published in 1984, was an effort to categorise and compare the haemolytic properties (involving the destruction of red blood cells) of insect venom. For this paper, he created an index. It started at 0 for stings that are completely ineffective against humans, progressed through 2, a familiar pain such as a common bee or wasp sting, and finished at 4 for the most painful stings. In his conclusion, he offered some descriptions of the most painful examples... "Paraponera clavata stings induced immediate, excruciating pain and numbness to pencil-point pressure, as well as trembling in the form of a totally uncontrollable urge to shake the affected part." Now, that strikes me as a man talking from experience.

Schmidt has since refined his scale, culminating in a paper published in 1990 which classifies the stings of 78 species and 41 genera of Hymenoptera.

Again, Schmidt described some of the experiences in vivid detail:

  • 1.0 Sweat Bee: Light, ephemeral, almost fruity. A tiny spark has singed a single hair on your arm.
  • 1.2 Fire Ant: Sharp, sudden, mildly alarming. Like walking across a shag carpet & reaching for the light switch.
  • 1.8 Bullhorn Acacia Ant: A rare, piercing, elevated sort of pain. Someone has fired a staple into your cheek.
  • 2.0 Bald-faced Hornet: Rich, hearty, slightly crunchy. Similar to getting your hand mashed in a revolving door.
  • 2.0 Yellowjacket: Hot and smoky, almost irreverent. Imagine W.C. Fields extinguishing his cigar on your tongue.
  • 2.x Honey Bee and European Hornet: Like a matchhead that flips off and burns on your skin.
  • 3.0 Red Harvester Ant: Bold and unrelenting. Somebody is using a drill to excavate your ingrown toenail.
  • 3.0 Paper Wasp: Caustic & burning. Distinctly bitter aftertaste. Like spilling a beaker of hydrochloric acid on a paper cut.
  • 4.0 Tarantula Hawk: Blinding, fierce, shockingly electric. A running hair drier has been dropped into your bubble bath.
  • 4.0+ Bullet Ant: Pure, intense, brilliant pain. Like fire-walking over flaming charcoal with a 3-inch rusty nail in your heel.
OK....
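For the structurally minded, the scale above boils down to a simple lookup table. Here's a minimal sketch in Python, with the ratings transcribed from Schmidt's descriptions; the `pain_rating` helper and the default of 0 (his rating for stings that are completely ineffective against humans) are purely illustrative:

```python
# The Schmidt Sting Pain Index as a lookup table.
# Ratings transcribed from the descriptions above.
SCHMIDT_INDEX = {
    "sweat bee": 1.0,
    "fire ant": 1.2,
    "bullhorn acacia ant": 1.8,
    "bald-faced hornet": 2.0,
    "yellowjacket": 2.0,
    "red harvester ant": 3.0,
    "paper wasp": 3.0,
    "tarantula hawk": 4.0,
    "bullet ant": 4.0,
}

def pain_rating(species: str) -> float:
    """Return the Schmidt rating for a species.

    Unlisted species default to 0.0, Schmidt's score for stings
    that are completely ineffective against humans.
    """
    return SCHMIDT_INDEX.get(species.lower(), 0.0)

print(pain_rating("Bullet Ant"))  # 4.0
print(pain_rating("Butterfly"))   # 0.0
```

No "almost fruity" or "slightly crunchy" in the output, mind. Some things only Schmidt can supply.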

That static shock from walking across shag carpet and touching the light switch, I get. It's a pretty universal example. I'll even buy being asked to imagine W.C. Fields stubbing his cigar out on my tongue. I haven't experienced it but it's within my imagining. However, I for one have never mashed my hand in a revolving door. But, this could well be the man that has, purely in the name of science.

Anyway, I'm going to take this guy's word for it.

There are issues with this though. For starters, this is the kind of person that describes a Sweat Bee sting as "almost fruity" and a Hornet sting as "slightly crunchy". Not to mention that this is the type of selfless nutcase that gets himself stung so he can tell others how sore it is. I don't want to burst his bubble but I'm still going to try and avoid being stung anyway, no matter what it's by. And if I am stung, the knowledge that this crazy s.o.b. can empathise will provide little comfort. I'll even go out on a limb and admit that Schmidty can probably take more than me. Some folks are just a bit more into pain. I know this. I have cable TV.

Anesthesiologist Henry Knowles Beecher (1904 - 1976), while serving as an Army medical consultant on the Anzio beachhead, observed that soldiers with pretty serious wounds complained of pain much less than his postoperative patients at Massachusetts General Hospital. Beecher hypothesized that the soldier's pain was alleviated by his survival of combat and the knowledge that he could now spend a few weeks or months recovering in safety and relative comfort. The hospital patient, however, had been removed from his home environment and faced an extended period of illness and the fear of possible complications. Beecher argued that "the reaction component" made pain such a complex and individualized phenomenon that it could only be studied effectively in the clinical setting. Patients with real pain would not exhibit the same physiologic manifestations or the same responses to analgesics as experimental subjects, who knew that they were in no serious danger and that the pain would soon cease.

So, I guess Einstein was right. The only definitive source of knowledge is experience.

But can I take his word for it?

Monday 27 April 2009

"Knowledge is of no value unless you put it into practice." Anton Chekhov


"Theory" is quite interesting.

A theory is essentially an explanation. An analytic tool for understanding any given subject matter, based on abstract ideas about that subject matter.

Uh... Eh?

Well, we have this concept of theory somehow not belonging to the real world. It's something that exists conceptually, rather than in reality. I just want to chuck in the word hypothetical and leave it there. But that's where I started today...

So what do we know?

I've heard of Quantum Theory. This must be different to the driving theory test. I don't remember any of the highway code being based on the principle that matter and energy have the properties of both particles and waves.

So, theory's based on informed guesswork. An educated guess about the properties of matter and energy lets you build an elaborate framework of more guessing on top and then call it a theory. A bit like guessing how long it takes a car to brake from 30mph if it's icy I suppose. The logic is sound enough that we feel confident in calling it a theory.

Sometimes we use the term theoretical in place of hypothetical to describe a result that is predicted by theory but has not yet been adequately tested by practical experiment. It is not entirely unheard of for a theory to produce predictions that are later found to be incorrect. By inference, a prediction proved incorrect by experiment demonstrates that the hypothesis is invalid. So does the buck stop with the theory? Or was the experimental conjecture wrong, and the theory never really predicted that hypothesis at all? This is pretty deep you know.

It feels to me a bit like doing a jigsaw puzzle and finding a few pieces missing. Even without the picture on the box, it's not really a guess that you need a blue piece to fill in that bit of sky. Also, if you've got pieces with blue sky, sea and sand, the theory that you're making a beach scene is a pretty sound one. But how many little pieces do you think you'd actually need to form the basis of this theory? One piece of sea, one of sand and a nice bit of blue sky would make a solid case for beach, right? Well, there's not a red piece or a bit of sail but suppose the focus of the picture is a big red boat with white sails. Our theory may still be correct; this is the beach. But we've missed the key point. And I'm in danger of doing the same now.

Theories are great, in theory. They move thought on, past the limitations of actual experience. You run with one until you fill in the gaps and it stops being a theory. Like when we were thinking about ignorance, the gaps are a good thing. They inspire us to go on and learn. The beef, as usual, is with false knowledge. This is when theories can be damaging. Either they go on long enough that they're taken as fact themselves, or they are cited as evidence, forming the basis of some false knowledge.

The term "theory" is often used colloquially to refer to any explanatory thought, even fanciful or speculative ones. However, in scholarly use it is reserved for ideas which meet baseline requirements about the kinds of observations made, the methods of classification used, and the consistency of the theory in its application among members of that class. Either way, they exist to be scrutinized. They must be continually revised to conform to new observations, by restricting the class of phenomena the theory applies to or changing the assertions made.

Sometimes, scholars set aside a theory because there is no way to examine its assertions analytically. These may continue on in the popular imagination until some means of examination is found which either refutes or lends credence to the theory. But, it does not simply turn into fact just because it's a long-standing theory. Whether it's about relativity or evolution, theories have to be a work in progress.

Wednesday 15 April 2009

"One of the advantages of being disorderly is that one is constantly making exciting discoveries." - A. A. Milne


The Epistream flippantly glossed over a couple of major discoveries in our last installment. This was to emphasise the obscure roots of ordinary items and was quite intentional.

However, the disregard for those breakthroughs has left me wondering whether or not the case had been closed by way of false knowledge. To address this, let us look at the discovery of penicillin.

Under a microscope, the spore-bearing arms of the mould resemble tiny paintbrushes. The Latin for writer's paintbrush is penicillum, which is where we get the word pencil too.

So, Penicillin. Discovered by...?

Wrong.

Sir Alexander Fleming did receive a Nobel Prize for the re-discovery of its antibiotic qualities in 1945. Now, the top of the class will no doubt have noticed the use of that critical prefix. The semantic argument on one's ability to re-discover, gripping as it is, will be saved for another time. The real beef here, is mould.

The earliest documented use of mould as an ingredient in healing ointments comes from the Bedouin tribes of North Africa. For over a thousand years they have exploited the antibiotic qualities of the mould that grows on their donkeys' harnesses.

1897 witnessed another triumphant act of re-discovery as French army doctor Ernest Duchesne observed Arab stable hands using the mould from damp saddles to treat saddle sores. Exactly where the stable hands re-discovered this technique from is unknown. However, Duchesne identified the mould as Penicillium glaucum and used it to cure guinea pigs (actual guinea pigs, rather than miscellaneous experimental subjects) of typhoid. He also studied its effect on the E. coli bacteria in what was the first clinical testing of the antibiotic now known as Penicillin. This research formed the basis of his thesis, in which he advocated further study into its antibiotic implications. Sadly, the Institut Pasteur never acknowledged his work and Duchesne died of tuberculosis in 1912. Ironically, a disease that antibiotics would later go on to cure.

Back to Sir Fleming...

Sir Alexander Fleming recounted that the date of his breakthrough re-discovery was the morning of Friday, September 28, 1928. Another example of fortuitous, accidental discovery, it occurred in his laboratory, nestled in the basement of St. Mary's Hospital, now part of London's Imperial College. Fleming noticed that a petri dish containing Staphylococcus culture had been mistakenly left open and was contaminated by blue-green mould. Around it was a halo of inhibited bacterial growth. He concluded that the mould must be releasing a substance that was lysing the bacteria. He grew a pure culture and (re-)discovered that it was a Penicillium mould. At the time, he identified it as Penicillium rubrum but Charles Thom recognized it to be Penicillium notatum many years later.

Anyway, if you'd like to try your hand at a bit of re-discovering, (and who knows, you might just get a Nobel prize for it too) then here are some likely places to find penicillin...

  • Brie
  • Camembert
  • Danish Blue
  • Gorgonzola
  • Limburger
  • Roquefort
  • Stilton

Now, I'm off to do a bit of intrepid re-discovering myself.

Tuesday 14 April 2009

"The beginning of knowledge is the discovery of something we do not understand."
- Frank Herbert


There is just so much out there that's seemingly beyond understanding. Human behavior being top of the list. Maths too. And the natural world. It's all a massive jumble of interesting, thought-provoking, question-raising and frustrating facts and theories. But, this environment is a glasshouse for knowledge. A cognitive grow-bag, sprouting runners of investigation and contemplation.

Like the great swathes of empty space on Marlow's map of the African continent, soon to be over-run with the inroads of civilisation, the more that we discover, the more questions we unearth, the more we're drawn into the Darkness.

So, Frank's right. Knowledge starts with the discovery of something we don't understand. Something that our inquisitive ignorance gives room to grow, rather than truncating it with false knowledge.

There are the scientific discoveries, like X-rays and Penicillin, synonymous with the auspicious stumblings of the technically minded. These are the kind of breakthroughs that pose the big questions and add to society's collective knowledge. But you'll kind of, sort of, pretty much know about that already. Such is their position in modern folklore. So check them out.

Now, as you tip up the bag and let the last few crispy, salty morsels fall into your mouth, I bet you think your understanding on the subject is just about total. The empty bag, a fitting symbol of the limit of snack time discovery.

Well, it wouldn't be the Epistream if it ended there...

Lots of amazingly mundane, everyday things have been discovered by accident. And, although our understanding of crisps and silly putty may now feel absolute, this was not always the case.

I give you, the top ten discoveries that, until now, probably didn't seem too exciting!


  1. Fireworks - Having originated in China about 2,000 years ago, legend has it that they were accidentally invented by a cook who mixed together charcoal, sulfur, and saltpeter. I assume these items were commonplace in the kitchens of the era. The mixture burned quickly and, when compressed in a bamboo tube, exploded.
  2. Crisps - Chef George Crum created the salty snack in 1853 at Moon's Lake House near Saratoga Springs, New York. Apparently, fed up with a customer who kept sending his fried potatoes back because they weren't crunchy enough, Crum sliced the potatoes as thin as possible, fried them in hot grease and doused them with salt. The "Saratoga Chips" became an overnight success at the lodge and throughout New England. Eventually, the chips were mass-produced for home consumption, but since they were stored in barrels, they went stale very quickly. That was until the 1920s, when Laura Scudder invented the airtight bag by ironing together two pieces of waxed paper. This kept the crisps fresh for longer.
  3. Play-Doh - Accidentally invented in 1955 by Joseph and Noah McVicker while trying to make a wallpaper cleaner. It was marketed a year later by toy manufacturer Rainbow Crafts. More than 700 million pounds of Play-Doh have sold since then but to this day, the recipe remains top secret.
  4. Slinky - In 1943, naval engineer Richard James was in development of a spring that would support and stabilize sensitive equipment on ships. When one of the springs accidentally fell off a shelf, it continued moving, and James got the idea for a toy. His wife Betty came up with the name and the Slinky made its debut in late 1945. To date, more than 250 million Slinkys have been sold worldwide.
  5. Saccharin - Accidentally discovered in 1879 by researcher Constantine Fahlberg at Johns Hopkins University. Fahlberg's discovery came after he forgot to wash his hands before lunch. He had spilled a chemical on his hands and now noticed that the bread he was eating tasted unusually sweet. In 1880, Fahlberg jointly published the discovery with his professor, Ira Remsen. In 1884, Fahlberg obtained a patent and began mass-producing saccharin without Remsen. The use of saccharin did not become widespread until sugar was rationed during World War I. Its popularity shot up again during the 1960s and 1970s with the manufacture of Sweet'N Low and the introduction of diet soft drinks.
  6. Post-it Note - The "small piece of paper with a strip of low-tack adhesive on the back" was conceived in 1974 by Arthur Fry as a way of holding bookmarks in his hymnal while singing in the church choir. He was aware of an adhesive accidentally developed in 1968 by fellow 3M employee Spencer Silver. No application for the lightly sticky stuff was apparent until Fry's "small piece of paper with a strip of low-tack adhesive on the back". The 3M company was initially skeptical about the product's profitability, but in 1980, the Post-it was introduced around the world.
  7. Silly Putty - It bounces, it stretches, it breaks. The silicone-based plastic clay was marketed as a children's toy by Binney & Smith, Inc. During WW2, while attempting to create a synthetic rubber substitute, James Wright dropped boric acid into silicone oil. The result was a polymerized substance that bounced but had no real use. That was, until 1950, when marketing expert Peter Hodgson saw its potential as a toy and renamed it Silly Putty. It also has many practical uses: picking up dirt, stabilizing wobbly furniture, stress reduction, physical therapy, and medical and scientific simulations. It was even used by the crew of Apollo 8 to secure their tools in zero gravity.
  8. Microwave - In 1945, Percy Spencer was experimenting with a new vacuum tube called a magnetron while doing research for the Raytheon Corporation. Intrigued by the melting chocolate bar in his pocket, he tried another experiment with popcorn. When it started popping, Spencer saw the potential in this revolutionary process. In 1947, Raytheon built the first microwave oven, the Radarange, which weighed 750 pounds, was 5½ feet tall, and cost about $5,000.
  9. Corn Flakes - In 1894, Dr. John Harvey Kellogg was the superintendent of the Battle Creek Sanitarium in Michigan. He and his brother Will Keith Kellogg were Seventh Day Adventists and they were searching for wholesome foods to feed patients that also complied with the Adventists' strict vegetarian diet. When Will accidentally left some boiled wheat sitting out, it went stale by the time he returned. Rather than throw it away, the brothers sent it through rollers, hoping to make long sheets of dough, but they got flakes instead. They toasted the flakes, which were a big hit with patients, and patented them under the name Granose. The brothers experimented with other grains, including corn and in 1906, Will created the Kellogg's company to sell the corn flakes. On principle, John refused to join the company because Will lowered the health benefits of the cereal by adding sugar.
  10. Mauve - An 18-year-old chemist, William Perkin, wanted to cure malaria but instead his scientific endeavors changed the face of fashion forever. And, helped fight cancer. In 1856 Perkin was trying to come up with an artificial quinine. Instead of a malaria treatment, his experiments produced a thick murky mess. The more he looked at it, the more Perkin saw a beautiful color in his mess. This was the first-ever synthetic dye. His dye was far better than any dyes that came from nature; the color was brighter, more vibrant, and didn't fade or wash out. His discovery also turned chemistry into a money-generating science - making it attractive for a whole generation of curious-minded people. One of the people inspired by Perkin's work was German bacteriologist Paul Ehrlich, who used Perkin's dyes to pioneer immunology and chemotherapy.

Friday 10 April 2009

"Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information on it." Samuel Johnson


We discussed the difference between knowledge, false knowledge and ignorance in the last entry of the Epistream. To expand on where we left off, King Thamus was skeptical of Theuth's new invention, writing, because he felt that it provided "the conceit of wisdom, instead of real wisdom".

Essentially, anyone who takes anything that they read to be gospel (including the Gospel... and the Epistream) is furnished with this "conceit of wisdom", or false knowledge.

  • A priori knowledge is independent of experience.
  • A posteriori knowledge is dependent on experience.
So, we're all going to be more inquisitive. If something interests us, we'll try to find out all we can about it. If possible, we'll do this through experience too. Using our nine senses.

Wait... Nine senses?

If you thought there were only five, get with it people.

Aristotle was the first to set out the original five senses: sight, smell, touch, taste and hearing. We should really take anything he said with a pinch of salt though. This was a guy who thought that some baby animals appeared by magic, without parents, out of mud and water. Obviously, this was because he didn't have a microscope to observe the minuscule eggs. But, come on. He also thought that eels didn't breed, bees were created by rotting bull carcasses, flies had four legs and intelligence came from the heart rather than the head.

So, the sixth sense? No matter what you've heard, it's not for danger, or for a bargain. It's Thermoception. The top of the class will have already worked out that this is a sense of heat.

Sense seven, Equilibrioception. Again, you guessed it, this is our sense of balance.

Sense eight, Nociception. Right, a gold star for knowing this one... It's the perception of pain from the skin, joints and organs. Excluding the brain, which has no pain receptors. Regardless of where your headache feels like it's coming from.

Sense nine, Proprioception. Often referred to as "body awareness", this is the sense of where your body parts are when you can't see or touch them.

It might not even end there. Neurologists often argue the case for your sense of hunger or thirst, sense of depth, or even a sense of meaning.

And what about Synaesthesia? Which is a neurologically based phenomenon where stimulation of one sense leads to automatic, involuntary experiences of a second sense.

In color-graphemic synesthesia, letters and numbers are perceived as inherently coloured.

In ordinal linguistic personification, numbers, days of the week and months of the year seem to have their own personalities. Whereas, in spatial-sequence synesthesia, numbers, days of the week and months of the year elicit precise locations in space.

Visual motion / sound synesthesia involves hearing sounds in response to visual motion.

Pretty weird huh? Well, maybe not. Outwith our species, we have evidence of some really interesting senses.

Electroception allows sharks to sense electric fields.

Birds and insects migrate using their sense of magnetoception. The sense of magnetic fields.

There's also echolocation in bats and dolphins.

Even infra-red heat sensing in pit vipers.

All of these amazing, information gathering tools. And we rely on googleception?
Well, I think King Thamus would have a pretty dim view of it, don't you?

Thursday 9 April 2009

"To be conscious that you are ignorant is a great step to knowledge" Benjamin Disraeli


I'm taking a stand today people.

Knowledge is a perpetual goal. The carrot on the end of a long stick. A driving force.

False knowledge is a dead end. Damaging, limiting and destructive to human evolution.


There's a great bit in the opening of Conrad's Heart of Darkness where Marlow talks about the map of the African continent as it appeared in his youth. For Marlow the child, Africa's outline was essentially the sum total of topographical knowledge and this made it quite easy to swallow. This is conceptually akin to that area of uncharted waters which simply read, "There be monsters". This is where imagination lives.

The progression of our knowledge can be charted by the way our ignorance is continually expanding to encompass more and more as time goes on. It reminds me a bit of watching Lost. You've got questions, which upon being answered simply raise more and more questions. As infuriating as this can be, it's healthy.

In an information vacuum, imagination rampantly consumes all available space. It's only natural that a blank canvas leaves plenty of scope to add unfettered form and colour. Complications arise, as Marlow recognised, when we begin to shade that lacuna with supposed "fact". This is when Darkness takes over those wide open spaces. As George Bernard Shaw expertly put it, “Beware of false knowledge; it is more dangerous than ignorance.”

Ignorance breeds curiosity and fuels imagination. An inquisitive mind seeks to understand why, where false knowledge considers the matter closed. People tend to scapegoat ignorance for others' backward, bigoted or biased behavior, whereas I would certainly attribute these traits to assimilated, false knowledge.

The most universal method for the transfer of knowledge is writing. Plato's work Phaedrus recounts Socrates' story of the Egyptian king Thamus and Theuth, the inventor of the written word. In this story, Theuth presents his new invention to King Thamus, telling him that it'll "improve both the wisdom and memory of the Egyptians". King Thamus is skeptical, rejecting it as a tool to aid recollection, rather than a means of retaining knowledge. His argument is that the written word will infect the Egyptian people with fake knowledge because they'll get their facts second hand, without experience.

Sorry. Remind anyone of anything? How much of this internet learning do you think really sinks in? Well, any questions, google 'em.

Wednesday 8 April 2009

"Three is a magic number." Bob Dorough

Or, third time's the charm. Whichever.

This is the third installment of the Epistream and I've got to tell you, I'm pretty damn excited.


So, why does 3 have such a good rep? Let's start at the beginning.

Historians have come across similar patterns in the evolution of the number three in early counting systems. It is evident that primitive societies had clearly defined words to describe the quantities "one" and "two", but that they followed these with simply "many", or "more".

Often, three is the largest number written with as many lines as the number represents. Roman numerals still write three as "III", and Chinese as three horizontal strokes. This was how the Brahmi numerals of India wrote it, then the Gupta made the three lines curved. The Nagari turned the lines clockwise and ended each line with a slight downward stroke. Eventually, these strokes connected to the line below and presto, it evolved into a character that looks pretty much like a 3.

Check it out...

There are loads of 3's out there. In fact, they're everywhere.

Your ear has 3 semicircular canals and 3 ossicles (the smallest bones in your body).

There are 3 distinct social groups among the Great Apes:
  1. Orangutans (Solitary - little amount of both sexes)
  2. Gorillas (Harems - great amount of one sex)
  3. Common Chimps (Live in territories defended by related males - great amount of both sexes)
There are three types of galaxy:
  1. ellipticals
  2. spirals
  3. irregulars
RNA and DNA each have a triplet codon system.

Three is the atomic number of Lithium, which is also the 33rd most abundant element on Earth.

Atoms consist of three constituents: protons, neutrons and electrons.

A proton consists of three quarks: two up quarks and one down quark. A neutron also consists of three quarks: two down quarks and one up quark.

Isaac Asimov developed the Three Laws of Robotics, which state:
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
There are three basic rock formations: Igneous, Metamorphic and Sedimentary.

Freud discussed his three tier model of the psyche in the 1920 essay "Beyond the Pleasure Principle". This was elaborated upon in "The Ego and The Id" (1923), where he developed it as an alternative to his previous topographic schema (conscious, unconscious, pre-conscious).

3 Indian Gods: Brahma, Vishnu and Shiva (Maheshwara)

3 Greek gods: Zeus, Poseidon and Hades

3 Roman gods: Jupiter, Neptune and Pluto

3 forms of Odin in Eddic Mythology: Har, Jafnhar and Thridi

Buddhism's Triple Bodhi (ways to understand the end of birth) - Budhu, Pasebudhu, Mahaarahath

Three main Abrahamic religions: Judaism, Christianity and Islam

The Holy Trinity of the Christian doctrine: the Father, the Son and the Holy Spirit.
This is also known as Tripartite division or the Godhead.

According to the Gospel of John, Jesus spread Christianity for 3 years.

Jesus supposedly rose from the dead on the third day after his death.

Jesus predicted that Peter would deny him three times.

Islam has devotional rites and certain formulas which are repeated three times and others thirty-three times.

A devout Muslim tries to make a pilgrimage to all three holy cities in Islam: Mecca, Medina and Jerusalem.

3 Musketeers

3 Blind Mice

3 Bears ( and Goldilocks)

3 Little Pigs


BUT...


There is no evidence to support goldfish having a three second memory.

It is generally accepted that there were three Wise Men that went to visit the baby Jesus. However, this is just a guess based on the number of gifts that they brought. There is nothing in the bible to support that there were 3 men, or even that they were men, or even that they were particularly wise. They were referred to as Magi and this tells us nothing about their number, wisdom or gender.

Going to Work is 3 times as dangerous as going to War. Statistically speaking.
About 2 million people die from work related accidents every year, as opposed to roughly 650,000 who die at War.
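That "3 times" is just the ratio of the two figures above. A back-of-the-envelope check, using the numbers straight from the text:

```python
# Annual death figures, as quoted above.
work_deaths = 2_000_000   # work-related accidents per year
war_deaths = 650_000      # deaths at war per year

ratio = work_deaths / war_deaths
print(round(ratio, 1))  # 3.1 - so "about 3 times as dangerous" checks out
```

Statistically speaking, of course.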

So, take care... and avoid that disgruntled guy in the mail room.



Tuesday 7 April 2009

"All men by nature desire knowledge." Aristotle


Here's today's nugget of pyrite for you folks.

And it comes to you with just the slightest hint of irony.

With thanks to Al "ManBearPig" Gore...


Chilly?

Well, it may or may not come as a surprise to you that we are currently experiencing an Ice Age.

OK. Not the full fat, woolly mammoth, Everybody Loves that charming movie franchise kind of Ice Age. But a totally bona fide Ice Age none the less.

An Ice Age is any period in the Earth's history where we have caps of ice at the Poles. The term for the period we're living in right now is 'Interglacial'. Yes, this does make it sound like we're between ice ages. However, it actually refers to the time of warming where our ice is retreating to the Poles. The duration of an interglacial period is not fixed. Experts (of which, I must stress, I'm not one) estimate their duration to be somewhere between 12,000 and 50,000 years. That's not taking man's impact into account, although how it's possible not to I don't know, considering the variables are things like atmospheric conditions. Other factors are things like the layout of the continents and our planet's orbit. Not that we have any influence over that. But give us time.

We've apparently seen an average increase in surface temperature of 0.2°C per decade in the past 30 years. In fact, we're supposedly within about 1°C of the maximum temperature the planet's achieved in the past million years. And allegedly, global warming of more than 1°C, from the year 2000, will constitute a potentially dangerous climate change. That is, based on models of the likely effects it'll have on sea level and its impact on species.

Well. A degree doesn't seem a lot to me. But...

In about the year 1500, the average temperature in Northern Europe dropped by a degree and we ended up with polar bears in Orkney. Well, a polar bear. Still, one's enough.

The "Little Ice Age" lasted about 300 years. During that time the Arctic ice sheet stretched far enough south that, not only did Orkney get a visit from a disgruntled and obviously disorientated polar bear, Eskimos even kayaked to Scotland on at least six occasions. As a small side note, Eskimo isn't a derogatory term. Originally coined by Algonquin Indians to describe those people that lived in high Arctic regions, it can mean "someone from another country" or "someone who speaks another language". It's not very P.C. in Canada, where they correct term is Inuit, but Alaskan Eskimos actually prefer it. Chiefly because they are most definitely NOT Inuit.

But anyway, back in the Little Ice Age...

A possible cause for it has been put forward by Utrecht University: the Black Death.

The cataclysmic drop in the population of Europe resulted in massive swathes of fertile farmland being abandoned and eventually engulfed in millions of trees. Trees love carbon dioxide, so this would have led to a huge leap in the absorption of CO2 from our atmosphere. This, in turn, would have led to a drop in the average temperature, the inverse of the greenhouse effect.

This was later compounded by the eruption of Mount Tambora, Indonesia, in 1815. The ash released into the atmosphere gave us a little taste of Nuclear Winter too. 1816 was referred to as "the Year Without a Summer". However, it did provide the inspiration for Lord Byron's Darkness and set the scene for Mary Shelley's Frankenstein. So, not all bad.

Monday 6 April 2009

The Epistream

Knowledge is Power.

That being the case, please view this blog as a 9 volt brain-battery.

Go on, touch your tongue to the terminals.

Feels good?

Yeah it does.

Want more?

Sure you do.

Consider it fact fuel. A little cleverness-coal with which to stock your synapses. Let's keep those little embers going 'til home time, shall we?

Well, first things first. Epistream. Not a real word. It's the result of the rough splicing of episteme and stream.

Stream, in this case, is taken to mean the continuous flow of data or information, which typically has a constant, or at least predictable, rate. In terms of the digital transmission of material, the parts which arrive first may be viewed immediately, while you're awaiting the rest. Much the same is true here. You can enjoy today's info, safe in the knowledge (or anxious in the assumption, depending whether you're a glass-half-full or half-empty person) that there is more flowing along behind.

Epi- was taken from Episteme. Coming from the Greek verb ἐπίσταμαι, "to know," episteme can be understood to mean a single unit of knowledge. Imagining our brain to be a ball-pool, each ball in the pool would represent one of these units of knowledge. If someone were to pull the plug on this ball-pool the resulting flow could be called an Epistream. Then, if they were to pass out each one, in no particular order and without proper citation, it'd look a lot like this blog.

Epistemology
is a branch of philosophy concerned with the nature and scope of knowledge. The concept was introduced into English by the Scottish philosopher James Frederick Ferrier (1808–1864). Michel Foucault (1926–1984) used the term épistémè in his work The Order of Things (1966) to mean the historical a priori. That is, knowledge that's independent of personal experience, or based on theory.

It is not to be confused with an Epistrema. This is a genus of the moth family Noctuidae, or Owlet moth, which includes more than 35,000 known species.

Other moth facts:

In the first 56 days of its life, the larva of the Polyphemus moth eats 86,000 times its birthweight.

The Hawk moth (Sphinx) is the world's fastest flying insect, attaining speeds of over 50 kph.

Moths are NOT attracted to lightbulbs. They're disorientated by them. Apart from the odd forest fire, earthly light sources have existed for a very short time in comparison to the moth, so moths are accustomed to navigating by the moon and the sun alone. Because the moon and sun are very far away, the incoming light rays that strike the insect arrive virtually parallel to each other. Moths have therefore evolved to expect to receive light at a fixed part of the eye, and as long as they fly in a pretty straight line, this visual pattern stays the same. When the light source is a nearby bulb, the rays of light come from all angles. The moth tries to follow a straight line but ends up spiralling towards it.
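If you fancy seeing that geometry for yourself rather than taking my word for it (on-brand for this blog, I'd say), here's a minimal sketch of the fixed-angle rule. The function name, the 80° offset and the step sizes are all my own invention for illustration, not anything from actual moth science:

```python
import math

def fly(light, start, offset_deg, steps=2000, step_len=0.01):
    """Step a 'moth' that always keeps the light at a fixed angle off its heading."""
    offset = math.radians(offset_deg)
    x, y = start
    path = [(x, y)]
    for _ in range(steps):
        to_light = math.atan2(light[1] - y, light[0] - x)  # bearing from moth to light
        heading = to_light - offset                        # steer to hold that fixed offset
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        path.append((x, y))
    return path

# A nearby bulb at the origin: the fixed-angle rule drags the moth inward in a spiral.
path = fly(light=(0.0, 0.0), start=(1.0, 0.0), offset_deg=80)
print(math.hypot(*path[-1]))  # ends up far closer to the bulb than where it started
```

Move the light somewhere absurdly far away, say (1000000, 1000000), and the bearing to it barely changes from step to step, so the very same rule produces a near-perfect straight line. That's the moon case, and exactly why the strategy worked fine for a few hundred million years before lightbulbs turned up.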