Thursday 30 May 2013

Virus

Viruses have existed for as long as there has been life on Earth.
Early references to viruses

Early references to viral infections include Homer’s mention of “rabid dogs”; rabies, a viral disease of dogs, was also known in Mesopotamia.

Polio, another disease caused by a virus, leads to paralysis of the lower limbs. It appears to be depicted in drawings from ancient Egypt.

In addition, smallpox, caused by a virus that has now been eradicated from the world, played a significant role in the history of South and Central America.
Virology – the study of viruses

The study of viruses is called virology. Virology began with the experiments of Jenner in 1798. Jenner did not know the cause, but he found that individuals exposed to cowpox did not suffer from smallpox.

He began the first known form of vaccination, using cowpox infection to prevent smallpox in individuals. He had not yet identified the causative organism or the cause of the immunity for either cowpox or smallpox.
Koch and Henle

Koch and Henle formulated their postulates on the microbial causation of disease. These state that:
the organism must regularly be found in the lesions of the disease
it must be isolated from the diseased host and grown in pure culture
inoculation of such a pure culture into a susceptible host should reproduce the disease, and the organism should be recoverable from the secondarily infected host as well

Viruses do not conform to all of these postulates.
Louis Pasteur

In 1881-1885 Louis Pasteur first used animals as models for growing and studying viruses. He found that the rabies virus could be cultured in rabbit brains and developed the rabies vaccine. However, Pasteur did not try to identify the infectious agent.
The discovery of viruses

1886-1903 was the period in which viruses were actually discovered. Ivanowski observed that the infectious agent passed through bacteria-retaining filters and looked for a bacteria-like substance; in 1898, Beijerinck demonstrated the filterable character of the virus and found it to be an obligate parasite, meaning it is unable to live on its own.
Charles Chamberland and filterable agents

In 1884, the French microbiologist Charles Chamberland invented a filter with pores smaller than bacteria. Chamberland filter-candles, made of unglazed porcelain or of diatomaceous earth (kieselguhr), had been invented for water purification. These filters retained bacteria and had a pore size of 0.1-0.5 micron. Viruses passed through them and were therefore called “filterable” agents. Loeffler and Frosch (1898) reported that the infectious agent of foot-and-mouth disease was a filterable agent.

In 1900, yellow fever became the first human disease shown to be caused by a filterable agent, through the work of Walter Reed. He found the yellow fever virus in the blood of patients during the fever phase and showed that the virus spread via mosquitoes. In 1853 there had been an epidemic in New Orleans, with a mortality rate from the infection as high as 28%. Infection was eventually controlled by destroying mosquito populations.
Trapping viruses

In the 1930s Elford developed collodion membranes that could trap viruses and showed that viruses were far smaller than bacteria, measuring on the order of tens of nanometers. In 1908, Ellerman and Bang demonstrated that certain types of tumors (leukemia of chickens) were caused by viruses. In 1911 Peyton Rous discovered that a non-cellular agent, a virus, could transmit solid tumors; it was termed the Rous sarcoma virus (RSV).
Bacteriophages

The most important discovery was that which opened the bacteriophage era. In 1915 Twort, working with vaccinia virus, found that viruses grew in cultures of bacteria. He called them bacteriophages. Twort abandoned this work after World War I. In 1917, D'Herelle, a Canadian, independently described similar bacteriophages.
Images of viruses

In 1931 the German engineers Ernst Ruska and Max Knoll built the first electron microscope, which enabled the first images of viruses. In 1935, the American biochemist and virologist Wendell Stanley examined the tobacco mosaic virus and found it to be mostly made of protein. A short time later, the virus was separated into its protein and RNA parts. Tobacco mosaic virus was the first virus to be crystallised, and its structure could therefore be elucidated in detail.
Molecular biology

Between 1938 and 1970 virology developed by leaps and bounds into molecular biology. The 1940s and 1950s were the era of the bacteriophage and the animal virus.

Delbrück is considered the father of modern molecular biology; he brought the concepts of virology into that science. In 1952 Hershey and Chase showed that it was the nucleic acid portion of the phage that was responsible for infectivity and carried the genetic material.

In 1953 Watson and Crick determined the exact structure of DNA. Lwoff in 1949 found that a virus could behave like a bacterial gene on the chromosome, work that contributed to the operon model of gene induction and repression. In 1957 Lwoff defined viruses as potentially pathogenic entities with an infectious phase, having only one type of nucleic acid, multiplying by means of their genetic material, and unable to undergo binary fission.

In 1931, American pathologist Ernest William Goodpasture grew influenza and several other viruses in fertilised chickens' eggs. In 1949, John F. Enders, Thomas Weller, and Frederick Robbins grew polio virus in cultured human embryo cells, the first virus to be grown without using solid animal tissue or eggs. This enabled Jonas Salk to make an effective polio vaccine.

The era of polio research came next and was very important: in 1953 the Salk vaccine was introduced, and by 1955 poliovirus had been crystallized. Later, Sabin introduced the attenuated (oral) polio vaccine.

In the 1980s, cloning of viral genes was developed, sequencing of viral genomes became possible, and the production of hybridomas became a reality. The AIDS virus, HIV, was identified in the 1980s as well. Further uses of viruses in gene therapy developed over the next two decades.

Sunday 19 May 2013

Solar Energy


The Basics
Solar energy technologies convert the sun’s light into usable electricity or heat. Solar energy systems can be divided into two major categories: photovoltaic and thermal. Photovoltaic cells produce electricity directly, while solar thermal systems produce heat for buildings, industrial processes or domestic hot water. Thermal systems can also generate electricity by operating heat engines or by producing steam to spin electric turbines. Solar energy systems have no fuel costs, so most of their cost comes from the original investment in the equipment. The total installed costs of solar applications vary depending on the type of financing used. Solar photovoltaics generally range from $6-$10 per watt installed, or $12,000-$30,000 for a typical 2-3 kilowatt residential-scale system. A solar hot water system sized for a typical home is much cheaper and costs between $3,500 and $8,000 depending on the size and type of the system (above prices exclude any incentives or rebates). 
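To see how those installed-cost figures fit together, here is a minimal Python sketch of the arithmetic; the price-per-watt and system-size numbers are simply the pre-incentive ranges quoted above, and the function name is just for illustration:

```python
def pv_installed_cost(system_kw, dollars_per_watt):
    """Rough installed cost of a residential PV system, before incentives or rebates."""
    return system_kw * 1000 * dollars_per_watt  # kW -> W, then $ per installed watt

# Typical residential range quoted above: 2-3 kW at $6-$10 per installed watt
low = pv_installed_cost(2, 6)    # $12,000
high = pv_installed_cost(3, 10)  # $30,000
print(f"Typical residential PV system: ${low:,.0f} to ${high:,.0f}")
```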
 
Resource Potential
The Northwest receives more than enough sunlight to meet our entire energy needs for the foreseeable future. The region’s highest potential is in southeastern Oregon and southern Idaho; however, there are no “bad” solar sites—even the rainiest parts of the Northwest receive almost half as much solar energy as the deserts of California and Arizona, and they receive more than Germany, which has made itself a solar energy leader.
 
Photovoltaic Cells
Photovoltaics (PVs) convert sunlight directly into electricity, using semiconductors made from silicon or other materials. Photovoltaic modules mounted on homes in the Northwest can produce electricity at a levelized cost of 20-60 cents per kilowatt-hour (kWh) before incentives. Incentives can bring the levelized cost down considerably to 10-20 cents per kWh.
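Levelized cost here means, roughly, the lifetime cost of the system divided by the lifetime electricity it produces. The sketch below uses that simplification with made-up illustrative numbers; a real calculation would also discount future costs and output and account for panel degradation:

```python
def simple_levelized_cost(installed_cost, annual_kwh, lifetime_years, annual_om=0.0):
    """Very simplified levelized cost of energy ($/kWh): lifetime cost / lifetime output.
    Ignores discounting, degradation, and financing -- illustration only."""
    total_cost = installed_cost + annual_om * lifetime_years
    total_kwh = annual_kwh * lifetime_years
    return total_cost / total_kwh

# Hypothetical example: a system costing $24,000 that yields 3,300 kWh/year for 25 years
print(f"{simple_levelized_cost(24_000, 3_300, 25):.2f} $/kWh")  # ~0.29 $/kWh, within the quoted range
```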
 
PVs generate power on a much smaller scale than traditional utility power plants, so they can often provide high-value electricity exactly where and when it is needed. PVs are often the best choice for supplying power for remote, “off-grid” sites or in situations where the transmission or distribution system would otherwise need to be upgraded in order to meet peak demands. Distribution line extensions of more than half a mile are generally more expensive than investing in a PV system for a typical home.
 
Other cost-effective PV applications include building-integrated power generation, meeting high summer demand for electricity (e.g., air conditioning), pumping water, lighting signs and powering equipment used for communications, safety or signaling.
 
Prices for photovoltaics are falling as markets expand. Solar PV demand has grown consistently by 20-25% per year over the past 20 years while solar cell prices fell from $27 per watt of capacity in 1982 to less than $4 per watt today.
 
Direct Thermal
Direct-use thermal systems are usually located on individual buildings, where they use solar energy directly as a source of heat. The most common systems use sunlight to heat water for houses or swimming pools, or use collector systems or passive solar architecture to heat living and working spaces. These systems can replace electric heating for as little as three cents per kilowatt-hour, and utility and state incentives reduce the costs even further in some cases.
 
Environmental Impacts
Solar power is an extremely clean way to generate electricity. There are no air emissions associated with the operation of solar modules or direct application technologies. Residential-scale passive construction, photovoltaic, solar water heating, and other direct applications reduce power generation from traditional sources and the associated environmental impacts.
 
Net Metering
Utilities in all four Northwestern states offer net metering programs, which make it easy for customers to install solar electric systems at their homes or businesses. In a net metering program, customers feed extra power generated by their solar equipment during the day into the utility’s electrical grid for distribution to other customers. Then, at night or other times when the customer needs more power than their system generates, the building draws power back from the utility grid.
 
Net metering allows customers to install solar equipment without the need for expensive storage systems, and without wasting extra power generated when sunlight is at its peak. Such programs also provide a simple, standardized way for customers to use solar systems while retaining access to utility-supplied power.
 
In most net metering programs, the utility installs a special ‘dual-reading’ meter at the customer’s building, which keeps track of both the energy consumed by the building and the energy generated by the solar array. The customer is billed only for the net amount of electricity that they draw from the utility, effectively receiving the utility’s full retail price for the electricity they generate themselves.
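A minimal sketch of that billing arithmetic, assuming the simplest case where generation never exceeds consumption over the billing period and every generated kilowatt-hour offsets one purchased at the full retail rate (real programs differ in how they credit any surplus):

```python
def net_metering_bill(consumed_kwh, generated_kwh, retail_rate):
    """Bill only the net energy drawn from the grid, at the retail rate.
    Any surplus is ignored here; actual utilities credit it in various ways."""
    net_kwh = max(consumed_kwh - generated_kwh, 0)
    return net_kwh * retail_rate

# Hypothetical month: 900 kWh consumed, 350 kWh generated by the array, $0.10/kWh retail rate
print(f"${net_metering_bill(900, 350, 0.10):.2f}")  # $55.00
```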
 
Annual U.S. solar installations by technology (source: Interstate Renewable Energy Council).
 
Net metering is available from utilities throughout Oregon and Washington, and law requires most Montana utilities to offer it as well. Additionally, Idaho Power and Rocky Mountain Power offer net metering in Idaho in accord with a Public Utilities Commission rule.
 
Incentive Programs in the Northwest
Every state in the Northwest offers incentives for solar energy development. Oregon, Idaho and Montana all offer low-interest loans and substantial tax credits for solar systems bought by businesses, individuals or governments. Washington now offers a production incentive of $0.15/kilowatt-hour or more for electricity from solar energy, depending on where the technology was manufactured. Montana and Oregon also exempt solar systems from property tax assessment, while Idaho and Washington exempt solar system purchases from sales taxes. Many local utilities and regional organizations also provide incentives. For example, the Energy Trust of Oregon offers additional rebates and loans to customers of Oregon’s two largest utilities and many utilities offer additional rebates, loans, or production incentives for solar energy systems.

Saturday 18 May 2013

Flying Cars


FLYING CARS:



Just a decade and a half after the Wright Brothers took off in their airplane over the plains of Kitty Hawk, N.C., in 1903, other pioneering men began chasing the dream of a flying car. There was even one attempt in the 18th century to develop a gliding horse cart, which, to no great surprise, failed. There are nearly 80 patents on file at the United States Patent and Trademark Office for various kinds of flying cars. Some of these have actually flown. Most have not. And all have come up short of reaching the goal of the mass-produced flying car. Here's a look back at a few of the flying cars that distinguished themselves from the pack:
Curtiss Autoplane - In 1917, Glenn Curtiss, who could be called the father of the flying car, unveiled the first attempt at such a vehicle. His aluminum Autoplane sported three wings that spanned 40 feet (12.2 meters). The car's motor drove a four-bladed propeller at the rear of the car. The Autoplane never truly flew, but it did manage a few short hops.
Arrowbile - Developed by Waldo Waterman in 1937, the Arrowbile was a hybrid Studebaker-aircraft. Like the Autoplane, it too had a propeller attached to the rear of the vehicle. The three-wheeled car was powered by a typical 100-horsepower Studebaker engine. The wings detached for storage. A lack of funding killed the project.
Airphibian - Robert Fulton, who was a distant relative of the steam engine inventor, developed the Airphibian in 1946. Instead of adapting a car for flying, Fulton adapted a plane for the road. The wings and tail section of the plane could be removed to accommodate road travel, and the propeller could be stored inside the plane's fuselage. It took only five minutes to convert the plane into a car. The Airphibian was the first flying car to be certified by the Civil Aeronautics Administration, the predecessor of the Federal Aviation Administration (FAA). It had a 150-horsepower, six-cylinder engine and could fly at 120 miles per hour and drive at 50 mph. Despite his success, Fulton couldn't find a reliable financial backer for the Airphibian.
ConvAirCar - In the 1940s, Consolidated-Vultee developed a two-door sedan equipped with a detachable airplane unit. The ConvAirCar debuted in 1947, and offered one hour of flight and a gas mileage of 45 miles (72 kilometers) per gallon. Plans to market the car ended when it crashed on its third flight.
Avrocar - The first flying car designed for military use was the Avrocar, developed in a joint effort between Canadian and British military. The flying-saucer-like vehicle was supposed to be a lightweight air carrier that would move troops to the battlefield.
Aerocar - Inspired by the Airphibian and Robert Fulton, whom he had met years before, Moulton "Molt" Taylor created perhaps the most well-known and most successful flying car to date. The Aerocar was designed to drive, fly and then drive again without interruption. Taylor covered his car with a fiberglass shell. A 10-foot-long (3-meter) drive shaft connected the engine to a pusher propeller. It cruised at 120 mph (193 kph) in the air and was the second and last roadable aircraft to receive FAA approval. In 1970, Ford Motor Co. even considered marketing the vehicle, but the decade's oil crisis dashed those plans.
These pioneers never managed to develop a viable flying car, and some even died testing their inventions. However, they proved that a car could be built to fly, and inspired a new group of roadable aircraft enthusiasts. With advances in lightweight material, computer modeling and computer-controlled aircraft, the dream is very close to becoming reality. In the next section, we will look at the flying cars being developed today that eventually could be in our garages.

Aeronautical Engineering


Aeronautical engineering:












The roots of aeronautical engineering can be traced to the early days of mechanical engineering, to inventors’ concepts, and to the initial studies of aerodynamics, a branch of theoretical physics. The earliest sketches of flight vehicles were drawn by Leonardo da Vinci, who suggested two ideas for sustentation. The first was an ornithopter, a flying machine using flapping wings to imitate the flight of birds. The second idea was an aerial screw, the predecessor of the helicopter. Manned flight was first achieved in 1783, in a hot-air balloon designed by the French brothers Joseph-Michel and Jacques-Étienne Montgolfier. Aerodynamics became a factor in balloon flight when a propulsion system was considered for forward movement. Benjamin Franklin was one of the first to propose such an idea, which led to the development of the dirigible. The power-driven balloon was invented by Henri Giffard, a Frenchman, in 1852. The invention of lighter-than-air vehicles occurred independently of the development of aircraft. The breakthrough in aircraft development came in 1799 when Sir George Cayley, an English baronet, drew an airplane incorporating a fixed wing for lift, an empennage (consisting of horizontal and vertical tail surfaces for stability and control), and a separate propulsion system. Because engine development was virtually nonexistent, Cayley turned to gliders, building the first successful one in 1849. Gliding flights established a database for aerodynamics and aircraft design. Otto Lilienthal, a German scientist, recorded more than 2,000 glides in a five-year period, beginning in 1891. Lilienthal’s work was followed by the American aeronaut Octave Chanute, a friend of the American brothers Orville and Wilbur Wright, the fathers of modern manned flight.

Following the first sustained flight of a heavier-than-air vehicle in 1903, the Wright brothers refined their design, eventually selling airplanes to the U.S. Army. The first major impetus to aircraft development occurred during World War I, when aircraft were designed and constructed for specific military missions, including fighter attack, bombing, and reconnaissance. The end of the war marked the decline of military high-technology aircraft and the rise of civil air transportation. Many advances in the civil sector were due to technologies gained in developing military and racing aircraft. A successful military design that found many civil applications was the U.S. Navy Curtiss NC-4 flying boat, powered by four 400-horsepower V-12 Liberty engines. It was the British, however, who paved the way in civil aviation in 1920 with a 12-passenger Handley-Page transport. Aviation boomed after Charles A. Lindbergh’s solo flight across the Atlantic Ocean in 1927. Advances in metallurgy led to improved strength-to-weight ratios and, coupled with a monocoque design, enabled aircraft to fly farther and faster. Hugo Junkers, a German, built the first all-metal monoplane in 1910, but the design was not accepted until 1933, when the Boeing 247-D entered service. The twin-engine design of the latter established the foundation of modern air transport.

The advent of the turbine-powered airplane dramatically changed the air transportation industry. Germany and Britain were concurrently developing the jet engine, but it was a German Heinkel He 178 that made the first jet flight on Aug. 27, 1939. Even though World War II accelerated the growth of the airplane, the jet aircraft was not introduced into service until 1944, when the British Gloster Meteor became operational, shortly followed by the German Me 262. The first practical American jet was the Lockheed F-80, which entered service in 1945.

Commercial aircraft after World War II continued to use the more economical propeller method of propulsion. The efficiency of the jet engine was increased, and in 1949 the British de Havilland Comet inaugurated commercial jet transport flight. The Comet, however, experienced structural failures that curtailed the service, and it was not until 1958 that the highly successful Boeing 707 jet transport began nonstop transatlantic flights. While civil aircraft designs utilize most new technological advancements, the transport and general aviation configurations have changed only slightly since 1960. Because of escalating fuel and hardware prices, the development of civil aircraft has been dominated by the need for economical operation.

Technological improvements in propulsion, materials, avionics, and stability and controls have enabled aircraft to grow in size, carrying more cargo faster and over longer distances. While aircraft are becoming safer and more efficient, they are also now very complex. Today’s commercial aircraft are among the most sophisticated engineering achievements of the day.

Smaller, more fuel-efficient airliners are being developed. The use of turbine engines in light general aviation and commuter aircraft is being explored, along with more efficient propulsion systems, such as the propfan concept. Using satellite communication signals, onboard microcomputers can provide more accurate vehicle navigation and collision-avoidance systems. Digital electronics coupled with servo mechanisms can increase efficiency by providing active stability augmentation of control systems. New composite materials providing greater weight reduction; inexpensive one-man, lightweight, noncertified aircraft, referred to as ultralights; and alternate fuels such as ethanol, methanol, synthetic fuel from shale deposits and coal, and liquid hydrogen are all being explored. Aircraft designed for vertical and short takeoff and landing, which can land on runways one-tenth the normal length, are being developed. Hybrid vehicles such as the Bell XV-15 tilt-rotor already combine the vertical and hover capabilities of the helicopter with the speed and efficiency of the airplane. Although environmental restrictions and high operating costs have limited the success of the supersonic civil transport, the appeal of reduced traveling time justifies the examination of a second generation of supersonic aircraft.

Thursday 16 May 2013

Gasoline

The First Oil Well Was Dug Just Before the Civil War
Edwin Drake dug the first oil well in 1859 and distilled the petroleum to produce kerosene for lighting. Drake had no use for the gasoline or other products, so he discarded them. It wasn't until 1892 with the invention of the automobile that gasoline was recognized as a valuable fuel. By 1920, there were 9 million vehicles on the road powered by gasoline, and service stations were popping up everywhere.
Photograph: a field of dozens of oil wells just offshore at Summerland, California (Santa Barbara County), in 1915.

Higher Octane and Lead Levels

By the 1950s, cars were becoming bigger and faster. Octane levels increased and so did lead levels; lead was added to gasoline to improve engine performance.

Leaded Gasoline Was Taken Off the U.S. Market

Unleaded gasoline was introduced in the 1970s, when the health problems from lead became apparent. In the United States, leaded gasoline was completely phased out in the 1980s, but it is still being used in some parts of the world.

Generators


Today, everybody is familiar with electricity; almost everybody uses it as ready-to-use energy delivered in a clean way. This is the result of long research and engineering work that can be traced back for centuries. The first generators of electricity were not electrodynamic like today's machines; they were based on electrostatic principles. Long before electrodynamic generators were invented, electrostatic machines and devices had their place in science. Due to their principle of operation, electrostatic generators produce high voltages but low currents. The output is always a unipolar static voltage. Depending on the materials used, it may be positive or negative.
Friction is the key to the operation! Although most of the mechanical energy needed to power an electrostatic generator is converted into heat, a fraction of the work (not a fraction of friction - got the point?) is used to generate electric potential by separating charges.

The Beginnings

In ancient Greece, amber was known to attract small objects after being rubbed with cloth or fur. The modern term "electric" is derived directly from the Greek word elektron. In 1600, William Gilbert (1544-1603) coined the expression electrica in his famous book De Magnete.
In ancient Greece, there was no effort to mechanize the rubbing of a piece of amber in order to obtain a continuous effect. Although light could be observed in the dark, nobody made a connection between this and lightning, which was regarded as Zeus' weapon. Knowledge about this type of electricity remained almost unchanged until the beginning of the seventeenth century. Several ancient authors such as Pliny the Elder, and Renaissance men such as Giovanni Battista della Porta, describe the effect but without drawing further conclusions.

The Sulphur Ball

Otto von Guericke (1602-1686), who became famous for his Magdeburg vacuum experiments, invented the first simple electrostatic generator. It was made of a sulphur ball that rotated in a wooden cradle. The ball itself was rubbed by hand. As the principles of electric conduction had not yet been discovered, von Guericke carried the charged sulphur ball to the place where the experiment was to take place.

von Guericke's first electrostatic generator around 1660
Guericke made the ball by pouring molten sulphur into a hollow glass sphere. After the sulphur had cooled, the glass shell was smashed and removed. Later, a researcher found out that the empty glass sphere itself produced the same results.

A Baroque Gas Discharging Lamp

By 1730, scientific research had discovered the principles of electric conduction. An inspiration for electrical research came from an area which at first glance had absolutely nothing to contribute: the mercury barometer invented by Evangelista Torricelli. If the mercury-filled tube was shaken and the evacuated portion of the tube was observed in the dark, a light emission could be seen. Francis Hauksbee, both inventive and inquisitive, designed a rotor to rub a small disk of amber in a vacuum chamber. When the chamber contained some mercury vapour, it lit up! This was the first mercury gas-discharge lamp. The engravings show surprising similarities to today's plasma ("lightning") spheres.

Hauksbee's amber rotor

Hauksbee's setup to demonstrate
light effects caused by static electricity.

 The Beer Glass Generator

Glass proved to be an ideal material for an electrostatic generator. It was cheaper than sulphur and could easily be shaped into disks or cylinders. An ordinary beer glass turned out to be a good insulating rotor in Winkler's electrostatic machine.

An electrostatic machine invented by
Johann Heinrich Winkler (1703-1770)
Machines like these were not only made for scientific research but were also popular toys for amusement. In the 18th century, everybody wanted to experience the electric shock. Experiments like the "electric kiss" were a salon pastime. Although the French Abbé Nollet demonstrated in 1745 that small animals like birds and fish were killed instantaneously by the discharge of a Leyden jar, nobody was aware of the latent dangers of this type of experiment.

The electric kiss provided a very special thrill
Soon after the effects of electrostatic discharge were discovered, researchers and charlatans began trying to cure diseases with electric shocks. Here we find parallels to the "Mesmerists", who claimed to use magnetic powers for therapy.

Toothache therapy around 1750
Being ill at that time was no fun!

 The Leyden Jar

In 1745, the so-called Leyden jar (or Leyden bottle) was invented by Ewald Jürgen von Kleist (1700-1748). Kleist was searching for a way to store electric energy and had the idea of filling it into a bottle! The bottle contained water or mercury and was placed on a metal surface connected to ground. The device worked, but not because electricity can actually be filled into bottles. One year after Kleist, the physicist Cunaeus in Leyden, in the Netherlands, independently invented this bottle again. Thus the term Leyden jar became the more familiar one, although in Germany the device was sometimes also called Kleist's bottle.
Intense research began to find out which liquid was the most suitable. A few years later, researchers had learned that water is not necessary: a metal coating inside and outside the jar is sufficient for storing electrostatic energy. Thus the first capacitors were born.
Early Leyden jars
An advanced electrostatic battery in 1795

Frequently, several jars were connected in order to multiply the charge. Experimenting with this type of capacitor started to become rather dangerous. In 1753, while trying to charge a battery during a thunderstorm, Prof. Richmann was killed when he inadvertently brought his head too close to a conductor. He is the first known victim of high-voltage experiments in the history of physics. Benjamin Franklin had a good deal of luck not to win this honour when performing his kite experiments.
St. Petersburg, 6 August 1753. Prof. Richmann and his assistant were struck by lightning while charging capacitors. The assistant escaped almost unharmed, whereas Richmann died immediately. The pathological analysis revealed that "he only had a small hole in his forehead, a burnt left shoe and a blue spot at his foot. [...] the brain being ok, the front part of the lung sane, but the rear being brown and black of blood." The conclusion was that the electric discharge had passed through Richmann's body. The scientific community was shocked.

 The Disk Rotor

Generators based on disks were invented around 1800 by Winter. Their characteristic construction element is a mercury-treated leather cushion that covers approximately one fourth of the disk's surface area. The leather cushion replaced the experimenter's hand and gave a more continuous result. In 1799, the first experiments on electrolysis using electrostatic energy were made. It turned out that the recently invented galvanic cells produced the same or better effects than many thousands of electric discharges from a battery of Leyden jars. Experiments like these helped to shape the understanding of electric energy.

An early disk generator by Winter

 The Advanced Rotor

Inventors found out that it is a good idea to laminate metal or cardboard sheets onto the insulating disks of electrostatic generators.

The so-called influence machine by Holtz, 1865
Disks for advanced generators of this type were made of glass, shellac, or ebonite (hard rubber). Hard rubber in particular turned out to be a very suitable material, as it did not get damaged as easily as glass or shellac.

 The Wimshurst Machine

Wimshurst machines are the end point of the long development of electrostatic disk machines. They produced very good results and were frequently used to power X-ray tubes. The characteristic construction element of these machines is the disk laminated with radially arranged metal sheets. The advantage of disks is that they can be stacked onto one axle in order to multiply the effect.

A Wimshurst machine around 1905.
The end point of a long development.
The invention of the electromagnetic induction coil by Ruhmkorff in 1857 began to make electrostatic disk machines obsolete. Today, both devices serve mainly as demonstration objects in physics lessons, showing how electric charges accumulate. For technical applications, high voltages can be generated more easily by electronic and electromagnetic methods.

A Ruhmkorff inductor to power an X-ray tube (1910)

 The Van de Graaff Generator

The principle of this machine is to transport charge with the aid of a belt made of insulating, flexible material, e.g. rubber. Early in the development of machinery, it was observed that mechanical transmission belts could produce unintended high voltages, which harmed people or set fire to parts of a workshop. The same effect, caused by transporting highly inflammable celluloid film through the projector, was the reason more than one cinema perished in fire.

A 5-megavolt Van de Graaff generator
The principle is based on an endless insulating belt which transports electric charge to a conductor. Although the device can be operated without an additional electric power source, normally a DC high voltage is applied to the belt, considerably increasing the output voltage. Van de Graaff generators are still in use in particle accelerator labs. The largest machines produce up to 10 million volts.

 The Steam Electrostatic Generator

Wet steam forced through a nozzle becomes electrically charged. This was the origin of the idea of constructing an electrostatic generator based on steam. Although these machines produced good results, they were difficult to maintain. As they were also expensive, comparatively few were built, and some have survived in museum collections.

A steam electrostatic generator

Conclusion

Electrostatic generators have their place in the history of science. They accompanied the path to understanding electricity. However, their efficiency is poor compared to the mechanical effort needed to produce the electrical energy. In this context, I'd like to seriously warn all would-be inventors of electrostatic PMMs (perpetual motion machines) based on disk rotors or on the Van de Graaff principle. Machines of this type are no toys, and even small devices can be dangerous if carelessly handled. As a rule of thumb, a charged Leyden jar of 1/2 liter (about 1/8 gallon) volume can endanger your life!

Tuesday 14 May 2013

Drones



"[Drones are a] game-changing technology, akin to gunpowder, the steam engine, the atomic bomb -- opening up possibilities that were fiction a generation earlier but also opening up perils that were unknown a generation ago." -- Peter Singer, senior fellow at the Brookings Institution
America will never be a "no drone zone."
That must be acknowledged from the outset. There is too much money to be made on drones, for one, and too many special interest groups -- from the defense sector to law enforcement to the so-called "research" groups that are in it for purely "academic" reasons -- who have a vested interest in ensuring that drones are here to stay.
At one time, there was a small glimmer of hope that these aerial threats to privacy would not come home to roost, but that all ended when Barack Obama took office and made drones the cornerstone of his war efforts. By the time President Obama signed the FAA Reauthorization Act into law in 2012, there was no turning back. The FAA opened the door for drones, once confined to the battlefields over Iraq and Afghanistan, to be used domestically for a wide range of functions, both public and private, governmental and corporate. It is expected that at least 30,000 drones will occupy U.S. airspace by 2020, ushering in a $30 billion per year industry.
Those looking to the skies in search of Predator drones will be in for a surprise, however, because when the drones finally descend en masse on America, they will not be the massive aerial assault vehicles favored by the Obama administration in their overseas war efforts. Rather, the drones coming to a neighborhood near you will be small, some nano in size, capable of flying through city streets and buildings almost undetected, while hovering over cityscapes and public events for long periods of time, providing a means of 24/7 surveillance.
One type of drone sensor, the Gorgon Stare, can keep track of an area 2.5 miles across from 12 different angles. Another sensor system, ARGUS, can find an object that is only 6 inches long, from 20,000 feet up in the air. A drone equipped with this kind of technology could spy on an entire city at once. For example, police in California are about to begin using Qube drones, which are capable of hovering for 40 minutes at heights of about 400 ft. to conduct surveillance on targets as far as one kilometer away. Michael Downing, the LAPD deputy chief for counter-terrorism and special operations, envisions drones being flown over large-scale media events such as the Oscars, using them to surveil political protests, and flying them through buildings to track criminal suspects.
These micro-drones will be the face of surveillance and crowd control in the coming drone age.
Modeled after birds, insects, and other small animals, these small airborne surveillance devices can remain hidden in plain view while navigating spaces off limits to conventional aircraft. Able to take off and land anywhere, able to maneuver through city streets and hallways, and able to stop and turn on a dime, these micro-drones will still pack a lethal punch, equipped with an array of weapons and sensors, including tasers, bean-bag guns, "high-resolution video cameras, infrared sensors, license plate readers, [and] listening devices."
You can rest assured, given the pace of technology and the fervor of the drone industry (and its investors), that the sky is the limit when it comes to the many uses (and abuses) for drones in America. The following is just a small sampling of what will be descending from the skies in the near future.
Cyborg drones. 
The Defense Advanced Research Projects Agency (DARPA) has begun to develop a Micro-Electro-Mechanical System (MEMS) for the manipulation of insects into "cyborgs." Through genetic engineering, they are aiming to control the movement of insects and utilize them for surveillance purposes.
Dragonfly drone.
 First reportedly spotted in 2007 hovering over protesters at an anti-war rally in Washington, DC, it turns out that the government's dragonfly drones are just the tip of the iceberg when it comes to small aerial surveillance devices designed to mimic nature. Just a year later, the U.S. Air Force "unveiled insect-sized spies 'as tiny as bumblebees' that could not be detected and would be able to fly into buildings to 'photograph, record, and even attack insurgents and terrorists.'"
Hummingbird drone.
Shaped like a bird, the "Nano Hummingbird" drone is only slightly larger than an actual hummingbird and fits in the palm of one's hand. It flits around effortlessly, blending in with its surroundings. DARPA, the advanced research division of the Department of Defense, gets the credit for this biotic wonder.
Nano Quadrotors.
 Similar to the hummingbird drone, these small, four-propellered nano quadrotor drones, developed by researchers at the University of Pennsylvania, operate based upon the flight dynamics of insects, enabling them to operate as a swarm. Using 20 drones, researchers demonstrated how, moving compactly as a unit, the drones were able to navigate obstacles, form complex patterns, and even execute a fluid figure-eight arrangement.
Black Hornet Nano drone.
 Weighing in at roughly half an ounce and four inches long, comparable to a finch, the Black Hornet Nano helicopter drone was designed to capture and relay video and still images to remote users, and can fly even in windy conditions.
DASH Roachbot drone. 
Developed at UC Berkeley's PolyPEDAL Lab, DASH (Dynamic Autonomous Sprawled Hexapod), a 10-centimeter-long, 16-gram robot, strives to mimic a cockroach's speed and ability to remain covert, along with a gecko's agility. Capable of "rapid inversion" maneuvers that include dashing up to a ledge and then swinging itself around to end up underneath the ledge and upside-down, DASH is also being trained to make rapid transitions between running and climbing.
Samarai drone.
 Lockheed Martin's compact "Samarai" drone, inspired by the design of a maple seed, is capable of high speeds, low battery consumption, vertical movement, and swift ground deployment.
MicroBat drone.
 Additionally, Caltech, AeroVironment, and UCLA have produced a "MicroBat" ornithopter; it was designed in part by zoologists who have attempted to make the MicroBat mimic the movement of birds and other flying animals.
Spy-butterfly drone.
 In 2012, Israel unveiled its new insect-inspired drone which they dubbed the "spy-butterfly" because of its two sizable wings. Weighing in at only 20g, this drone was developed for indoor surveillance, including public places such as "train stations and airport terminals -- or office buildings." The size and muted sound of the "virtually noiseless" machines makes them unnoticeable and therefore ideal for intelligence gathering. The spy-butterfly is so realistic that, when tested, "birds and flies tended to fall behind the device arranging into a flock."
Switchblade drone.
 A more sinister example is the Switchblade, a small military drone intended to act as a kamikaze weapon. Weighing a mere six pounds and measuring two feet in length, it flies effortlessly through urban environments before zeroing in on its target, a person, at which point it explodes, unceremoniously killing him or her.
Mosquito drone.
 More lethal than its real-life counterpart, the mosquito drone, while an engineering marvel, is also a privacy advocate's nightmare with its potential to land on someone and use a needle-like-pincer to extract DNA from its victims or, alternatively, inject drugs or other foreign substances. As software engineer Alan Lovejoy notes:
Such a device could be controlled from a great distance and is equipped with a camera, microphone. It could land on you and then use its needle to take a DNA sample with the pain of a mosquito bite. Or it could inject a micro RFID tracking device under your skin. It could land on you and stay, so that you take it with you into your home. Or it could fly into a building through a window. There are well-funded research projects working on such devices with such capabilities.
Raven drone.
 Weighing in at 4 pounds, the RQ-11 Raven drone is not as small, nor is it as agile as its smaller counterparts, but with more than 19,000 out there already, it is the most common. Useful for seeing around corners and sending footage back to its handlers, the Raven resembles a rudimentary model airplane and crumbles like Legos upon landing.
With 63 active drone sites across the nation and 56 government agencies presently authorized to use drones, including 22 law enforcement agencies and 24 universities, drones are here to stay. Indeed, the cost of drones -- underwritten by a $4 million Homeland Security program which encourages local law enforcement to adopt drone technology as quickly as possible -- makes them an easy sell for most police departments. Moreover, while manned airplanes and helicopters can cost $600/hour to operate, a drone can be put in the sky for less than $25/hour. That doesn't even begin to cover drone use by the private sector, which is already chomping at the bit at the prospect.
No matter what the future holds, however, we must ensure that Americans have a semblance of civil liberties protections against the drones. Given the courts' leniency towards police, predicating drone use on a warrant requirement would provide little to no protection. Thus, the only hope rests with Congress and state legislatures that they would adopt legislation specifically prohibiting the federal government from using data recorded via police spy drones in criminal prosecutions, as well as preventing police agencies from utilizing drones outfitted with anti-personnel devices such as tasers and tear gas.
Either way, we'd better get ready. As Peter W. Singer, author of Wired for War, a book about military robotics, warns: "The debate over drones is like debating the merits of computers in 1979: They are here to stay, and the boom has barely begun. We are at the Wright Brothers Flier stage of this. There's no stopping this technology. Anybody who thinks they can put this genie back in the box -- that's silliness."

Saturday 11 May 2013

History Of Indonesia


In the past two decades, Indonesia has begun to emerge as an economic power in Southeast Asia, as well as a newly democratic nation.
Its long history as the source of spices coveted around the Indian Ocean world shaped Indonesia into the multi-ethnic and religiously diverse nation that we see today. Although this diversity causes friction at times, Indonesia has the potential to become a major world power.

Capital and Major Cities:

Capital: Jakarta, pop. 8,800,000
Major Cities: Surabaya, pop. 3,000,000
Medan, pop. 2,500,000
Bandung, pop. 2,500,000
Serang, pop. 1,786,000
Yogyakarta, pop. 512,000

Indonesia's Government:

The Republic of Indonesia is centralized (non-federal) and features a strong President who is both Head of State and Head of Government. The first direct presidential election took place only in 2004; the president can serve up to two 5-year terms.
The tricameral legislature consists of the People's Consultative Assembly, which inaugurates and impeaches the president and amends the constitution but does not consider legislation; the 560-member House of Representatives, which creates legislation; and the 132-member House of Regional Representatives who provide input on legislation that affects their regions.
The judiciary includes not only a Supreme Court and Constitutional Court, but also a designated Anti-Corruption Court.

People of Indonesia:

Indonesia is home to over 240 million people; it is the fourth most populous nation on Earth (after China, India and the US).
Indonesians belong to more than 300 ethno-linguistic groups, most of which are Austronesian in origin. The largest ethnic group is the Javanese, at almost 42% of the population, followed by the Sundanese with just over 15%. Others with more than 2 million members each include: Chinese (3.7%), Malay (3.4%), Madurese (3.3%), Batak (3.0%), Minangkabau (2.7%), Betawi (2.5%), Buginese (2.5%), Bantenese (2.1%), Banjarese (1.7%), Balinese (1.5%) and Sasak (1.3%).

Languages of Indonesia:

Across Indonesia, people speak the official national language of Indonesian, which was created after independence as a lingua franca from Malay roots. However, there are more than 700 other languages in active use throughout the archipelago, and few Indonesians speak the national language as their mother tongue.
Javanese is the most popular first language, boasting 84 million speakers. It is followed by Sundanese and Madurese, with 34 and 14 million speakers, respectively.
The written forms of Indonesia's multitude of languages may be rendered in modified Sanskrit-derived, Arabic, or Latin writing systems.

Religion in Indonesia:

Indonesia has the world's largest Muslim population, with 86% of its people professing Islam. In addition, almost 9% of the population is Christian, 2% are Hindu, and 3% are Buddhist or animist.
Nearly all of the Hindu Indonesians live on the island of Bali; most of the Buddhists are ethnic Chinese. The Constitution of Indonesia guarantees freedom of worship, but the state ideology specifies a belief in only one God.
Long a commercial hub, Indonesia acquired these faiths from traders and colonizers. Buddhism and Hinduism came from Indian merchants; Islam arrived via Arab and Gujarati traders. Later, the Portuguese introduced Catholicism, and the Dutch Protestantism.

Indonesian Geography:

With more than 17,500 islands, of which more than 150 are active volcanoes, Indonesia is one of the most geographically and geologically interesting countries on Earth. It was the site of two famous nineteenth-century eruptions, those of Tambora and Krakatau, as well as being the epicenter of the 2004 Southeast Asian tsunami.
Indonesia covers about 1,919,000 square kilometers (741,000 square miles). It shares land borders with Malaysia, Papua New Guinea and East Timor.
The highest point in Indonesia is Puncak Jaya, at 5,030 meters (16,502 feet); the lowest point is sea level.

Indonesia's Climate:

Indonesia's climate is tropical and monsoonal, although the high mountain peaks can be quite cool. The year is divided into two seasons, the wet and the dry.
Because Indonesia sits astride the equator, temperatures do not vary much from month to month. For the most part, coastal areas see temperatures in the mid to upper 20s Celsius (the low to mid-80s Fahrenheit) throughout the year.

The Indonesian Economy:

Indonesia is the economic powerhouse of Southeast Asia, a member of the G20 group of economies. Although it is a market economy, the government owns significant amounts of the industrial base following the 1997 Asian financial crisis. During the 2008-2009 global financial crisis, Indonesia was one of the few nations to continue its economic growth.
Indonesia exports petroleum products, appliances, textiles and rubber. It imports chemicals, machinery and food.
The per capita GDP is about $4,000 US. Unemployment is only 7.7%; 47% of Indonesians work in industry, 39% in services, and 14% in agriculture. Nonetheless, almost 18% live below the poverty line.

History of Indonesia:

Human history in Indonesia goes back at least 1.5-1.8 million years, as shown by the fossil "Java Man" - a Homo erectus individual discovered in 1891.
Archaeological evidence suggests that Homo sapiens had walked across Pleistocene land bridges from the mainland by 45,000 years ago. They may have encountered another human species, the "hobbits" of the island of Flores; the exact taxonomic placement of the diminutive Homo floresiensis is still up for debate. Flores Man seems to have become extinct by 10,000 years ago.
The ancestors of most modern Indonesians reached the archipelago around 4,000 years ago, arriving from Taiwan, according to DNA studies. Melanesian peoples already inhabited Indonesia, but they were displaced by the arriving Austronesians across much of the archipelago.

Early Indonesia

Hindu kingdoms sprang up on Java and Sumatra as early as 300 BCE, under the influence of traders from India. By the early centuries CE, Buddhist rulers controlled areas of those same islands, as well. Not much is known about these early kingdoms, due to the difficulty of access for international archaeological teams.
In the 7th century, the powerful Buddhist kingdom of Srivijaya arose on Sumatra. It controlled much of Indonesia until 1290, when it was conquered by the Hindu Majapahit Empire from Java. Majapahit (1290-1527) united most of modern-day Indonesia and Malaysia. Although large in size, Majapahit was more interested in controlling trade routes than in territorial gains.
Meanwhile, Islamic traders introduced their faith to Indonesians in the trade ports around the 11th century. Islam slowly spread throughout Java and Sumatra, although Bali remained majority Hindu. In Malacca, a Muslim sultanate ruled from 1414 until it was conquered by the Portuguese in 1511.

Colonial Indonesia

The Portuguese took control of parts of Indonesia in the sixteenth century, but did not have enough power to hang on to their colonies there when the much wealthier Dutch decided to muscle in on the spice trade beginning in 1602.
Portugal was confined to East Timor.

Nationalism and Independence

Throughout the early 20th century, nationalism grew in the Dutch East Indies. In March of 1942, the Japanese occupied Indonesia, expelling the Dutch. Initially welcomed as liberators, the Japanese were brutal and oppressive, catalyzing nationalist sentiment in Indonesia.
After Japan's defeat in 1945, the Dutch tried to return to their most valuable colony. The people of Indonesia launched a four-year independence war, gaining full freedom in 1949 with U.N. help.
The first two presidents of Indonesia, Sukarno (r. 1945-1967) and Suharto (r. 1967-1998), were autocrats who relied upon the military to stay in power. Since 2000, however, Indonesia's presidents have been selected through reasonably free and fair elections.