
Friday, October 19, 2007

Space.com : Why the Universe is All History



By David Powell
Special to SPACE.com
posted: 16 October 2007
06:46 am ET

It took 300 years of experiment and calculation to pin down the speed at which light travels in a vacuum: an impressive 186,282 miles per second.

Light will travel slightly slower than this through air, and some wild experiments have actually slowed light to a crawl and seemingly made it go backward, but at the scales encountered in our everyday lives, light is so fast that we perceive our surroundings in real time.

Look up into the night sky and this illusion begins to falter.

"Because light takes time to get here from there, the farther away 'there' is the further in the past light left there and so we see all objects at some time in the past," explains Floyd Stecker of NASA's Goddard Space Flight Center in Greenbelt, Maryland.

Light-years

We see the relatively close moon as it was 1.2 seconds ago and the more distant sun as it was about 8 minutes ago. These measurements, 1.2 light-seconds and 8 light-minutes, can be thought of as describing both time and distance.

The distance to more remote objects such as other stars is so great that it is measured in light-years: the distance light travels in a year, or about 6 trillion miles (10 trillion kilometers). Even the nearest star, Proxima Centauri, lies more than four light-years away, so it appears to us on Earth as it was just over four years ago, when its light began its journey.
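
As a rough illustration of these figures, the light-travel times follow directly from the speed of light. The short sketch below uses approximate average distances, which are assumptions for illustration rather than numbers taken from the article.

# Rough light-travel-time check for the distances discussed above.
# The distances are approximate averages, assumed here for illustration.
C_KM_S = 299_792.458                  # speed of light in km/s (186,282 miles/s)

distances_km = {
    "Moon": 384_400,                      # average Earth-Moon distance
    "Sun": 149_600_000,                   # one astronomical unit
    "Proxima Centauri": 4.24 * 9.46e12,   # ~4.24 light-years, in km
}

for name, d_km in distances_km.items():
    seconds = d_km / C_KM_S
    print(f"{name:>16}: {seconds:>14,.1f} s "
          f"(~{seconds / 60:.1f} min, ~{seconds / 3.156e7:.2f} yr)")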

In this way, light's finite speed gives us a valuable view into the past, and as we strain our gaze deeper into the universe we look further back in time.

"In the case of distant galaxies, we see them as they were billions of years ago when the universe was relatively young," Stecker said.

Out of sight

Some galaxies are so remote that their light hasn't had sufficient time to reach us yet, despite about 13.7 billion years of travel. There could also be more distant objects that will forever remain unknown to us.

"Because the universe is expanding and the expansion appears to be accelerating, there may be distant galaxies which if we can't see them now because their light has not had time to reach us, we will never see," Stecker said.

So we can never see the universe as it is, only as it was at various stages of its development.

To interact with remote parts of the universe, to see them as they are now, would require some exotic means of travel, such as traveling faster than light, which, according to Einstein's special theory of relativity, is impossible because it would require an infinite amount of energy.

"The equations of special relativity imply that nothing can go faster then the speed of light in empty space. Therefore, if super-luminal speeds are possible in empty space, they violate the principle of special relativity," Stecker told SPACE.com.

Offbeat theories

There are ways to travel faster than light that do not violate special relativity, but these either outpace light in a transparent medium such as water or do not involve the transmission of information.

To break light speed in space and gain the same easy interaction with the universe that we experience every day on Earth is considered practically impossible, even when offbeat theories are taken into account.

"There are some postulated but unproven theoretical models, inspired by the motivation to unite the quantum theory with the general theory of relativity, which violate special relativity," Stecker said.

These theories involve accelerating particles with mass to super-luminal speeds using ultra high energies. It may also be possible to take a shortcut to distant parts of the universe through a tunnel in space-time known as a wormhole.

"If stable wormholes can exist in space-time and if we can survive traveling through them, then they could provide shortcuts as in the sci-fi movies," Stecker said.

ESA : Science and Galileo - working together

Galileo science colloquium

16 October 2007
Galileo is a promising tool for the scientific community, even though it is mainly intended for a set of practical services such as guiding cars, supporting safe aircraft landings or helping blind people to find their way.

This was clearly demonstrated during the first colloquium on scientific and fundamental aspects of the Galileo programme that took place at the 'Cité de l'Espace' in Toulouse from 1 until 4 October. The colloquium was organised by the Air and Space Academy, the Bureau des Longitudes, the Académie de Marine and ESA.

Indeed, the main objective of this world premiere was reached beyond expectations: enhancing the scientific use of Galileo and contributing to the science-based development of Global Navigation Satellite Systems (GNSS).

Around 200 scientists from 25 countries worldwide, 19 of them European, gathered and showed their interest in using GNSS systems, and in particular Galileo's accuracy and integrity, to improve their research across a wide scope that spans Earth sciences (for example geodesy, meteorology and geophysics), quantum metrology (for example atomic clocks, inter-satellite links and the Galileo timing system) and relativity (for example spacetime symmetries, relativistic reference frames, astronomy and GNSS).

At the same time, the scientists' expertise can be of great help in improving the Galileo system itself. This is a 'win-win' situation, since a more precise tool can give more accurate data and therefore improve the measurements needed by the scientists for their research.

The scientists need to have access to GNSS data, and ESA will facilitate further access to EGNOS and GIOVE-A data, which are already available to some extent. Dedicated solutions will be found for the scientists, with restrictions applied only occasionally to commercial or PRS service data. Access to registered, stored data - the type most wanted by the scientific community - will be easily granted.


Galileo science colloquium poster

With this conference ESA was also expecting recommendations to improve the system itself, and several were expressed so as to ensure the best environment for the scientific exploitation of Galileo. The requirements are now frozen for the first-generation system, but within the GNSS evolution programme - supported by ESA Member States for technology accompaniment and the new Galileo generation - there is time to implement these particular needs; this programme envelope is completely open and new ideas are welcome.

The colloquium also led to reflection on the way the scientific community can organise itself for the use of Galileo. The event complements the already established effort, carried out by ESA, to contact scientific institutes in the fields of timekeeping, frequency standards and geodesy. Following this conference, the Galileo scientific community also includes the domains of quantum metrology and relativistic mechanics.

Although important progress was made, the debate remains open on how scientists wish to express their specific needs to make the best use of Galileo. This will lead, in the medium and long term, to a privileged working relationship between scientific teams and the project teams responsible for building the next generation of European GNSS.

The colloquium was a great and unique opportunity for the Galileo partners to discover the numerous uses of satellite navigation, in the fields of Earth sciences, quantum metrology and relativistic mechanics, and to identify how scientific requirements can contribute to making the most of the present systems and to define their possible future evolution.

The use of quantum entanglement in the overall GNSS constellation, for example for clock synchronization, and the development of fully relativistic reference frames implemented through the GNSS constellation itself are of particular interest.

ESA : Hubble shows ‘baby’ galaxy is not so young after all


Panning on I Zwicky 18

16 October 2007
The NASA/ESA Hubble Space Telescope has found out the true nature of a dwarf galaxy that was reputed to be one of the youngest galaxies in the Universe.

Astronomers using Hubble have made observations of the galaxy I Zwicky 18 which seem to indicate that it is in fact much older and much farther away than previously thought.


I Zwicky 18
Observations of I Zwicky 18 at the Palomar Observatory about 40 years ago seemed to show that it was one of the youngest galaxies in the nearby Universe. The studies suggested that the galaxy had erupted with star formation thousands of millions of years after its galactic neighbours, like our galaxy the Milky Way.

Back then it was an important finding for astronomers, since this young galaxy was also nearby and could be studied in great detail - something not easy with observations made across great distances when the universe was much younger.



I Zwicky 18

But the new Hubble data has quashed that possibility. The telescope found fainter, older, red stars contained within the galaxy, suggesting its star formation processes started at least one thousand million years ago and possibly as much as 10 thousand million years ago. The galaxy, therefore, may have formed at the same time as most other galaxies.

“Although the galaxy is not as youthful as was once believed, it is certainly developmentally challenged and unique in the nearby universe,” said astronomer Alessandra Aloisi from the European Space Agency/Space Telescope Science Institute, who led the new study. Spectroscopic observations with ground-based telescopes have shown that I Zwicky 18 is mostly composed of hydrogen and helium, the main ingredients created in the Big Bang. In other words, the stars within it have not created the same amounts of heavier elements as seen in other galaxies nearby.




Zoom in I Zwicky 18

Thus, the galaxy’s primordial makeup suggests that its rate of star formation was much lower than that of other galaxies of similar ages. The galaxy has been studied with most of NASA’s telescopes, including the Spitzer Space Telescope, the Chandra X-ray Observatory, and the Far Ultraviolet Spectroscopic Explorer (FUSE). However, it remains an outstanding mystery as to why I Zwicky 18 formed few stars in the past, and why it is forming so many new stars right now.


I Zwicky 18
The new Hubble data also suggest that I Zwicky 18 is 59 million light-years from Earth, almost 10 million light-years more distant than previously believed. By extragalactic standards this is still in our own backyard, yet the galaxy’s larger-than-expected distance may explain why astronomers have had difficulty detecting older, fainter stars within the galaxy until now. In fact, the faint old stars in I Zwicky 18 are almost at the limit of Hubble’s sensitivity and resolution.

Aloisi and her team discerned the new distance by observing blinking stellar distance-markers within I Zwicky 18. Some massive stars, called Cepheid variables, pulse with a regular rhythm. The timing of their pulsations is directly related to their brightness. By comparing their actual brightness with their observed brightness, astronomers can precisely measure their distance.
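
The step of "comparing actual brightness with observed brightness" is the standard distance-modulus relation, d = 10^((m - M + 5)/5) parsecs. The sketch below is a minimal illustration only; the magnitude values are made-up assumptions chosen to reproduce a distance of roughly 59 million light-years, not the team's measured numbers.

# Distance from the distance modulus: the difference between apparent
# magnitude m (observed brightness) and absolute magnitude M (actual
# brightness, e.g. from the Cepheid period-luminosity relation) gives
#     d [parsecs] = 10 ** ((m - M + 5) / 5)
LY_PER_PARSEC = 3.2616

def distance_light_years(apparent_mag, absolute_mag):
    parsecs = 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)
    return parsecs * LY_PER_PARSEC

# Illustrative, made-up values: a Cepheid of absolute magnitude -5.0
# would need an apparent magnitude near 26.3 to lie roughly 59 million
# light-years away, the distance quoted for I Zwicky 18.
print(f"{distance_light_years(26.3, -5.0):.2e} light-years")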



I Zwicky 18

The team determined the observed brightness of three Cepheids and compared it to the actual brightness predicted by theoretical models specifically tailored for the low metal content of I Zwicky 18 to determine the distance to the galaxy. The Cepheid distance was also validated with another distance indicator, specifically the observed brightness of the brightest red stars in a characteristic stellar evolutionary phase (the so-called ‘giant’ phase).

Cepheid variables have been studied for decades, especially by Hubble, and have been instrumental in determining the scale of our universe. This is the first time, however, that variable stars with so few heavy elements have been found. This may provide unique new insights into the properties of variable stars, now a topic of ongoing study.



Notes for editors:

The Hubble Space Telescope is a project of international cooperation between NASA and ESA.

Aloisi and her team’s results appear in the 1 October issue of the Astrophysical Journal Letters in ‘I Zw 18 Revisited with HST ACS and Cepheids: New Distance and Age’.

Aloisi’s team consists of Francesca Annibali, Jennifer Mack, and Roeland van der Marel of the Space Telescope Science Institute, Marco Sirianni of ESA and Space Telescope Science Institute, Abhijit Saha of the National Optical Astronomy Observatories, and Gisella Clementini, Rodrigo Contreras, Giuliana Fiorentino, Marcella Marconi, Ilaria Musella, and Monica Tosi of the Italian National Astrophysics Institutes in Bologna and Naples.

ESA : Hummocky and shallow Maunder crater

ESA/ DLR/ FU Berlin (G. Neukum)
Maunder Crater

16 October 2007
The High Resolution Stereo Camera (HRSC) on ESA’s Mars Express orbiter has obtained pictures of the Noachis Terra region on Mars, in particular, the striking Maunder crater.

The images were taken in orbits 2412 and 2467 on 29 November and 14 December 2005 respectively, with a ground resolution of approximately 15 metres per pixel.


Noachis Terra context map
Maunder crater lies at 50° South and 2° East, approximately in the centre of Noachis Terra. The sun illuminates the scene from the north-east (top left in the image).

The impact crater, named after the British astronomer Edward W. Maunder (1851-1928), is located halfway between Argyre Planitia and Hellas Planitia in the southern highlands of Mars.



A perspective view of Maunder Crater

With a diameter of 90 kilometres and a depth of barely 900 metres, the crater is not among the largest impact craters on Mars, but it used to be much deeper. It has since been partially filled with large amounts of material.

The west of the crater experienced a major slope failure, during which a large landslide transported loose material eastward, to the inner parts of the crater. The edges of the crater rim that collapsed exhibit gullies which might be associated with the mass transport of the material.



Maunder Crater, perspective view
The transition zone from the western rim of the crater to the rather smooth crater floor on the eastern edge shows hummocky terrain. Such terrain exhibits small, irregularly-shaped hills and valleys. The hummocky terrain in the Maunder crater was formed by deposition of landslide debris.

In the east, the crater floor is bounded by a trough approximately 700 metres deep. The trough may be associated with a landslide on the western edge of the crater. Some gullies can be seen on the upper edge of the trough, which is possible evidence for water seepage.



Maunder Crater, Noachis Terra

The small, 500 to 2500-metre long, dark features on the crater floor are eye-catching. These features are barchan dunes, one of the most abundant dune forms in arid environments. Dunes of this kind are also found on Earth, for example in the Namib desert of south-western Africa.

The colour scenes have been derived from the three HRSC colour channels and the nadir channel. The perspective views have been calculated from the digital terrain model derived from the HRSC stereo channels. The anaglyph image was calculated from the nadir channel and two stereo channels; stereoscopic glasses are required for viewing. The 3-D (anaglyph) picture has been put together from several individual 3-D images of different scenes, enhancing the view over larger areas.



Maunder Crater, Noachis Terra

For more information on Mars Express HRSC images, please read our updated FAQ (frequently asked questions).

Bad Astronomy : Deep Impact interview with Brian Cox

On July 4, 2005, NASA’s Deep Impact mission smacked an 800-pound block of copper into a comet.

image of Dr. Brian Cox and me

That’s pretty cool all by itself. But the science behind the mission was to find out what happens when a comet is hit, what materials a comet is made of, and what materials lie beneath the surface. I wrote a series of posts about the mission (here, here, and here) back when it happened.

I also flew down to LA to do an interview for a British TV show called Star Date. The interviewer was rock star/physicist Brian Cox, and we had a smashing (har har) time talking about how Hollywood portrays asteroid and comet impacts. We sat on the roof of a hotel and chatted about the movie "Deep Impact". After much woe (converting VHS tapes! Converting VOB files! Uploading! Downloading! Fire! Destruction! And I still couldn’t get the frackin’ aspect ratio right), I was finally able to put the interview up on YouTube.

This was just a bit of fun, but it was excellent meeting Brian. I hope to do more with him in the future; I can see we have very similar vision on how science should be presented to the public.

Bad Astronomy : Cassini: 10 years and counting

Has it really been 10 years since the launch of the Cassini Saturn probe?

Wow.

To celebrate the anniversary, NASA has released a whole bunch of cool images and animations. They’re all incredibly beautiful, but how can you resist this one in particular?

Cassini image of Saturn from above

[Click the images for much larger versions!]

There’s something about seeing Saturn from a height. Wow again.

I’m also fond of this one:

rainbow on Saturn's rings

See the rainbow? In this image, the Sun is directly behind the camera. The sunlight hits the ice particles in the rings and gets refracted back toward you, making a bright spot in the rings. But why the rainbow? At first I thought it was a glory, but actually it’s an illusion! Cassini doesn’t take color images like your digital camera does. It takes a series of images with different filters which are then combined on the ground to produce color. As Cassini swept past this point over the rings, it took three images (red, green, and blue) which were then added together. Since Cassini was moving, the spot smeared out, and since the color images were taken sequentially, we see an elongated rainbow. We can also see yellow and other colors in the rainbow because the spot was big, bigger than the amount it got smeared out by Cassini’s motion. The right part of the spot in one image overlaps the location of the left part of the spot in the next image, so the primary RGB colors add together to get the secondary colors.

Kewlll.
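
A toy way to see the smeared-rainbow effect described above: shift a single bright spot between a "red", "green" and "blue" exposure and stack them into one colour composite. This is a hedged, illustrative sketch only (the spot size, shift and pixel scale are arbitrary assumptions), not Cassini's actual imaging pipeline.

import numpy as np

# A bright spot drifts across the frame between the red, green and blue
# exposures.  Stacking the three single-colour frames therefore puts the
# primaries side by side, with secondary colours where neighbouring
# exposures overlap -- a smeared-out "rainbow".
width, spot_sigma, shift = 200, 12.0, 15
x = np.arange(width)

def exposure(centre):
    """One single-filter exposure: a Gaussian bright spot."""
    return np.exp(-0.5 * ((x - centre) / spot_sigma) ** 2)

red = exposure(80)               # first exposure
green = exposure(80 + shift)     # the spot has moved a little
blue = exposure(80 + 2 * shift)  # ...and a little more

composite = np.stack([red, green, blue], axis=-1)   # (width, 3) RGB strip
dominant = composite.argmax(axis=-1)                # 0 = R, 1 = G, 2 = B
print("dominant colour across the smear:",
      "".join("RGB"[i] for i in dominant[60:160:5]))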

This next one is incredible. It’s an animation of tiny Prometheus and its effect on Saturn’s thin F ring:

animation of Saturn's moon Prometheus hitting the F ring

[To see this better, click the animation for a larger version.] Prometheus orbits Saturn every 14.7 hours in an ellipse. The top of the ellipse brings it just out to the orbit of the F ring particles. When it gets close, it pulls out a streamer of material. The camera stays centered on the moon, but the overall orbital motion of Prometheus and the ring is to the right. Prometheus, closer to Saturn, moves a little bit faster than the ring particles. As it pulls out the streamer of particles, they fall toward the moon and wind up orbiting Saturn a little faster than they did before. However, they still aren’t moving as quickly as Prometheus, and fall back to the left as the moon leaves them behind. The view is odd since the moon stays centered; if the point of view of the camera were stationary and everything swept past from left to right, it would look different. You’d see the actual elliptical motion of the moon as a big arc from left to right, and the ring particles would be seen moving that direction as well, just not as quickly as the moon does.
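
The claim in the paragraph above that Prometheus, being closer to Saturn, "moves a little bit faster" than the F ring follows from Kepler's third law, T = 2π√(a³/GM). The sketch below uses rounded values for Saturn's gravitational parameter and the two orbital radii (assumptions on my part, not figures from the post); it reproduces the quoted 14.7-hour period.

import math

# Kepler's third law: T = 2 * pi * sqrt(a**3 / GM).  Prometheus, slightly
# closer to Saturn than the F ring, completes each orbit slightly faster,
# so it slowly laps the ring particles and drags out streamers.
GM_SATURN = 3.793e16    # Saturn's gravitational parameter, m^3/s^2 (rounded)
orbits_m = {
    "Prometheus": 1.394e8,          # ~139,400 km from Saturn's centre
    "F-ring particles": 1.402e8,    # ~140,200 km
}

for name, a in orbits_m.items():
    period_h = 2 * math.pi * math.sqrt(a ** 3 / GM_SATURN) / 3600.0
    print(f"{name:>16}: orbital period ~ {period_h:.2f} hours")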

It all depends on your POV.

And that, BABloggees, is the whole point. We don’t go to these exotic locations in the solar system because we know everything that’s going on, or because we know what we’ll expect to see. We go because we don’t know. But we also go because we need to have our positions rattled, our notions shaken, our ideas tested. When we see Saturn from above, or co-orbit with a moon, or see a rainbow reflected in particles of ice a billion kilometers away, the only thing we can be sure of is that we’ll see new things, unexpected things.

That’s how we learn. That’s how we grow. And that’s what science does for us.

Tuesday, October 16, 2007

Space.com : Mystery of Io's Atmosphere Solved


By Dave Mosher
Staff Writer
posted: 15 October 2007
06:53 am ET

Jupiter's volcanic moon Io is veiled by a thin atmosphere, but how much its volcanoes and chunks of frozen gas contribute to its atmosphere has puzzled scientists for decades.

The New Horizons spacecraft recently documented the moon's glowing aurora, however, giving researchers a chance to solve the atmospheric mystery.

Io is the most volcanically active object in the solar system. The moon's pockmarked and colorful appearance is not unlike a pepperoni pizza.

"Io is volcanically active, and that volcanism ultimately is the source material for Io's sulfur-dioxide atmosphere," said Kurt Retherford, a space scientist at the Southwest Research Institute in San Antonio. "But the relative contributions of volcanic plumes and sublimation of frosts deposited near the plumes have remained a question for almost 30 years."

Io's volcanoes spew out sulfur dioxide, which is a gas that stinks of freshly lit matches and almost entirely makes up the moon's atmosphere. As Io rotates from daylight into darkness, chilling the yellowish rock down to -226 F (-143 C), the gas freezes into a solid, much like dry ice (frozen carbon dioxide gas).

"The atmosphere at that point collapses down so that all that is left supplying the atmosphere are the volcanoes," Retherford said.

Because Io's volcanic gas stays warm enough not to freeze and creates glowing auroras, scientists were able to find out how much the volcanoes supply Io's atmosphere by measuring the moon's nightside aurora.

About 1 to 3 percent of Io's dayside atmosphere, it turns out, is created by the volcanoes. The rest is generated by frozen sulfur dioxide, which has accumulated on Io's surface over eons, turning directly back into gas.

New Horizons used its Alice ultraviolet spectrograph to capture images of Io's auroras on the spacecraft's way to Pluto, which mission scientists expect to reach in 2015. Retherford and his colleagues' findings based on the Alice data are detailed in a recent issue of the journal Science.

ESA : Double Star TC-1 completes its mission

Orbits of Cluster and Double Star


16 October 2007
TC-1, one of the two satellites of the CNSA/ESA Double Star mission, was decommissioned on 14 October as its designed orbit lifetime came to an end. The satellite re-entered Earth’s atmosphere and turned to dust during its descent.

TC-1 and its twin, TC-2, are the first satellites built and operated by the China National Space Administration (CNSA) in cooperation with ESA. Together with the four Cluster satellites, TC-1 helped accomplish much during its lifetime.

The four years during which Double Star was operational brought new perspectives on the boundaries of the magnetosphere and on the fundamental processes that play a role in the transport of mass, momentum and energy into the magnetosphere. Thanks to TC-1’s measurements, scientists could observe the evolution of structures and physical processes at small scales with Cluster, and then at large scales with Double Star.

Here we list some of the most interesting results where TC-1 played a crucial role.



Artist's impression of the four Cluster spacecraft
Space is fizzy

Above our heads, at the bow shock, where the Earth’s magnetic field meets the constant stream of gas from the Sun, thousands of bubbles of superheated gas, or ion density holes, are constantly growing and popping. These bubbles were discovered by Cluster and Double Star together, and the discovery allowed scientists to better understand the interaction between the solar wind and the Earth’s magnetic field.


Celestial chorus further away

Chorus emissions are waves naturally generated in space close to the magnetic equator. They play an important role in creating killer electrons that can damage the solar panels and electronic equipment of satellites and are a hazard for astronauts. It was found that these waves are created further away from Earth during high geomagnetic activity. This information is crucial for forecasting their impact.

Listen to the celestial chorus here



Double Star, an artist's impression

Oscillations of Earth’s natural cloak of magnetism

The four Cluster satellites and TC-1 unexpectedly found themselves engulfed by waves of electrical and magnetic energy as they travelled through Earth’s night-time shadow. Something had set the tail of Earth’s natural cloak of magnetism oscillating, like waves created by a boat travelling across a lake. The data collected gave scientists an important clue to the effects of space weather on Earth’s magnetic field.


"Double Star has demonstrated mutual benefit and fostered scientific cooperation in space research between China and Europe. But there is still much more to come as the full, high-resolution data archive becomes available," says Philippe Escoubet, ESA’s Cluster and Double Star Project Scientist.


Notes for editors:

Double Star is CNSA’s first scientific mission and also the first carried out in collaboration with ESA. The mission consisted of two spacecraft investigating global physical processes in Earth’s magnetic environment and their responses to solar disturbances. The mission lasted four years, well past its nominal lifetime of one year.

ESA contributed to the development and pre-integration of eight European instruments for the mission. Ground station support was provided through the Vilspa-2 ground station near Madrid, Spain, which downloaded data for four hours each day. ESA also coordinated the scientific operations for the European instruments, via ESTEC and the European payload operations service at the Rutherford Appleton Laboratory, UK.

CNSA’s contribution included the two spacecraft buses, eight other scientific experiments, and the launches. CNSA was in charge of spacecraft control and operation of the Chinese instruments. After its nominal lifetime of one year, operations of Double Star were extended twice by both agencies until the end of September 2007.

A large number of papers have appeared in the literature since the launch of Double Star. A special issue of Annales Geophysicae on the first Double Star results was published in November 2005.

A special issue of the Journal of Geophysical Research on the latest results of Double Star and Cluster is under review and will be published in the coming months. As of September 2007, the combined publication list for Cluster and Double Star comprises 691 scientific papers.

Monday, October 15, 2007

NASA Announces Opportunities for Students

NASA 50th Anniversary Essay Contest for Students

The NASA 50th Anniversary Essay Competition for middle and junior high school students is now accepting entries. The competition consists of two separate topics, each with a limit of 500 words. The first topic challenges students to describe how they benefit in their everyday lives from space technologies built by NASA over the last 50 years. The second topic requires students to imagine how their everyday lives will have changed because of NASA space technology in the next 50 years.

Students may submit two separate essays, each responding to a separate topic. Participants must be U.S. students in grades 5-9 and under the age of 15.

An optional notice of intent is due on Dec. 7, 2007. Final entries are due on or before Jan. 7, 2008. For more information, visit:

http://www.nasa.gov/audience/foreducators/5-8/features/F_Essay_Competition.html


NASA Undergraduate Student Research Program Spring 2008 Internship Session

NASA's Undergraduate Student Research Program is currently accepting applications for 15-week spring 2008 internships. These internships offer students the opportunity to work alongside NASA scientists and engineers at NASA's centers, laboratories and test facilities.

Applicants must be U.S. college sophomores, juniors or seniors with majors or course work concentration in engineering, mathematics, computer science, or physical or life sciences. Applicants must be U.S. citizens.

The application deadline for the spring 2008 session is Oct. 22, 2007.

For more information, visit: http://education.nasa.gov/usrp


ESA : Satellites help ensure efficient use of pesticides



15 October 2007
A new service, developed in the framework of an ESA-supported project, is using satellite images to compare agricultural crop sites across Europe in order to ensure the more efficient use of pesticides.

Pesticides currently used within the European Union (EU) must be registered with the national members of the European and Mediterranean Plant Protection Organization (EPPO), which requires efficiency data derived from field trials. EPPO has defined zones of comparable climates across Europe that allow data generated in one country to support registration in another country within the same climatic zone.

The new service, Site Similarity Certification (SSC), merges satellite images with conventional data like temperature, precipitation, soil characteristics and recurring natural phenomena to improve the scientific approach in defining comparable zones and the transferability of field trial results achieved in one EU member state to another.




Weekly comparison of sites in Germany and Poland
"In view of the needs for testing and regulating Plant Protection Products within EPPO member countries, the continuation of the already successfully started efforts to integrate the use of satellite images into the process of pesticide registration seems to be a promising tool," Dr Udo Heimbach a member of the EPPO Working Party said. "Satellite images are intended to be used to prove the similarity of trial sites and herewith to improve the procedure of mutual recognition of trial results throughout Europe, which is one of the aims of EPPO."

Proving the comparability of cropping sites saves the pesticide industry from carrying out expensive perennial trials, allows field trials to be planned more efficiently and creates the possibility of substituting Site Similarity Certifications for missing field trials.

Spatial Business Integration GmbH developed this new service as part of an ESA Earth Observation Market Development (EOMD) project. EOMD is a programme aimed at strengthening the European and Canadian capacity to provide geoinformation services based mainly on Earth Observation data, with a particular emphasis on addressing the needs of small value-adding companies.

ESA : Recently concluded activities to validate innovative technologies for space






15 October 2007
ESA’s Innovation Triangle Initiative (ITI) is an opportunity for European industry and academia to validate their new ideas for space.

Every year, ESA organises an ITI Final Presentations Day (FPD) event where the entities involved in recently concluded ITI activities are invited to present the results of their work to an audience that includes industry colleagues, interested ESA experts and delegations.

This year the ITI FPD will take place on 16 and 17 October at ESTEC. The draft agenda foresees the presentation of 17 ITI activities covering many different domains, from electrical to mechanical, including new materials and software. Here are just two examples:

Current mote size

SSTL (UK) will present their results on an activity entitled - "Wireless sensor motes for on-board networking and inter-satellite communications", whose scope was to "Establish the feasibility of adapting free standing compact wireless sensor packages or "motes" for communication between spacecraft subsystems (harness reduction) and ultimately between small low cost spacecraft flying in formation or an ad-hoc swarm (ultra-low power inter-satellite networking)".

Schematic cut-through of the experimental assembly

The “Ionic Liquid Ion Source Array for Electrical Propulsion” developed by EPFL is an interesting candidate for low-thrust propulsion in interferometric missions and for attitude or orbit control on small satellites. Based on an array of electrospray thruster systems, ideal for miniaturization using micromachining technologies, these emitters are capable of generating thrust in the range from micro- to millinewtons.

About ITI

The objective of the Innovation Triangle Initiative (ITI) is to foster the fast introduction of disruptive innovations into any technical domain of the European space industry by combining the creativity, know-how and experience of the research community and industry with the end customers.

ITI is based on the ‘Innovation Triangle’ concept, which states that the rapid and successful introduction of disruptive innovations in industry requires the collaboration of three different entities: a customer, a developer and an inventor.

Current status

For 2007, the ITI Announcement of Opportunity was published on 5 April in EMITS (reference AO/1-5403/07/NL/CB). Currently the ITI Evaluation Board is conducting the third and last evaluation round within 2007.

ITI was launched in 2004. Since then, 81 contracts have been placed with a total value of €7.5 million. ITI promotes spin-in, meaning that innovations originating from non-space industrial or research sectors are particularly welcome.

Sunday, October 14, 2007

Universe Today : Kaguya Releases Its Second Baby Satellite


Written by Fraser Cain

The release of VRAD. Image credit: JAXA
As we mentioned in past articles, the Japanese Kaguya spacecraft, now orbiting the Moon, is actually a collection of satellites. The largest satellite is Kaguya. It's the one equipped with all the cameras and the suite of scientific instruments.

But Kaguya was also carrying two baby satellites. The first Relay satellite, nicknamed Okina, was released on October 9th. Today Kaguya released its second sub-satellite: the tiny Very Long Baseline Interferometer (or VRAD). VRAD's job will be to help Kaguya carefully map out the Moon's gravity field.

Original Source: JAXA News Release

Universe Today : Expedition 16 Docks with the Station


Written by Fraser Cain

Expedition 15 and 16. Image credit: NASA
The International Space Station now has 6 crew members on board, after the Soyuz capsule carrying Expedition 16 docked earlier today. Commander Peggy Whitson, Flight Engineer Yuri Malenchenko and spaceflight participant Sheikh Muszaphar Shukor floated into the station when the hatches were opened at 12:22 p.m. EDT on Friday.

The station is going to be a busy place for the next week, with all 6 crew members aboard. And then three members will depart on October 21st. Shukor will return with Expedition 15 members Commander Fyodor Yurchikhin and Flight Engineer Oleg Kotov.

Whitson is the first female commander of the station. And if the shuttle mission STS-120 launches on schedule, it will bring shuttle commander Pam Melroy to the station. This will be the first time that two female mission commanders are in orbit at the same time.

Original Source: NASA News Release

Universe Today : Has Dark Energy Always Been Constant?


Written by Fraser Cain

Hubble Deep Field. Image credit: Hubble
Dark energy is that mysterious force that seems to be accelerating the expansion of the Universe. But the question is: has it always been pushing the Universe apart with the same force, or was it weaker or stronger in the past, and will it get stronger in the future? Researchers from the Harvard-Smithsonian Center for Astrophysics have a plan to study distant clumps of hydrogen, to get to the bottom of this question, once and for all.

Dark energy was first discovered nearly a decade ago, when astronomers noticed that distant supernovae were farther away than their calculations predicted. Some mysterious force appears to be accelerating the expansion of the Universe from every point in space. As space expands, more dark energy seems to appear. And although the amount of dark energy at any one point in space is tiny, across the vast reaches of space it really adds up, accounting for more than 70% of the Universe.

If dark energy is increasing, however, you could imagine it eventually becoming so strong that it starts to tear galaxy clusters apart, and then galaxies themselves, and even star systems. It might even become so strong that it tears apart atoms and the fabric of space itself. Astronomers call this theory the "Big Rip". Or maybe just the opposite is true, and dark energy will eventually become negligible to the expansion of the Universe.

In order to see if the strength of dark energy is changing over time, astronomers are planning to carefully plot the position of clouds of neutral hydrogen, shortly after they formed from the Big Bang. Although it's not possible now, future planned observatories should be able to trace this material all the way back to a time when the Universe was only 200 million years old.

In the early Universe, small fluctuations in energy density and pressure caused oscillations. Although tiny in the beginning, these ripples have been magnified by the expansion of the Universe so that they stretch 500 million light-years across today. The clouds of neutral hydrogen should follow the same ripple pattern, so astronomers will know they're looking at those first, primordial clouds, and not some closer ones.

And so, astronomers will be able to look back in time, and study the distance to the clouds at each epoch in our Universe's expansion. They should be able to trace how much dark energy was affecting space at each time, and get a sense if this energy has always remained constant, or if it's changing.

Their answers will shape our understanding of the Universe's evolution, and its future.

Original Source: CfA News Release

Universe Today : Titan has Drizzling Methane Rain


Written by Fraser Cain

Titan
If you're planning a visit to Saturn's moon Titan, make sure you bring an umbrella. You'll need it. Not to protect you from water raining down; on frigid Titan, where temperatures dip to about -180 degrees Celsius, all the water is completely frozen. No, according to scientists, there's a steady drizzle of liquid methane coming down in the mornings.

New infrared images gathered by Hawaii's W.M. Keck Observatory and Chile's Very Large Telescope show that Titan's Xanadu region experiences a steady drizzle of methane during its lengthy morning. The concept of morning is a little misleading, since Titan takes about 16 Earth days to complete one rotation. So, the "morning" drizzle actually lasts around 3 Earth days, dissipating around 10:30 a.m. local time.

Astronomers aren't actually sure if this is a moon-wide phenomenon, or just localized around the Xanadu region of Titan. Even though large lakes and seas have been discovered around the moon's poles, no process had been discovered that fills them with liquid… until now.

Reporting their findings in the latest issue of the online journal Science Express, researchers from UC Berkeley note that, "widespread and persistent drizzle may be the dominant mechanism for returning methane to the surface from the atmosphere and closing the methane cycle."

The new Keck/VLT images show a widespread cloud cover of frozen methane at a height of 25 to 35 kilometres. And then there are liquid methane clouds below 20 kilometres, and finally rain falling at the lowest elevations.

The droplets of liquid methane in the rain clouds are 1,000 times larger than the droplets in water clouds here on Earth, and this surprisingly makes them harder to detect. Since the droplets are larger but still carry the same amount of moisture, they're much more spread out, making the clouds extremely diffuse and nearly invisible.

How much liquid is trapped in the clouds? If you squeezed them all out and spread the liquid across the surface of Titan, it would coat the entire moon to a depth of about 1.5 cm. That's roughly the same amount you would get if you did the same thing with Earth's clouds.
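
As a back-of-envelope check of that 1.5 cm figure, the implied liquid volume is just Titan's surface area times the layer depth. Titan's radius below is an assumed round value, not a number from the article.

import math

# Liquid volume implied by a 1.5 cm layer spread over Titan's entire surface.
TITAN_RADIUS_M = 2.575e6     # ~2,575 km (assumed round value)
LAYER_DEPTH_M = 0.015        # 1.5 cm

surface_area_m2 = 4.0 * math.pi * TITAN_RADIUS_M ** 2
volume_km3 = surface_area_m2 * LAYER_DEPTH_M / 1e9   # m^3 -> km^3
print(f"~{volume_km3:,.0f} cubic kilometres of liquid methane in the clouds")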

Original Source: UC Berkeley News Release

Space.com : Report Urges U.S. to Pursue Space-Based Solar Power


By Brian Berger
Space News Staff Writer
posted: 12 October 2007

WASHINGTON – A Pentagon-chartered report urges the United States to take the lead in developing space platforms capable of capturing sunlight and beaming electrical power to Earth.

Space-based solar power, according to the report, has the potential to help the United States stave off climate change and avoid future conflicts over oil by harnessing the Sun's power to provide an essentially inexhaustible supply of clean energy.

The report, "Space-Based Solar Power as an Opportunity for Strategic Security," was undertaken by the Pentagon's National Security Space Office this spring as a collaborative effort that relied heavily on Internet discussions by more than 170 scientific, legal, and business experts around the world. The Space Frontier Foundation, an activist organization normally critical of government-led space programs, hosted the website used to collect input for the report.

Speaking at a press conference held here Oct. 10 to unveil the report, U.S. Marine Corps Lt. Col. Paul Damphousse of the National Security Space Office said the six-month study, while "done on the cheap," produced some very positive findings about the feasibility of space-based solar power and its potential to strengthen U.S. national security.

"One of the major findings was that space-based solar power does present strategic opportunity for us in the 21st century," Damphousse said. "It can advance our U.S. and partner security capability and freedom of action and merits significant additional study and demonstration on the part of the United States so we can help either the United State s develop this, or allow the commercial sector to step up."

Demonstrations needed

Specifically, the report calls for the U.S. government to underwrite the development of space-based solar power by funding progressively bigger and more expensive technology demonstrations that would culminate in building a platform in geosynchronous orbit bigger than the international space station and capable of beaming 5-10 megawatts of power to a receiving station on the ground.

Nearer term, the U.S. government should fund in-depth studies and some initial proof-of-concept demonstrations to show that space-based solar power is a technically and economically viable solution to the world's growing energy needs.

Aside from its potential to defuse future energy wars and mitigate global warming, Damphousse said, beaming power down from space could also enable the U.S. military to operate forward bases in far-flung, hostile regions such as Iraq without relying on vulnerable convoys to truck in fossil fuels to run the electrical generators needed to keep the lights on.

As the report puts it, "beamed energy from space in quantities greater than 5 megawatts has the potential to be a disruptive game changer on the battlefield. [Space-based solar power] and its enabling wireless power transmission technology could facilitate extremely flexible 'energy on demand' for combat units and installations across an entire theater, while significantly reducing dependence on over-land fuel deliveries."

Although the U.S. military would reap tremendous benefits from space-based solar power, Damphousse said the Pentagon is unlikely to fund development and demonstration of the technology. That role, he said, would be more appropriate for NASA or the Department of Energy, both of which have studied space-based solar power in the past.

The Pentagon would, however, be a willing early adopter of the new technology, Damphousse said, and provide a potentially robust market for firms trying to build a business around space-based solar power.

"While challenges do remain and the business case does not necessarily close at this time from a financial sense, space-based solar power is closer than ever," he said. "We are the day after next from being able to actually do this."

Damphousse, however, cautioned that the private sector will not invest in space-based solar power until the United States buys down some of the risk through a technology development and demonstration effort at least on par with what the government spends on nuclear fusion research and perhaps as much as it is spending to construct and operate the international space station.

"Demonstrations are key here," he said. "If we can demonstrate this, the business case will close rapidly."

Charles Miller, one of the Space Frontier Foundation's directors, agreed that public funding is vital to getting space-based solar power off the ground. Miller told reporters here that the space-based solar power industry could take off within 10 years if the White House and Congress embrace the report's recommendations by funding a robust demonstration program and providing the same kind of incentives the government offers the nuclear power industry.

Military applications

The Pentagon's interest is another important factor. Military officials involved in the report calculate that the United States is paying $1 per kilowatt hour or more to supply power to its forward operating bases in Iraq.

"The biggest issue with previous studies is they were trying to get five or ten cents per kilowatt hour, so when you have a near term customer who's potentially willing to pay much more for power, it's much easier to close the business case," Miller said.

NASA first studied space-based solar power in the 1970s, concluding then that the concept was technically feasible but not economically viable. Cost estimates produced at the time estimated the United States would have to spend $300 billion to $1 trillion to deliver the first kilowatt hour of space-based power to the ground, said John Mankins, a former NASA technologist who led the agency's space-based solar power research and now consults and runs the Space Power Association.

Advances in computing, robotics, solar cell efficiency and other technologies helped drive that estimate down by the time NASA took a fresh look at space-based solar power in the mid-1990s, Mankins said, but still not enough to justify the upfront expense of such an undertaking at a time when oil was going for $15 a barrel.

With oil currently trading as high as $80 a barrel and the U.S. military paying dearly to keep kerosene-powered generators humming in an oil-rich region like Iraq, the economics have changed significantly since NASA pulled the plug on space-based solar power research around 2002.

On the technical front, solar cell efficiency has improved faster than expected. Ten years ago, when solar cells were topping out around 15 percent efficiency, experts predicted that 25 percent efficiency would not be achieved until close to 2020, Mankins said, yet Sylmar, Calif.-based Spectrolab – a Boeing subsidiary – last year unveiled an advanced solar cell with a 40.7 percent conversion efficiency.

One critical area that has not made many advances since the 1990s or even the 1970s is the cost of launch. Mankins said commercially-viable space-based solar power platforms will only become feasible with the kind of dramatically cheaper launch costs promised by fully reusable launch vehicles flying dozens of times a year.

"If somebody tries to sell you stock in a space solar power company today saying we are going to start building immediately, you should probably call your broker and not take that at face value," Mankins said. "There's a lot of challenges that need to be overcome."

Mankins said the space station could be used to host some early technology validation demonstrations, from testing appropriate materials to tapping into the station's solar-powered electrical grid to transmit a low level of energy back to Earth. Worthwhile component tests could be accomplished for "a few million" dollars, Mankins estimated, while a space station-based power-beaming experiment would cost "tens of millions" of dollars.

Placing a free-flying space-based solar power demonstrator in low-Earth orbit, he said, would cost $500 million to $1 billion. A geosynchronous system capable of transmitting a sustained 5-10 megawatts of power down to the ground would cost around $10 billion, he said, and provide enough electricity for a military base. Commercial platforms, likewise, would be very expensive to build.

"These things are not going to be small or cheap," Mankins said. "It's not like buying a jetliner. It's going to be like buying the Hoover Dam."

While the upfront costs are steep, Mankins and others said space-based solar power's potential to meet the world's future energy needs is huge.

According to the report, "a single kilometer-wide band of geosynchronous earth orbit experiences enough solar flux in one year to nearly equal the amount of energy contained within all known recoverable conventional oil reserves on Earth today."
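
That closing claim can be sanity-checked with a rough back-of-envelope calculation. The solar constant, geosynchronous radius, reserve size and energy-per-barrel figures below are commonly quoted approximations, not numbers from the report; the point is only that the two totals land within the same order of magnitude.

import math

# Energy crossing a 1 km-wide ring at geosynchronous altitude in one year,
# compared with the energy content of known recoverable conventional oil.
SOLAR_CONSTANT_W_M2 = 1361.0       # solar flux near Earth
GEO_RADIUS_M = 4.2164e7            # geosynchronous orbital radius
SECONDS_PER_YEAR = 3.156e7

band_area_m2 = 2 * math.pi * GEO_RADIUS_M * 1_000.0   # 1 km-wide ring
solar_energy_j = SOLAR_CONSTANT_W_M2 * band_area_m2 * SECONDS_PER_YEAR

OIL_BARRELS = 1.2e12               # ~1.2 trillion barrels (2007-era estimate)
JOULES_PER_BARREL = 6.1e9          # ~6.1 GJ of energy per barrel of oil
oil_energy_j = OIL_BARRELS * JOULES_PER_BARREL

print(f"solar energy through the band in one year: {solar_energy_j:.2e} J")
print(f"energy in recoverable conventional oil:    {oil_energy_j:.2e} J")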