
American Institute of Aeronautics and Astronautics Conference;

14-17 July 2003, Dayton, Ohio, USA

THE FUTURE OF REMOTE SENSING - A PERSPECTIVE FROM THE SENSORS AND ELECTRONICS TECHNOLOGY PANEL OF THE NATO RESEARCH AND TECHNOLOGY ORGANIZATION

Dr. R. G. Buser, RGB and Associates, Inc., New Jersey, USA
Dr. D. Faubert, Defence R&D Canada – Valcartier, Canada
Dr. Y. King, Spacecraft Technology Division, Air Force Research Laboratory, USA
Mr. R. Hintz, NAVAIR Weapons Division, China Lake, USA
Dr. M. Holinko, US Army CECOM, New Jersey, USA
Anthony K. Hyder, Professor of Physics, University of Notre Dame, USA
Dr. R. Klemm, FGAN-FHR/EL, Germany
Prof. M. Tacke, FGAN-FOM, Germany
Dr. M. Vant, Defence R&D Canada – Ottawa, Canada

Abstract

This paper will give the perspective of the Sensors and Electronics Technology Panel of the NATO Research and Technology Organisation on the future of remote sensing technology for airborne and spaceborne applications. We will first outline the fundamental character of future military air and space operations (joint and combined) and deduce the implications for sensor technology requirements and their operation (distributed and network-centric).

We will then review the trends in the following areas: radar sensing technology, radar signal processing, passive electro-optical sensing (including hyperspectral), laser sensing technology, image processing (including ATR) and information-centric sensor management.

AEROSPACE POWER

Aerospace power can be defined as the capability to conduct military operations in or passing through the total expanse of air and space above the Earth's surface [1]. Aerospace platforms have the inherent advantages of speed, reach, elevation, surprise and precision. They are also flexible, mobile, and responsive. Greater mobility combined with greater speed allows faster response times. In peacetime, aerospace power is used to deter aggression, to support, to sustain life, and to observe and verify. In wartime, it is used to respond to escalation, deny, destroy, dislocate, divert, delay, observe and demoralize. Aerospace power can:

• Conduct operations that have significant impacts on all three levels of conflict – strategic, operational and tactical – at the same time.

• Strike quickly and directly at an adversary’s sources of strength, over long distances and with few restrictions based on geographic boundaries.

• Apply military power directly or indirectly to help shape and respond to international events.

• Provide powerful, visible and effective response capabilities at relatively low employment costs.

• Carry out major operations with a minimum footprint and reduced exposure to hostile action, limiting the risk of casualties.

• Rapidly deploy to and withdraw from operational theatres.

• Remain forward-based and capable of reaching into hostile territory without having to sustain ground forces there.

Aerospace power is particularly sensitive to technology. Therefore, to a large extent, modern combat effectiveness is determined by the effectiveness of high-technology weapons and weapon systems.

Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR) is a capability that is already in high demand and will continue to be. The gathering and processing of information, and its employment in decision-making and action, will be the most important technological advance. Improved situational awareness will result in a common view of the battle space. The major implication is that future warfare will be compressed in time and expanded in space.

This paper will focus on Aerospace Surveillance and Reconnaissance (ASR), whose purpose is to collect information on the resources and activities of an enemy or potential enemy using airborne, spaceborne and ground-based assets. Surveillance can be regarded as an area operation, whereas reconnaissance is directed at specific targets. These operations rest on visual, photographic, radar, electromagnetic and electro-optical sensors. The platforms can be manned or unmanned.

Among others, the following areas are of particular interest in the field of ASR:

• Wide area aerospace surveillance systems capable of small target detection.

• Airborne target detection, sorting, identification and tracking.

• Surface target detection, sorting, identification and tracking.

• Sub-surface target detection, sorting, identification and tracking.

• Threat detection, classification, warning and protection for aircraft.

• Airborne sensor systems with day, night and adverse weather capability.

• Real-time display of sensor data on three-dimensional virtual-reality situational awareness displays.

• Situational awareness data transmission and display.

CHARACTER OF MODERN WARFARE AND ITS IMPLICATIONS FOR SENSING REQUIREMENTS

In recent years, we have gone from the world of the big, centralized, well-defined threat, which could be assumed to behave in a rational manner, to what we could characterize as a diffuse and asymmetric threat.

As Planet Earth becomes more akin to a large village, conflicts are becoming global. Rather than fighting all-out conflicts, we are likely to tailor our response to achieve limited objectives. Because conflicts are global, we will operate in coalitions, which implies a need for interoperability.

Modern operations are characterized by high tempo, both day and night and in almost all weather. The operations are often conducted remotely and will increasingly be automated. Precision engagement and pervasive oversight are fast becoming the norm. This new threat environment poses significant problems for C4ISR that require a response from the R&D community. There is a need for more knowledge and information, and with less time for decision-making, the information must be available within a short time frame. This implies capable and flexible sensing, more automation and large-bandwidth information systems. Networking is absolutely essential to fulfill these requirements.

Hence, although this paper focuses on sensors, it is clear that the C4 systems collating, analyzing and distributing the information are just as critical. Data must be transformed into information and then into knowledge. Visualization will be needed to help make sense of all the information. Finally, automated decision aids must be developed to assist the commander in performing his or her mission.

FUTURE REQUIREMENTS FOR SPACEBORNE SENSING

As far as space sensing goes, the big breakthroughs have been in computing, processing power and communications. One can now realistically consider on-board processing for large data volumes. One can also envision passing information from one satellite to another, delivering near-real-time information to users. Multi-phenomenology solutions such as polarimetry, hyperspectral imaging and panchromatic imaging will be used together or in sequence to obtain the most complete and usable set of information for a specific task.

A big issue is to create cheaper, lighter structures, both for the bus and for the large antennas required for radar. T/R module development in lighter (tiled) configurations, where the module itself forms the structure, holds promise. RF MEMS will likely be quite important in achieving the size and weight reductions.

Distributed systems. Large satellites requiring years of development are things of the past. They also become obsolete rather quickly, and technology insertion costs a prohibitive amount. The future is in constellations of smaller satellites that are not necessarily designed for long lifetimes on orbit. Clusters, for example, will replace large-aperture optics. Micro-satellites that perform a single mission at a reasonable price will replace large platforms, which tend to be designed to do many functions to make the investment worthwhile. To operate these distributed systems, we need inter-satellite links, station keeping and, if the constellation actually forms one thinned array, methods of removing its grating lobes.
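To make the grating-lobe issue concrete, here is a minimal Python sketch (all values are illustrative assumptions, not taken from this paper) of the array factor of a sparse linear array; element spacings far beyond half a wavelength produce many beams at the mainlobe level:

    import numpy as np

    wavelength = 0.03                      # assumed X-band wavelength, metres
    d = 50 * wavelength                    # sparse element spacing, far beyond lambda/2
    n_elements = 8                         # assumed number of elements in the line
    angles = np.radians(np.linspace(-5, 5, 2001))

    # Uniform-weight array factor: AF(theta) = |sum_k exp(j*2*pi*k*d*sin(theta)/lambda)|
    k = np.arange(n_elements)
    phase = 2 * np.pi * np.outer(np.sin(angles), k) * d / wavelength
    af = np.abs(np.exp(1j * phase).sum(axis=1))

    # Count distinct beams reaching at least 90% of the mainlobe level:
    # grating lobes appear wherever d*sin(theta) is an integer multiple of lambda.
    strong = af > 0.9 * af.max()
    n_beams = int(strong[0]) + int(np.count_nonzero(np.diff(strong.astype(int)) == 1))
    print(f"{n_beams} mainlobe-level beams within +/-5 degrees")   # 9, not 1

With 50-wavelength spacing the sketch finds nine mainlobe-level beams inside a ten-degree sector, which is exactly the ambiguity that constellation processing must remove.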

Hyperspectral imaging. This is an incredibly rich area of R&D, and with each day users are realizing benefits that were not predicted even five years ago. Better-performing sensors, very fast processors, cheap memory and more robust data-fusion techniques are collectively being brought to bear on this issue. The future is bright and the horizons seem unlimited.

More use of mid-Earth orbits. Space programs have traditionally favored LEO (low Earth orbit, below 1,000 km) or GEO (geosynchronous orbit, 36,000 km) for parking satellites. GEO was chosen for obvious reasons related to the nature of communications satellites. LEO is cheaper to reach, and the satellites can operate on lower power since the distance to Earth is smaller. With MEO (1,000 to 10,000 km), there are the complicating issues of solar ultraviolet radiation and the trapped protons and electrons of the Van Allen belts. But electronics are now better hardened against that radiation. Also, the crowding at GEO and the move to distributed systems have forced a re-examination of MEO. Many believe that this will become the place to be, although EO and RF sensors require additional aperture there, which translates into weight and cost that may make them less attractive. It should also be noted that one could make use of highly elliptical (Molniya) orbits.
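As a quick check on these orbit regimes, the following sketch applies Kepler's third law, T = 2*pi*sqrt(a^3/mu), to representative altitudes (the 800 km LEO altitude is an assumed example; the constants are standard):

    import math

    MU = 3.986e14          # Earth's gravitational parameter, m^3/s^2
    R_EARTH = 6.371e6      # mean Earth radius, m

    for name, alt_km in [("LEO", 800), ("MEO", 8000), ("GEO", 35786)]:
        a = R_EARTH + alt_km * 1e3                       # circular-orbit semi-major axis
        period_h = 2 * math.pi * math.sqrt(a**3 / MU) / 3600
        print(f"{name} ({alt_km:>5} km): period {period_h:5.1f} h")

The periods (roughly 1.7 h at LEO, 4.8 h at this MEO altitude, 23.9 h at GEO) show why MEO constellations trade revisit rate against the aperture penalty discussed above.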

TRENDS IN RADAR SIGNAL AND IMAGE PROCESSING

The efficiency of sensors for reconnaissance, surveillance and target acquisition (RSTA) depends strongly on their processing capabilities. This relates to algorithms, software implementation and digital technology alike. In the past decade, rapid advances were made in all three disciplines. Nevertheless, a large number of unresolved issues await solutions in the decades to come. The key requirements are global surveillance, moving target detection and target identification. The first two require moderate geometrical resolution, whereas the latter needs very high resolution as well as the exploitation of other cues such as polarization. These tasks have to be performed under adverse propagation conditions and in the presence of electronic and physical countermeasures.

The state of the art in moving target detection via STAP (space-time adaptive processing) and ground imaging SAR is illustrated in Figures 1 and 2 [3, 4, 5].
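For readers unfamiliar with the STAP principle (treated in depth in [3]), the optimum space-time weight vector is w = R^(-1) s, where R is the clutter-plus-noise covariance over channels and pulses and s is the target's space-time steering vector. The following minimal numerical sketch uses a surrogate low-rank clutter covariance and an assumed steering vector, purely for illustration, not the processor behind Figure 1:

    import numpy as np

    n_channels, n_pulses = 4, 8
    dim = n_channels * n_pulses
    rng = np.random.default_rng(0)

    # Surrogate covariance: strong low-rank clutter plus unit-power noise (illustrative).
    clutter = rng.standard_normal((dim, 6)) + 1j * rng.standard_normal((dim, 6))
    R = 10 * (clutter @ clutter.conj().T) + np.eye(dim)

    # Assumed target steering vector (normalised space-time frequency 0.15).
    s = np.exp(2j * np.pi * 0.15 * np.arange(dim))

    w = np.linalg.solve(R, s)                     # adapted weights: w = R^(-1) s
    sinr = np.abs(w.conj() @ s) ** 2 / np.real(w.conj() @ R @ w)
    print(f"adapted output SINR metric: {sinr:.2f}")

The solve step places nulls on the clutter subspace while keeping gain on the steering direction, which is the essence of the moving-target-detection gain shown in Figure 1.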

Since signal, image and data processing are enabling technologies, it is impossible to predict the evolution of radar signal processing in isolation. Trends in future processing systems are dictated largely by innovative sensor concepts, which require individual processing architectures and processing power.

Bi- and multistatic systems. Future radar operation will be conducted in the presence of increasing electronic countermeasures and physical threats. At the same time, the cross-sections of radar targets will be reduced significantly. Bi- and multistatic radar configurations offer the potential of radar operation with a covert receiver that is not detectable by passive electronic support measures (ESM). The transmitter should be located in a remote position (e.g., a spaceborne transmitter combined with an airborne receiver). Transmitters of opportunity (e.g., TV stations) may also play a role.
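For reference, the link budget of such a configuration follows the standard bistatic radar equation (textbook form, not derived in this paper), with R_T the transmitter-to-target range, R_R the target-to-receiver range, sigma_B the bistatic cross-section and L the system losses:

    P_r = \frac{P_t \, G_t \, G_r \, \lambda^2 \, \sigma_B}{(4\pi)^3 \, R_T^2 \, R_R^2 \, L}

The R_T^2 R_R^2 product is what makes the remote-transmitter arrangement workable: a covert receiver close to the scene keeps R_R small and recovers much of the sensitivity lost on the long transmit leg.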

The detection of stealthy targets might be improved significantly by bi- or multistatic radar. The design of signal processors for bi- and multistatic radar systems is a special challenge for future system design. So far, there are no efficient focusing algorithms for bistatic SAR image generation, but these can be expected in the foreseeable future. The implementation of STAP or other processing techniques for moving target indication is a specific challenge.

Antenna arrays. Antenna arrays will increasingly support multifunction radar operation (search, track, target identification, guidance, counter-countermeasures) and also handle simultaneous operation of radar, radiometry and communications. Future array antennas for RSTA will require 360° azimuthal coverage, which in some applications may be implemented by conformal arrays. Fully digitised arrays, with the A/D converter directly connected to the sensing antenna element, promise new dimensions in the use of array antenna technology, including robustness against electronic countermeasures. Except for the positioning of the sensing elements, reconfigurable arrays are defined by software only, which offers new dimensions of flexibility for radar and communications antennas [2]. Arrays of small satellites will form new architectures of spaceborne array antennas, with new options in resolution, interferometric imaging and moving ground target detection. Experience with RADARSAT-2 will foster development in this direction [7].

Complex image analysis. Complex image analysis may offer a challenging new way of analysing radar images in detail: in addition to the amplitude, the phase information in a radar image is evaluated [6]. In conjunction with ultra-high-resolution radar, such techniques promise high reliability in target identification.

Real-time processing. All the aforementioned system concepts will need specific signal and data processing schemes for target detection, imaging, tracking and identification. Advanced electronics technology, in conjunction with highly parallel computing algorithms, will yield significant progress in real-time radar operation for global surveillance and target identification.

Summary. Future radar-based observation systems will have greatly increased performance in terms of coverage, resolution, timeliness, covert operation and target classification.

TRENDS IN PASSIVE ELECTRO-OPTICAL SENSING

The sensor material technology in the ultraviolet, visible, and short-, medium-, long- and very-long-wave infrared (SWIR, MWIR, LWIR, VLWIR) bands, and the associated trade-offs, is rather well established. For example, in the IR, reasonably sized focal plane arrays based upon MCT, InSb, PtSi and GaAs/AlGaAs (cooled), as well as microbolometer and ferroelectric approaches (uncooled), are becoming available. Further improvements and even breakthroughs may be expected as our capability for materials modeling is dramatically extended in the next few decades.

In the near future, the emphasis will be on large and very large two-dimensional arrays (eliminating scanners), and on the development of manufacturing processes leading to an affordable cost for the selected sensor geometry and required system performance. Further down the road, multicolor and broadband focal plane arrays are expected to become available, dramatically improving the overall signal (target) to noise (background, clutter, noise) ratio. This will enable sensor fusion capabilities and will provide rich opportunities in the area of multi- and hyperspectral sensing. Equally, hybrid approaches combining passive and active/coherent pixel segments on the same focal plane array may become possible, providing additional orthogonal information, such as imagery with vibration signatures.

Also on the horizon are wavelength-tunable arrays and quantum dots, which may replace quantum wells and offer increased sensitivity and performance.

Another significant step will be attained when on-chip hybrid electronic/optical structures become readily available, allowing on-plane processing and adjustment to the specifically required sensing functionality.

Multiple functionalities, each with its required performance, may also be implemented adaptively (such as laser IFF and peripheral warning). By that time, wherever necessary, appropriate hardening of the sensor system and proper shielding against intentional blinding or destruction will be part of the overall design.

Looking into the rather distant future, it appears possible to combine structures for other sensing modalities on the same focal plane array. Beyond the conventional electro-optical spectrum, we may include millimeter waves and combine IR imaging with mm-wave imagery and information. In addition, we may have online radar and acoustic sensor input, all feeding one integrated processor. This can be done through extensive use of MEMS and MOEMS technology, coupled with advanced ultra-wideband material selection and appropriate geometries. As a result, single integrated multifunctional sensing apertures will become feasible, with significant impact on the reduction of space, weight and power consumption for many UAV and satellite applications.

As sensors and processors become more and more integrated, appropriate hardware and software architectures need to be developed. While a large amount of algorithm development has taken place (preprocessing and signal-conditioning techniques, multispectral change detection, automatic and aided target detection), there is a great need for new ideas and approaches. On the hardware side we may expect, over the next decades, orders-of-magnitude increases in computing speed (projections run as high as 10^35 calculations/sec) and significant increases in memory capacity, providing unprecedented online processing capability. Hardware structures may shrink to matchbox size. "Quantum" computers will be added to today's "ohmic" computers. It has been projected that, if our "biological" knowledge and know-how proceed as envisioned, these computing capabilities will enable the construction of the ultimate sensing robots for many platforms.

The usefulness of this expected giant progress will encompass the following: preplanning and operational support; worldwide (Earth and beyond) surveillance and reconnaissance with online, continuous, large-FOV air asset control; near-"all" target detection, identification and tracking, including against all camouflage and deception techniques; target vulnerability and damage assessment; chemical/biological and advanced nuclear detection; mine and minefield detection; advanced detection and identification of terrorist activities; and more.

TRENDS IN LASER-SENSING TECHNOLOGY

Lasers already probe the environment in several ways today. In the future, they will be used to make multi-dimensional measurements using the principles of radar (i.e., LADAR). They will interact with and stimulate chemicals, pollutants or organisms to emit light by which the material may be identified. Lasers will measure the motion of gases, liquids and solids. They will measure the characteristic vibrations of living or man-made objects. They will measure the density of particulates in air or in liquids. They will be used to detect and identify (IFF) objects through obscuring media (aerosol and foliage penetration).

At present, multi-dimensional measurements are limited by laser power, computer processing power, and the limitations of incoherent and coherent focal plane arrays and their signal processors. As a result, systems are limited in range and in the number of object measurements that can be made in a given amount of time. Today, long-range sensors require bulky and inefficient scanning mechanisms to provide the necessary area coverage. Progress will be made, especially at "eye-safe" wavelengths, as more efficient and powerful lasers are developed. Larger and larger staring detector arrays are being made which can measure time of flight or phase change in each pixel.

Coherent focal plane arrays will be developed that can measure target macro-Doppler (velocity) and micro-Doppler (vibration) in near real time. Computer processing is continually becoming faster and shrinking in size. Within ten years we could see megapixel arrays making real-time multi-dimensional measurements at 10 km with frame rates of 30 frames per second or faster. These systems will be smaller and more efficient than present systems. Over the following decades the pixel rate and range should increase by orders of magnitude, with further decreases in size and weight.
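Putting rough numbers on that projection (the 10 km range, megapixel format and 30 frames/s come from the paragraph above; the rest is standard physics), a short sketch:

    C = 3.0e8                      # speed of light, m/s

    range_m = 10_000               # 10 km stand-off, from the text
    pixels = 1_000_000             # megapixel focal plane, from the text
    frame_rate = 30                # frames per second, from the text

    round_trip_s = 2 * range_m / C                    # light travel time per shot
    max_flash_rate = 1 / round_trip_s                 # ceiling for flash-illuminated frames
    samples_per_s = pixels * frame_rate               # range/Doppler measurements per second

    print(f"round trip: {round_trip_s * 1e6:.0f} us "
          f"(up to {max_flash_rate:.0f} flash frames/s)")
    print(f"implied measurement rate: {samples_per_s:.1e} samples/s")

The 67-microsecond round trip leaves ample margin for 30 frames/s; the real burden is the implied 3e7 measurements per second that the on-chip processors must absorb.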

Lasers can be used to "fingerprint" substances at a distance. This includes remote analysis and tracking of battlefield aerosols, and of man-made and natural agents. In general this will require other classes of lasers and detectors, but it will build on the same advances in computing power as multi-dimensional LADAR. Similar techniques will also evolve for measuring the presence and nature of foreign substances in liquids.

We can expect, within the next 10 years, airborne and spaceborne systems that remotely measure wind velocity profiles. We will be able to measure the velocity vector of objects in space at distances 10 times greater than at present, and with accuracies 10 times better than today. There will be similar but somewhat smaller improvements in the measurement of objects in the atmosphere. Measurements on the surface (land or water) will be limited only by lines of sight, as new techniques using ultra-short pulses and coherent LADAR sensors evolve to penetrate many classes of aerosols.

Laser-based remote sensing will allow highly directional inspection of distant phenomenology at any time of day and under a broad range of atmospheric conditions. Space-based applications will continue to evolve, so that wind sensing will provide the missing element of weather forecasting. The commercial applications of remote laser sensing will outstrip the military ones, and will ultimately drive the advances in this area.

TRENDS IN IMAGE PROCESSING

The ultimate goal of image processing is an understanding of the image contents, or rather of the scene represented by the image, at least as good as the understanding an observer would be able to generate. In limited application areas this is possible, usually in cooperative environments, that is, for instance, with uniform lighting and a limited number of object classes present. Under such circumstances, image processing delivers pure information as output, and automatic vision systems have the well-known advantages, compared to a human operator, of high speed, high reliability, and high availability without fatigue. These features are of central value for military systems, whether humans are present at the same location or whether automatic systems go into unsafe areas or operate under other conditions that do not allow people to be present. Indeed, one trend in image processing is that hardly any innovative defense-related system can be found or imagined that does not contain image processing features.

Military applications usually operate under vastly different lighting conditions and in very different environments: under non-cooperative conditions. Under such conditions one would moreover like image processing techniques to be able to discriminate between friend and foe. As a rule, such decisions require human intelligence to be in the loop somewhere. A prominent example is tracking, where the observer detects and identifies a specific object and marks it in a monitor image. Such an object is then tracked, and weapons can be directed automatically with a precision not attained by a human operator, or autonomous systems can follow the object. This capability is already present in comparatively simple hand-held weapons for air defense, and image processing has changed air force strategy through them; a success story, if not seen from inside a cockpit.

At present, tracking requires an operator to indicate, in the sensor data, the object to be tracked, and hand-over from other sensors is usually limited to comparable sensors. Present research will widen the range of input data, thus allowing remote targeting. It will also widen the range of acceptable variability in background and object appearance, and will allow countermeasures such as flares to be detected and overridden. In a way these trends are just quantitative improvements, but in sum they will allow true fire-and-forget capability as a fundamentally new quality.

Sensor data of unprecedented resolution are available at ever-increasing data rates. This creates an information bottleneck, for instance when downlinking sensor data from unmanned aerial vehicles (UAVs) over limited effective transmission bandwidth. Image compression is already used with success, but these techniques have two fundamental limits: one is that the gain in compression factor with each new generation of software gets smaller; the other is that compression suppresses information which is of low value to a human observer but may be essential for automatic image exploitation. Hence the accessible road to further optimizing the use of transmission bandwidth is not better compression but on-board screening for those images that ought to be transmitted at high resolution.
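A back-of-the-envelope sketch of that bottleneck, in which every figure is an assumption for illustration (sensor format, link rate and compression ratio are not from the paper):

    # Assumed raw sensor output for a high-resolution framing camera.
    frame_px = 4096 * 4096
    bits_per_px = 12
    frames_per_s = 2
    raw_mbps = frame_px * bits_per_px * frames_per_s / 1e6      # ~403 Mbit/s

    link_mbps = 40.0               # assumed effective UAV downlink
    compression = 8                # assumed achievable lossy compression ratio

    print(f"raw: {raw_mbps:.0f} Mbit/s, compressed: {raw_mbps / compression:.0f} Mbit/s, "
          f"link: {link_mbps:.0f} Mbit/s")
    # Even 8:1 compression (~50 Mbit/s) still exceeds the 40 Mbit/s link,
    # whereas transmitting only the screened fraction of frames at full
    # resolution can stay within it.

Under these assumptions, no plausible compression ratio closes the gap, but screening out all except the relevant frames does, which is the argument made above.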

The amount of image data accumulated by optronic and radar sensors is overwhelming, and already today it cannot be exploited fully due to a lack of trained personnel. Image processing for change detection will fill this exploitation gap. This is a complex task, since there are always changes in such image data due to variations in time of day, weather and season, and today screening by humans is in general still more reliable and faster. A trend in image processing is toward software agents that, for instance, perform automatic target recognition and change detection in large data sets generated or stored at remote locations.
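As a minimal illustration of the pixel-level core of such change detection (the agents described above would add registration, normalization and recognition on top of it), a sketch on synthetic images:

    import numpy as np

    def change_mask(img_a, img_b, k=3.0):
        """Flag pixels whose difference deviates more than k sigma from the mean."""
        diff = img_b.astype(float) - img_a.astype(float)
        return np.abs(diff - diff.mean()) > k * diff.std()

    rng = np.random.default_rng(1)
    before = rng.normal(100.0, 5.0, (64, 64))          # synthetic "before" image
    after = before + rng.normal(0.0, 1.0, before.shape)  # re-acquisition noise
    after[20:24, 30:34] += 40.0                        # insert a small bright "target"
    print(f"changed pixels flagged: {change_mask(before, after).sum()}")   # 16

The hard part in practice is exactly what the paragraph notes: daytime, weather and seasonal variation inflate the difference statistics, so robust normalization is needed before any threshold is meaningful.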

While the largest amount of image data comes from two-dimensional projections, as delivered by video cameras or infrared sensors, active systems using laser radiation deliver three-dimensional (3D) data. Such data allow better recognition of patterns of interest; already today, laser radar data can be used in real time to detect obstacles for helicopters, such as power lines. At present, laser radar data are evaluated for the closest and highest elevations, and the pilot is asked to keep the position of the cross in the monitor image above the displayed limiting line. The cross points in the flight direction, and the limiting elevation line is drawn directly into the video image (upper right of Figure 3). The pilot flies by the monitor image and his own orientation experience. Alternatively, a line is calculated from the evaluation and the helicopter flight data; in this case the pilot can keep the flight steering as it is whenever the cross is above the line, which now does not have any direct meaning in the monitor image (upper left).

The disadvantage of both procedures lies in the fact that harmless objects like birds might influence these lines. The pilot will thus have to verify the obstacle, for instance before descending to an otherwise unsafe altitude. This can only be done by looking out of the window, and it may be difficult to locate and identify the object in this view.

Hence a new solution was sought and found through pattern recognition in the 3D data. The data for the obstacle can be filtered and displayed in the 2D monitor image (lower left of Figure 3, power cable and pole); with this aid the pilot can identify the object on the monitor and react quickly without taking his eyes from it, or orient himself at once in the real scene if desired. The trend is to do sensor fusion in such a way as to add information in a natural form that does not require substantial operator training. The lower left section also shows a map-like view indicating the sensor field of view (triangle, with the helicopter location at the center), with the short newly identified sector of power line, and another line behind and to the left of the helicopter that had been recorded during prior maneuvers.

As a rule, such fusion of image data requires inferring 3D data from the sensor data, or from the data of multiple sensors. Such 3D data can be stored and used to form a 2D projection that looks like a monitor image from a given perspective. Such procedures are presently used in simulation, for instance to assess sensor concepts. This is demonstrated in Figure 4, which shows how a simulated scene builds up (from the left) from elevation data, superimposed with map data, then visible and infrared images (which might, for instance, be taken from a UAV), and finally with artificial objects added, like the house in the strip at the right, which was not present at full resolution in the original data but whose shape could be inferred. The calculation of such data is at present tedious and not feasible in real time, so it would have to be prepared on the ground before a plane or helicopter departs. The trend is to investigate automatic procedures that can do the job in real time, for instance with fresh image data coming in from on-board and external sensors, to generate something like "real virtual reality".
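The 2D projection step itself is simple; the costly part is building the 3D scene. A minimal pinhole-camera sketch (assumed focal length and image centre, hypothetical scene points) of projecting stored 3D data into a monitor-like view:

    import numpy as np

    def project(points_xyz, focal_px=800.0, centre=(320.0, 240.0)):
        """Project camera-frame 3D points (x right, y down, z forward) to pixels."""
        x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
        u = focal_px * x / z + centre[0]
        v = focal_px * y / z + centre[1]
        return np.stack([u, v], axis=1)

    # Hypothetical corner points of a house facade, 50 m in front of the sensor.
    facade = np.array([[-5.0, 2.0, 50.0], [5.0, 2.0, 50.0],
                       [-5.0, -6.0, 50.0], [5.0, -6.0, 50.0]])
    print(project(facade))          # pixel coordinates of the four corners

Rendering a full scene this way, for every frame and from a freely moving viewpoint, is what makes real-time "real virtual reality" a processing challenge rather than a geometry problem.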

TRENDS IN INFORMATION-CENTRIC SENSOR MANAGEMENT

Within future network-centric battlefield environments, advanced sensors, sensor suites and processors will provide the dominant data inputs for 21st-century battlefield situation awareness. Each soldier and platform will need access to the same data and a common picture of the battlefield to ensure better planning and more effective execution; to reduce the likelihood of fratricide and collateral damage; and to improve situation awareness so as to protect forward-deployed forces. In a very fluid, asymmetrical battlefield, situation awareness and sensor integration will become extremely difficult to achieve and maintain. Overall, a more robust, integrated sensor system will be required, not only to improve situation awareness but also targeting and interoperability.

Although “network-centric” thinking is now well established at the conceptual and viewgraph level, it is far from institutionalized in an operational sense. There are many implications when implementing Network Centric Warfare, including:


• Each battlefield entity is dependent upon and relies primarily upon “The Network” to survive and accomplish its mission.

• Each battlefield entity contributes information to the Network in accord with its sensor capabilities.

• Network Centric Warfare attributes are independent of Platforms, Weapons, Force Structure, and Scenarios.

In a Network Centric environment it will be exceedingly difficult to separate the Information Systems from the Sensor Systems and Weapon Systems. Additionally, sensors, sensing, and sensor networks will be essential to achieve and to fully implement Network Centric Warfare. Sensors must provide:

• Intelligence, resulting in an in-depth understanding of a situation or entity.

• Situation awareness, providing full knowledge of own location, friendly forces, and enemy forces.

• Targeting and Target Acquisition.

• Support for very precise/selective attack.

• Attack of mobile, time critical targets.

• Damage assessment and re-strike capabilities.

There will be many resulting implications of Network Centric Warfare for sensors, including:

• Greater reliance on sensors:

- With fewer soldiers, sensors will be used to acquire information (environment and enemy).

- Commanders will rely more heavily on sensor data.

• There will be more remote sensor operations:

- Most sensors will operate remotely, either autonomously or controlled over the network.

- Sensor information will have to be transported to other locations over the network.

- Sensor information could become the major consumer of network resources.

- Network transport may introduce some degradation in the initial sensor quality.

• Sensors will be networked:

- Distributed sensors will function as a group.

- Automatic cross cueing will improve location accuracy, tracking, and probability of identification.

- Sensor & target level fusion will reduce false target alarm rates and improve detection / identification.

• There will be more “sharing” of sensor assets:

- Sensor tasking and prioritization will have to be addressed and resolved.

- Additional security will be needed to prevent compromise of sensors.

- Multiple warfighters will need direct access to sensor information.

- Information overload could become a major problem.

- Tools for automatic deconfliction, fusion, and simplification will become more important.

• Expanded areas of sensor coverage will be essential:

- Lighter forces will seek to trade distance for armor.

- There will be more capable sensors and more available sensors.

- Direct access to sensor information will be required across echelon lines.

Sensors will become smaller and less expensive, and will connect seamlessly via wireless means. Additionally, multiple types of sensors will be deployed together, providing broader and more complete coverage.

Overall, the management of the sensor assets will become more critical, especially in the areas of sensor placement, tasking, movement, and sustainment.

Future information-centric sensor management must be able to operate at the tempo and fluidity of future conflicts. Its characteristics must include the following (a minimal sketch of the extendable, self-adjusting behaviour follows the list):

• Implemented in open architectures.

• Adaptable to changing situations.

• Implemented with an adaptive, multi-layered architecture, managing a network of networks.

• Extendable, so that sensor sources and actors can be easily added and removed without having to stop and restart.

• Scalable, so that it can work with potentially thousands of sensor sources and actors.

• Self-synchronising, so that it automatically adjusts itself to any changes.

• Self-repairable, so that it can continue to function even if it is damaged, broken up into smaller grids, or has new grids connected to it.

• Adaptable to changes in available bandwidth.

• Reliable and secure, end-to-end.

• Capable of prioritizing and putting a “value” on the required data.

• Providing information encryption and validation.
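As a minimal sketch of the extendable, self-adjusting behaviour in this list (hypothetical class and method names; a real system would add the security, prioritization and deconfliction items above), a publish/subscribe core in which sensor sources and actors attach and drop at runtime:

    from collections import defaultdict

    class SensorNetwork:
        """Route reports from named sensor topics to any subscribed actors."""

        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, actor):
            self._subscribers[topic].append(actor)        # add an actor without a restart

        def unsubscribe(self, topic, actor):
            self._subscribers[topic].remove(actor)        # drop an actor without a restart

        def publish(self, topic, report):
            for actor in list(self._subscribers[topic]):  # fan out to all consumers
                actor(report)

    net = SensorNetwork()
    net.subscribe("gmti", lambda r: print("tracker received:", r))
    net.publish("gmti", {"track_id": 17, "velocity_mps": 12.4})

Decoupling producers from consumers in this way is what makes the extendability and scalability requirements tractable: neither side needs to know the other's identity or count.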

Although future Network Centric Warfare certainly has the potential to maximize the effectiveness of joint and coalition forces on the battlefield, the underlying "info-structure" - the information, information flow, networking and management - is imperative for the success of operations. Extremely robust, integrated and ubiquitous sensor management, with integrated sensor systems of systems, will be the key to ensuring that the warfighter has the vital intelligence and situation awareness to simultaneously protect the force and engage the enemy with targeting, target acquisition and precision engagement.

REFERENCES

[1] Canadian Aerospace Capability Framework, 2003.

[2] Younis, M., Fischer, C., Wiesbeck, W.: "An Evaluation of Performance Parameters of Software Defined Radar Sensors". EUSAR 2002, 4-6 June 2002, Cologne, Germany, pp. 191-194.

[3] Klemm, R.: "Principles of Space-Time Adaptive Processing". IEE Publishers, London, 2002.

[4] Brenner, A. R., Ender, J. H. G.: "First Experimental Results Achieved with the New Very Wideband System PAMIR". EUSAR 2002, 4-6 June 2002, Cologne, Germany, pp. 81-84.

[5] Ender, J. H. G.: "Space-Time Processing for Multichannel Synthetic Aperture Radar". IEE ECEJ, Vol. 11, No. 1, February 1999, pp. 29-38.

[6] Hershkowitz, S. J., Rihaczek, A. W.: "Radar Resolution and Complex Image Analysis". Artech House, 2000.

[7] Evans, N. B., Lee, P., Girard, R.: "The RADARSAT-2&3 Topographic Mission". EUSAR 2002, 4-6 June 2002, Cologne, Germany, pp. 37-39.

Figure 1 - Moving target detection: SAR image (colours denote radial velocity) showing highway traffic (courtesy J. Ender, FGAN)


Figure 2 - High-resolution (10x10 cm) SAR image of the Reichstag in Berlin (courtesy A. Brenner, FGAN)

Figure 3 - Monitor views for a helicopter obstacle warning system (taken from W. Armbruster, "Hinderniswarnung und automatische Hindernisvermeidung für Hubschrauber", Proc. Artificial Intelligence 9, Ulm, 2000, pp. 129-134)


Figure 4 - Scene build-up from left to right: elevation data, map data, visible and infrared images, and with artificial objects added. Provided by E. Repasi, FGAN-FOM, Ettlingen, Germany.
