Information Technology and Software
NASA develops information technology and software to support a wide range of activities, including mission planning and operations, data analysis and visualization, and communication and collaboration. These technologies also play a key role in the development of advanced scientific instruments and spacecraft, as well as in the management of NASA's complex organizational and technical systems. By leveraging NASA's advances in IT and software, your business can gain a competitive edge.
MERRA/AS and Climate Analytics-as-a-Service (CAaaS)
NASA Goddard Space Flight Center now offers a new capability for meeting this Big Data challenge: MERRA Analytic Services (MERRA/AS). MERRA/AS combines the power of high-performance computing, storage-side analytics, and web APIs to dramatically improve customer access to MERRA data. It represents NASA's first effort to provide Climate Analytics-as-a-Service.
Retrospective analyses (or reanalyses) such as MERRA have long been important to scientists doing climate change research. MERRA is produced by NASA's Global Modeling and Assimilation Office (GMAO), a component of the Earth Sciences Division in Goddard's Sciences and Exploration Directorate. GMAO's research and development activities aim to maximize the impact of satellite observations on climate, weather, atmospheric, and land prediction using global models and data assimilation. These products are becoming increasingly important to application areas beyond traditional climate science.
MERRA/AS provides a new cloud-based approach to storing and accessing the MERRA dataset. By combining high-performance computing, MapReduce analytics, and NASA's Climate Data Services API (CDS API), MERRA/AS moves much of the work traditionally done on the client side to the server side, close to the data and close to large compute power. This reduces the need for large data transfers and provides a platform to support complex server-side data analyses: it enables Climate Analytics-as-a-Service.
MERRA/AS currently implements a set of commonly used operations (such as avg, min, and max) over all the MERRA variables. Of particular interest to many applications is a core collection of about two dozen MERRA land variables (such as humidity, precipitation, evaporation, and temperature). Using the RESTful services of the Climate Data Services API, it is now easy to extract basic historical climatology information about places and time spans of interest anywhere in the world.
Since the CDS API is extensible, the community can participate in MERRA/AS's development by contributing new and more complex analytics to the MERRA/AS service. MERRA/AS demonstrates the power of CAaaS and advances NASA's ability to connect data, science, computational resources, and expertise to the many customers and applications it serves.
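The kind of server-side request described above can be sketched as follows. The endpoint structure, parameter names, and variable code below are illustrative assumptions for demonstration, not the actual CDS API contract:

```python
# Hypothetical sketch of assembling a MERRA/AS request for a RESTful
# climate-analytics service. Parameter names and the variable code
# "T2M" are assumptions, not the real CDS API specification.

def build_merra_query(operation, variable, start, end, bbox):
    """Assemble query parameters for a server-side MERRA/AS operation
    (e.g. avg, min, max) over a place and time span of interest."""
    min_lon, min_lat, max_lon, max_lat = bbox
    return {
        "service": "MAS",        # MERRA Analytic Services
        "operation": operation,  # avg | min | max
        "variable": variable,    # e.g. "T2M" (2-meter air temperature)
        "start_date": start,
        "end_date": end,
        "min_lon": min_lon, "min_lat": min_lat,
        "max_lon": max_lon, "max_lat": max_lat,
    }

# Average 2-meter temperature over a box around Washington, DC, for 2010:
params = build_merra_query("avg", "T2M", "20100101", "20101231",
                           (-78.0, 38.0, -76.0, 40.0))
```

Because the operation runs next to the data, only the small result, rather than the underlying terabytes, crosses the network.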
The Hilbert-Huang Transform Real-Time Data Processing System
The present innovation is an engineering tool known as the HHT Data Processing System (HHTDPS). The HHTDPS allows applying the Transform, or 'T,' to a data vector in a fashion similar to the heritage FFT. It is a generic, low-cost, high-performance personal computer (PC) based system that implements the HHT computational algorithms in a user-friendly, file-driven environment.
Unlike other signal processing techniques such as the Fast Fourier Transform (FFT), which assume signal linearity and stationarity, the Hilbert-Huang Transform (HHT) uses relationships between arbitrary signals and their local extrema to find the signal's instantaneous spectral representation. Using Empirical Mode Decomposition (EMD) followed by the Hilbert Transform of the decomposition data, the HHT enables spectrum analysis of nonlinear and nonstationary data through a-posteriori, data-driven processing based on the EMD algorithm. The result is an unconstrained decomposition of a real-valued source data vector into a finite set of Intrinsic Mode Functions (IMFs) that can be further analyzed for spectrum interpretation by the classical Hilbert Transform.
The HHTDPS has a large variety of applications and has been used in several NASA science missions. NASA cosmology missions, such as the Joint Dark Energy Mission (JDEM/WFIRST), carry instruments with multiple focal planes populated with many large sensor detector arrays whose readout electronics must perform at extremely low noise levels. A new methodology and implementation platform using the HHTDPS for readout noise reduction in large IR/CMOS hybrid sensors was developed at NASA Goddard Space Flight Center (GSFC). Scientists at NASA GSFC have also used the algorithm to produce the first known Hilbert-Transform-based wide-field broadband data cube constructed from actual interferometric data.
Furthermore, the HHT has been used to improve signal reception in radio frequency (RF) communications. This NASA technology is currently available to the medical community to help diagnose and predict syndromes that affect the brain, such as stroke, dementia, and traumatic brain injury. The HHTDPS is available for non-exclusive and partial field-of-use licenses.
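The core EMD step described above can be illustrated with a minimal sketch. This shows a single "sifting" pass toward the first IMF; a production HHT iterates sifting until a stopping criterion is met, uses cubic-spline (not linear) envelopes, and then applies the Hilbert transform to each IMF. All function names here are illustrative:

```python
# Minimal pure-Python sketch of one EMD sifting step, the heart of the
# Hilbert-Huang Transform. Linear envelopes are used for brevity.
import math

def local_extrema(x):
    """Return indices of local maxima and minima of a sequence."""
    maxima, minima = [], []
    for i in range(1, len(x) - 1):
        if x[i] > x[i - 1] and x[i] > x[i + 1]:
            maxima.append(i)
        elif x[i] < x[i - 1] and x[i] < x[i + 1]:
            minima.append(i)
    return maxima, minima

def envelope(indices, values, n):
    """Piecewise-linear envelope through (index, value) knots,
    clamped outside the first and last knot."""
    env = []
    for i in range(n):
        if i <= indices[0]:
            env.append(values[0])
        elif i >= indices[-1]:
            env.append(values[-1])
        else:
            for k in range(len(indices) - 1):
                if indices[k] <= i <= indices[k + 1]:
                    t = (i - indices[k]) / (indices[k + 1] - indices[k])
                    env.append(values[k] + t * (values[k + 1] - values[k]))
                    break
    return env

def sift_once(x):
    """One sifting pass: subtract the mean of the upper and lower
    envelopes, pushing the signal toward an IMF (locally zero-mean)."""
    maxima, minima = local_extrema(x)
    upper = envelope(maxima, [x[i] for i in maxima], len(x))
    lower = envelope(minima, [x[i] for i in minima], len(x))
    return [xi - (u + l) / 2 for xi, u, l in zip(x, upper, lower)]

# Example: a fast oscillation riding on a slow linear trend. Sifting
# strips the trend (captured by the envelope mean), leaving the
# oscillatory mode as the IMF candidate.
signal = [math.sin(0.9 * i) + 0.05 * i for i in range(100)]
imf_candidate = sift_once(signal)
```

Note how the nonstationary trend is removed without any assumption of linearity or stationarity, which is what distinguishes EMD from the FFT.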
Enhanced Project Management Tool
A searchable skill set module lists the name of each worker employed by the company and/or by one or more companies that contract services for the company, along with the skills each such worker possesses. When the system receives a description of a skill set needed for a project, the skill set module is queried, and the name and relevant skill(s) of each worker who has at least one skill in the received list are displayed in a visually perceptible format.
Provisions are included for customizing and linking, where feasible, a subset of reports and accompanying illustrations for a particular user, and for adding or deleting other reports as needed. This allows a user to focus on the reports of immediate concern and to avoid sorting through reports and related information that is not of concern. Implementation of the separate-storage option would allow most or all users who have review access to a document to write, edit, and otherwise modify the original version, with the modified version stored only in the user's own memory space. Where a user who does not have at least review access to a report explicitly requests that report, the system optionally informs the user of the lack of access and recommends contacting the system administrator. The system optionally stores preceding versions of a present report for the preceding N periods for historical purposes. The comparative analysis includes an ability to retrieve and reformat numerical data for a contemplated comparison.
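The skill-set query behaves like the sketch below. The data layout, names, and function are illustrative assumptions; the patented system also covers contractor rosters and the report-access features described above:

```python
# Illustrative sketch of the searchable skill-set module: given a
# needed skill set, return each worker holding at least one of those
# skills, with the relevant skills, for display to the planner.

workers = {
    "A. Rivera": {"systems engineering", "python", "risk analysis"},
    "B. Chen":   {"thermal analysis", "matlab"},
    "C. Okafor": {"python", "project scheduling"},
}

def match_skills(needed):
    """Query the skill-set module for workers matching any needed skill."""
    needed = set(needed)
    return {name: sorted(skills & needed)          # only relevant skills
            for name, skills in workers.items()
            if skills & needed}                    # at least one match

hits = match_skills({"python", "risk analysis"})
```

A worker with no matching skill (here "B. Chen") is simply omitted from the displayed result.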
The Estimated Spectrum Adaptive Postfilter (ESAP)
ESAP looks into the decoded JPEG image and determines the location of its edges. An edge is a spatial transition, say, from a human face to a landscape background, or the silhouette or cartoon of a person or object. In the areas away from the edges, ESAP performs adaptive filtering to remove the pixelation (or blocking artifacts) created by high JPEG compression. In general, ESAP improves both the subjective visual quality of highly compressed JPEG images and their objective quality measure, the Peak Signal-to-Noise Ratio (PSNR), relative to baseline JPEG images. It does this without requiring any additional overhead in the data stream.
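The edge-aware principle can be shown with a simplified 1-D sketch: smooth only where the local gradient is small, so blocking artifacts are removed while genuine edges stay sharp. The threshold and filter shape below are illustrative; ESAP actually adapts its filter to the estimated spectrum of the decoded image:

```python
# Simplified 1-D sketch of edge-aware deblocking in the spirit of ESAP.
# Pixels near a strong gradient (an edge) are left alone; elsewhere a
# small low-pass kernel removes the blocking step.

def deblock_row(row, edge_threshold=40):
    out = list(row)
    for i in range(1, len(row) - 1):
        grad = abs(row[i + 1] - row[i - 1])   # local gradient estimate
        if grad < edge_threshold:             # not an edge: smooth
            out[i] = (row[i - 1] + 2 * row[i] + row[i + 1]) // 4
    return out

# A flat region with a small blocking step (100 -> 108), then a genuine
# edge (108 -> 200) that must be preserved:
row = [100, 100, 100, 108, 108, 108, 200, 200, 200]
smoothed = deblock_row(row)
```

The small 100-to-108 block boundary is softened, while the 108-to-200 edge passes through untouched.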
Integrated Genomic and Proteomic Information Security Protocol
NASA GSFC has developed a cybersecurity protocol consisting of message exchanges that use message authentication codes and encryption codes derived, for key generation, from the genetic encoding system: proteins and the processes of transcription and translation of DNA and RNA into proteins. These are used in conjunction with the existing principles of a public key infrastructure and traditional encryption and authentication algorithms and processes. This security protocol requires a cryptanalysis infrastructure not available to most attackers. By using the processes of transcription and translation of actual genes (referred to as biogenes) in conjunction with a genomic- and proteomic-based encryption and authentication approach, security is achieved in two domains at once; an attacker must breach both domains simultaneously to mount a successful network attack.
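A hedged sketch of the key-generation idea follows: a shared "biogene" DNA sequence is transcribed (DNA to RNA) and translated (RNA to protein), and the resulting protein string is hashed into key material. The gene, the codon subset, and the hashing step are all illustrative assumptions; the actual protocol layers this onto a public key infrastructure and conventional ciphers:

```python
# Illustrative biogene-based key derivation. The codon table is a small
# subset of the standard genetic code; the SHA-256 step is a stand-in
# for whatever key-derivation function the real protocol specifies.
import hashlib

CODON_TABLE = {
    "AUG": "M", "UUU": "F", "UUC": "F",
    "GGU": "G", "GGC": "G", "GCA": "A", "GCC": "A",
}

def transcribe(dna):
    """DNA coding strand -> messenger RNA (T replaced by U)."""
    return dna.replace("T", "U")

def translate(rna):
    """RNA -> protein: read codons three bases at a time."""
    return "".join(CODON_TABLE.get(rna[i:i + 3], "?")
                   for i in range(0, len(rna) - 2, 3))

def derive_key(dna):
    """Hash the translated protein into 256 bits of key material."""
    protein = translate(transcribe(dna))
    return hashlib.sha256(protein.encode()).digest()

key = derive_key("ATGTTTGGCGCA")   # biogene -> protein "MFGA" -> key
```

Both endpoints holding the same biogene derive the same key, while an attacker must break both the biological encoding and the conventional cryptography.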
First Stage Bootloader
The First Stage Bootloader reads software images from flash memory, using redundant copies of the images, and launches the operating system. It finds a valid copy of the OS image and the RAM filesystem image in flash memory. If none of the copies is fully valid, the bootloader attempts to construct a fully valid image by validating the images in small sections and piecing together validated sections from multiple copies. Periodically throughout this process, the First Stage Bootloader restarts the watchdog timer.
The First Stage Bootloader reads a boot table from a default location in flash memory. This boot table describes where to find the OS image and its supporting RAM filesystem image in flash. The bootloader uses header information and a checksum to validate the table. If the table is corrupt, it reads the next copy until it finds a valid table. There can be many copies of the table in flash, and all will be read if necessary.
The First Stage Bootloader then reads the RAM filesystem image into memory and validates its contents. As with the boot table, if one copy of the image is corrupt, it reads the remaining copies until it finds one with a valid header. If it doesn't find a valid copy, it breaks the image down into smaller sections; for each section, it checks each copy until it finds a valid copy of the section and copies the valid section into a new copy of the image.
Finally, the First Stage Bootloader reads the OS image and interprets it. If anything in the image is corrupt, it reads the remaining copies until it finds a fully valid copy. If no copy is fully valid, it uses individual valid records from multiple copies to create a fully valid image.
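The section-level recovery step can be sketched as follows. The checksum and the tiny section size are illustrative stand-ins for the bootloader's actual header-and-checksum scheme:

```python
# Sketch of stitching a valid image from multiple partly corrupt flash
# copies: validate each fixed-size section against a known-good
# checksum, taking the section from whichever copy passes.

SECTION = 4  # bytes per section (tiny, for illustration)

def checksum(data):
    """Toy 8-bit additive checksum standing in for the real scheme."""
    return sum(data) & 0xFF

def assemble(copies, good):
    """copies: flash copies as byte strings; good: reference checksum
    per section. Returns a stitched image, or None if some section is
    corrupt in every copy."""
    image = bytearray()
    for s in range(len(good)):
        lo, hi = s * SECTION, (s + 1) * SECTION
        for copy in copies:                 # try each flash copy
            if checksum(copy[lo:hi]) == good[s]:
                image += copy[lo:hi]
                break
        else:
            return None                     # unrecoverable section
    return bytes(image)

original = b"KERNEL__RAMFS___"
good = [checksum(original[i:i + SECTION]) for i in range(0, 16, SECTION)]
copy_a = b"KERNELxxRAMFS___"   # section 1 corrupt
copy_b = b"KERNEL__yyMFS___"   # section 2 corrupt
recovered = assemble([copy_a, copy_b], good)
```

Although neither copy is fully valid on its own, the bootloader still recovers a byte-exact image, which is what lets the system boot despite scattered flash corruption.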
Intelligent Conversational Research Assistant
Previous virtual digital assistants (e.g., Siri®, Google Assistant®, Cortana®, Alexa®) allow users to speak requests to computers or mobile phones, which then perform speech-to-text recognition and execute web searches based on the content of the request. However, such systems cannot provide answers that are geospatially and space-time aware, for example, what the weather was like in San Francisco three days ago. In addition, typical virtual assistants only allow community users to indirectly influence system behavior through their queries.
MATA provides a conversational assistant and associated computing to turn 40 years and hundreds of terabytes of NASA Earth science data into usable knowledge. The system engages with users in an integrated, conversational manner using natural language dialogue and invokes external web services, when appropriate, to obtain information and perform various actions on a variety of satellite and geospatial data, providing spatio-temporally aware answers. MATA is conversational computing, not just a conversational assistant. Broadly, this technology takes a query, translates that query into an intent, and that intent invokes the right capability, which in turn invokes the right APIs for computations that take milliseconds. MATA does not simply retrieve an answer from a database; it intelligently answers a user's question within its specific context.
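The query-to-intent-to-capability pipeline can be sketched as below. The intent name, pattern, and handler are hypothetical; MATA's actual pipeline uses natural-language understanding and calls external web services over satellite and geospatial data:

```python
# Toy sketch of the query -> intent -> capability routing described
# above, for a single space-time-aware intent.
import re

def classify(query):
    """Map a natural-language query to an intent with filled slots."""
    m = re.search(r"weather .* in (\w[\w ]*) (\d+) days ago", query.lower())
    if m:
        return {"intent": "historical_weather",
                "place": m.group(1).strip(), "days_ago": int(m.group(2))}
    return {"intent": "unknown"}

def dispatch(intent):
    """Route the intent to the capability (data service stubbed here)."""
    if intent["intent"] == "historical_weather":
        # a real system would invoke a geospatial data API at this point
        return (f"Fetching weather for {intent['place']}, "
                f"{intent['days_ago']} days back")
    return "Sorry, I can't answer that yet."

answer = dispatch(classify(
    "What was the weather like in San Francisco 3 days ago?"))
```

The key point is that the place and time offset become structured slots, so the capability can issue a precise spatio-temporal data request rather than a generic web search.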
Radiation Hardened 10BASE-T Ethernet Physical Interface
Currently there is no radiation-hardened Ethernet interface device or circuit available commercially. In this Ethernet solution, the portion of the PHY in the FPGA is responsible for meeting the IEEE 802.3 protocol, decoding received packets and link pulses, and encoding transmitted data packets. The decoded payload data is sent to a user interface internal to the FPGA, which sends data for transmission back to the FPGA PHY. The transmit portion is composed of two AD844 op amps from Analog Devices with appropriate filtering. The receive portion is composed of a transformer, an Aeroflex Low-Voltage Differential Multi-drop device, and appropriate filtering.
Context Based Configuration Management System
Context Based Configuration Management (CBCM) is a hybrid tool suite that directly supports a dynamic, distributed strategic planning and decision-making environment. The CBCM system marries Decision Map technology with Commercial Off-the-Shelf (COTS) configuration management workflow (Xerox Docushare) and embedded component models (event models, configuration item models, and feedback models), all on top of a web-based online collaboration technology (e.g., the NASA/Xerox Netmark middleware engine). CBCM drives an enterprise configuration management system with built-in analysis functions that is tightly interoperable with other enterprise management systems (through middleware connections), delivers integrated reports, enables enterprise-wide input on decisions, actions, and events, and presents context-based, query-driven views of configuration management information.
The theory of operation flows from the most senior level of decision making, which creates a master Configuration Decision Map. This map tracks events, configuration objects, and contexts. Additional Configuration Decision Maps can be created independently and/or derived from the master map by successive organizations and/or individuals across the enterprise. The integrated icon-objects have intelligent triggers embedded within them, configurable by users to provide automatic analysis, forecasts, and reports. All information is stored in an object-relational database that provides robust query and reporting tools to help analyze and support past and current decisions, as well as track the enterprise baseline and future potential vectors.