Information Technology and Software
The development, implementation and maintenance of computer hardware and software systems to produce, store, organize, analyze, model, simulate and communicate information electronically.
MERRA/AS and Climate Analytics-as-a-Service (CAaaS)
NASA Goddard Space Flight Center now offers a new capability for meeting this Big Data challenge: MERRA Analytic Services (MERRA/AS). MERRA/AS combines the power of high-performance computing, storage-side analytics, and web APIs to dramatically improve customer access to MERRA data. It represents NASA's first effort to provide Climate Analytics-as-a-Service.

Retrospective analyses (or reanalyses) such as MERRA have long been important to scientists doing climate change research. MERRA is produced by NASA's Global Modeling and Assimilation Office (GMAO), a component of the Earth Sciences Division in Goddard's Sciences and Exploration Directorate. GMAO's research and development activities aim to maximize the impact of satellite observations on climate, weather, atmospheric, and land prediction using global models and data assimilation. These products are becoming increasingly important to application areas beyond traditional climate science.

MERRA/AS provides a new cloud-based approach to storing and accessing the MERRA dataset. By combining high-performance computing, MapReduce analytics, and NASA's Climate Data Services API (CDS API), MERRA/AS moves much of the work traditionally done on the client side to the server side, close to the data and close to large compute power. This reduces the need for large data transfers and provides a platform to support complex server-side data analyses: it enables Climate Analytics-as-a-Service.

MERRA/AS currently implements a set of commonly used operations (such as avg, min, and max) over all the MERRA variables. Of particular interest to many applications is a core collection of about two dozen MERRA land variables (such as humidity, precipitation, evaporation, and temperature). Using the RESTful services of the Climate Data Services API, it is now easy to extract basic historical climatology information about places and time spans of interest anywhere in the world.
Since the CDS API is extensible, the community can participate in MERRA/AS's development by contributing new and more complex analytics to the MERRA/AS service. MERRA/AS demonstrates the power of CAaaS and advances NASA's ability to connect data, science, computational resources, and expertise to the many customers and applications it serves.
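The storage-side analytics idea can be illustrated with a minimal sketch of the MapReduce pattern behind an operation like avg: mappers condense each time slice to partial sums per grid cell, and a reducer merges them, so only a small result, not the raw data, ever leaves the server. The grid cells and values below are invented for illustration and do not reflect the actual MERRA layout or the CDS API.

```python
# Illustrative MapReduce sketch (not the MERRA/AS implementation) of a
# storage-side "avg" operation over gridded climate data.

def map_partial(chunk):
    """Map step: reduce one time-slice chunk to (sum, count) per grid cell."""
    sums, counts = {}, {}
    for cell, value in chunk.items():
        sums[cell] = sums.get(cell, 0.0) + value
        counts[cell] = counts.get(cell, 0) + 1
    return sums, counts

def reduce_average(partials):
    """Reduce step: merge partial (sum, count) pairs into per-cell means."""
    total_s, total_c = {}, {}
    for sums, counts in partials:
        for cell, s in sums.items():
            total_s[cell] = total_s.get(cell, 0.0) + s
            total_c[cell] = total_c.get(cell, 0) + counts[cell]
    return {cell: total_s[cell] / total_c[cell] for cell in total_s}

# Two time slices of a hypothetical temperature variable (Kelvin) on a
# 2-cell (lat, lon) grid:
jan = {(38.9, -77.0): 275.0, (39.0, -77.1): 274.0}
feb = {(38.9, -77.0): 277.0, (39.0, -77.1): 276.0}
climatology = reduce_average([map_partial(jan), map_partial(feb)])
```

Because the partial results commute, the map step can run in parallel across storage nodes, which is what makes the pattern attractive close to the data.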
The Hilbert-Huang Transform Real-Time Data Processing System
The present innovation is an engineering tool known as the HHT Data Processing System (HHTDPS). The HHTDPS allows applying the Transform, or 'T,' to a data vector in a fashion similar to the heritage FFT. It is a generic, low-cost, high-performance personal computer (PC) based system that implements the HHT computational algorithms in a user-friendly, file-driven environment.

Unlike other signal processing techniques such as the Fast Fourier Transform (FFT), which assume signal linearity and stationarity, the Hilbert-Huang Transform (HHT) uses relationships between arbitrary signals and their local extrema to find the signal's instantaneous spectral representation. Using Empirical Mode Decomposition (EMD) followed by the Hilbert Transform of the decomposed data, the HHT permits spectrum analysis of nonlinear and nonstationary data through an a posteriori, data-driven process built on the EMD algorithm. The result is an unconstrained decomposition of a real-valued source data vector into a finite set of Intrinsic Mode Functions (IMFs) that can then be analyzed for spectrum interpretation by the classical Hilbert Transform.

The HHTDPS has a large variety of applications and has been used in several NASA science missions. NASA cosmology missions, such as the Joint Dark Energy Mission (JDEM/WFIRST), carry instruments with multiple focal planes populated with many large sensor detector arrays whose readout electronics must perform at extremely low noise levels. A new methodology and implementation platform using the HHTDPS for readout noise reduction in large IR/CMOS hybrid sensors was developed at NASA Goddard Space Flight Center (GSFC). Scientists at NASA GSFC have also used the algorithm to produce the first known Hilbert-Transform-based wide-field broadband data cube constructed from actual interferometric data.
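The heart of EMD is the sifting step: form upper and lower envelopes through the signal's local maxima and minima, subtract their mean, and repeat until an IMF emerges. The sketch below shows a single, heavily simplified sifting iteration with linear envelope interpolation; production EMD implementations (and the HHTDPS) use cubic-spline envelopes, stopping criteria, and repeated sifting, none of which are modeled here.

```python
# Simplified single EMD sifting step (linear envelopes, for illustration only).
import math

def local_extrema(x):
    """Indices of strict local maxima and minima of a sequence."""
    maxima, minima = [], []
    for i in range(1, len(x) - 1):
        if x[i] > x[i - 1] and x[i] > x[i + 1]:
            maxima.append(i)
        elif x[i] < x[i - 1] and x[i] < x[i + 1]:
            minima.append(i)
    return maxima, minima

def interp_envelope(indices, x, n):
    """Piecewise-linear envelope through the extrema, held flat at the ends."""
    if not indices:
        return [0.0] * n
    pts = [(0, x[indices[0]])] + [(i, x[i]) for i in indices] + [(n - 1, x[indices[-1]])]
    env, k = [], 0
    for j in range(n):
        while k + 1 < len(pts) - 1 and pts[k + 1][0] <= j:
            k += 1
        (x0, y0), (x1, y1) = pts[k], pts[k + 1]
        t = 0.0 if x1 == x0 else (j - x0) / (x1 - x0)
        env.append(y0 + t * (y1 - y0))
    return env

def sift_once(x):
    """One sifting pass: subtract the mean of the upper and lower envelopes."""
    n = len(x)
    maxima, minima = local_extrema(x)
    upper = interp_envelope(maxima, x, n)
    lower = interp_envelope(minima, x, n)
    return [xi - (u + l) / 2 for xi, u, l in zip(x, upper, lower)]

# Oscillation riding on a slow trend; one sift pulls out the oscillatory part.
x = [math.sin(0.3 * i) + 0.05 * i for i in range(100)]
imf_candidate = sift_once(x)
```

After the pass, the candidate is far closer to zero-mean than the input, which is exactly the property the Hilbert Transform stage needs from each IMF.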
Furthermore, the HHT has been used to improve signal reception in radio frequency (RF) communications. This NASA technology is also available to the medical community to help diagnose and predict syndromes that affect the brain, such as stroke, dementia, and traumatic brain injury. The HHTDPS is available for non-exclusive and partial field-of-use licenses.
Enhanced Project Management Tool
A searchable skill set module lists the name of each worker employed by the company and/or by one or more companies that contract services to the company, along with the skills each such worker possesses. When the system receives a description of a skill set needed for a project, the skill set module is queried, and the name and relevant skill(s) of each worker who has at least one skill in the received list are displayed in a visually perceptible format.

Provisions are included for customizing and linking, where feasible, a subset of reports and accompanying illustrations for a particular user, and for adding or deleting other reports as needed. This allows a user to focus on the reports of immediate concern and to avoid sorting through reports and related information that is not of concern.

Implementation of the separate-storage option allows most or all users who have review access to a document to write, edit, and otherwise modify the original version, with the modified version stored only in the user's own memory space. Where a user who does not have at least review access to a report explicitly requests that report, the system optionally informs the user of the lack of review access and recommends contacting the system administrator. The system optionally stores preceding versions of a present report for the preceding N periods for historical purposes. The comparative analysis includes an ability to retrieve and reformat numerical data for a contemplated comparison.
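The skill-set query described above can be sketched as a simple matching function. The worker names and skills below are invented placeholders, and a real implementation would sit behind a database rather than an in-memory dictionary.

```python
# Hypothetical sketch of the searchable skill-set module: given the skills a
# project needs, return each worker holding at least one of them, together
# with the matching skills to display.

def find_workers(skill_db, needed_skills):
    """skill_db maps worker name -> set of skills; returns name -> matches."""
    needed = set(needed_skills)
    matches = {}
    for name, skills in skill_db.items():
        relevant = skills & needed
        if relevant:                      # worker has at least one needed skill
            matches[name] = sorted(relevant)
    return matches

staff = {
    "A. Rivera": {"scheduling", "cost analysis"},
    "B. Chen": {"cost analysis", "risk assessment"},
    "C. Okafor": {"welding"},
}
report = find_workers(staff, ["cost analysis", "risk assessment"])
```

Workers with no overlapping skill (here, "C. Okafor") are simply omitted from the displayed result.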
The Estimated Spectrum Adaptive Postfilter (ESAP)
ESAP looks into the decoded JPEG image and determines the location of the image's edges. An edge is a spatial transition, for example from a human face to a landscape background, or the silhouette or cartoon outline of a person or object. In the areas away from the edges, ESAP performs adaptive filtering to remove the pixelation (blocking artifacts) created by highly compressed JPEG images. In general, compared to baseline JPEG images, ESAP improves both the subjective visual quality of highly compressed JPEG images and their objective quality measure, the Peak Signal-to-Noise Ratio (PSNR). It does this without requiring any additional overhead in the data stream.
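The edge-then-filter idea can be illustrated with a toy version: estimate a local gradient at each pixel, and average only where the gradient is small so edges stay sharp. This is not the patented ESAP algorithm (which adapts its filtering to the estimated spectrum of the decoded image), just the general principle it shares with edge-aware smoothing.

```python
# Toy edge-aware smoothing (illustration only, not the ESAP algorithm):
# flat regions are averaged to soften blocking artifacts; pixels near strong
# gradients (edges) are left untouched.

def edge_aware_smooth(img, edge_threshold):
    """img is a 2D list of intensities; returns a smoothed copy."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # crude gradient magnitude from horizontal/vertical neighbours
            grad = abs(img[y][x + 1] - img[y][x - 1]) + abs(img[y + 1][x] - img[y - 1][x])
            if grad < edge_threshold:     # flat region: average the 3x3 block
                out[y][x] = sum(img[y + dy][x + dx]
                                for dy in (-1, 0, 1)
                                for dx in (-1, 0, 1)) / 9.0
    return out

# 5x5 test image: flat dark left half, flat bright right half, one noisy pixel.
img = [[0, 0, 0, 100, 100] for _ in range(5)]
img[2][1] = 9                             # blocking-artifact-like noise
smoothed = edge_aware_smooth(img, 50)
```

Running this, the noisy pixel in the flat region is averaged away while the pixels along the 0/100 boundary keep their values, mirroring how ESAP suppresses pixelation without blurring edges.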
Integrated Genomic and Proteomic Information Security Protocol
NASA GSFC has developed a cybersecurity protocol consisting of message exchanges that use message authentication codes and encryption codes whose keys are derived from the genetic encoding system: proteins and the processes of transcription and translation of DNA and RNA into proteins. These are used in conjunction with the existing principles of a public key infrastructure and traditional encryption and authentication algorithms and processes. This security protocol requires a cryptanalysis infrastructure not available to most attackers. By using the processes of transcription and translation of actual genes (referred to as biogenes) in conjunction with a genomic- and proteomic-based encryption and authentication approach, security is achieved in two simultaneous domains: an attacker has to breach both domains simultaneously for a successful network attack.
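The general shape of the idea, deriving shared key material by "translating" an agreed DNA-like sequence through the genetic code and then feeding it to a conventional message authentication code, can be sketched as follows. The codon subset, the key-derivation step, and the sequence below are invented for illustration; they are not the patented protocol.

```python
# Loosely illustrative sketch of biogene-derived keying (not NASA's protocol):
# both endpoints hold the same "biogene", translate it through the standard
# genetic code, and derive a key for a conventional HMAC.
import hashlib
import hmac

CODON_TABLE = {  # small subset of the standard genetic code
    "ATG": "M", "TGG": "W", "TTT": "F", "GGC": "G",
    "AAA": "K", "GAT": "D", "TGC": "C", "CAT": "H",
}

def translate(dna):
    """Translate a DNA string, codon by codon, into protein letters."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        protein.append(CODON_TABLE.get(dna[i:i + 3], "X"))  # X = unknown codon
    return "".join(protein)

def derive_key(biogene):
    """Hash the translated protein sequence into fixed-size key material."""
    return hashlib.sha256(translate(biogene).encode()).digest()

def authenticate(biogene, message):
    """Conventional HMAC-SHA256 keyed by the biogene-derived material."""
    return hmac.new(derive_key(biogene), message, hashlib.sha256).hexdigest()

# Two endpoints sharing the same biogene produce matching MACs:
gene = "ATGTGGTTTGGCAAAGATTGCCAT"
tag = authenticate(gene, b"telemetry frame")
```

An attacker now needs to break both the genomic mapping that produces the key and the conventional MAC, which is the two-domain property the paragraph above describes.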
First Stage Bootloader
The First Stage Bootloader reads software images from flash memory, utilizing redundant copies of the images, and launches the operating system. This bootloader finds a valid copy of the OS image and the ram filesystem image in flash memory. If none of the copies are fully valid, the bootloader attempts to construct a fully valid image by validating the images in small sections and piecing together validated sections from multiple copies. Periodically, throughout this process, the First Stage Bootloader restarts the watchdog timer.

The First Stage Bootloader reads a boot table from a default location in flash memory. This boot table describes where to find the OS image and its supporting ram filesystem image in flash. It uses header information and a checksum to validate the table. If the table is corrupt, it reads the next copy until it finds a valid table. There can be many copies of the table in flash, and all will be read if necessary.

The First Stage Bootloader reads the ram filesystem image into memory and validates its contents. Similar to the boot table, if one copy of the image is corrupt, it will read the remaining copies until it finds one with a valid header. If it doesn't find a valid copy, it will break the image down into smaller portions. For each section, it checks each copy until it finds a valid copy of the section and copies the valid section into a new copy of the image.

The First Stage Bootloader reads the OS image and interprets it. If anything in the image is corrupt, it reads the remaining copies until it finds a fully valid copy. If no copy is fully valid, it will use individual valid records from multiple copies to create a fully valid image.
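The section-stitching step above can be modeled in a few lines: divide each redundant copy into fixed-size sections, and for each section take the first copy whose checksum matches. This is a Python model for illustration (a real first-stage bootloader would be bare-metal code reading flash), and the checksum choice here is an assumption, not the flight implementation.

```python
# Illustration of piecing a valid image together from redundant flash copies.
import hashlib

def checksum(data):
    """Per-section digest; the real bootloader's checksum scheme may differ."""
    return hashlib.md5(data).digest()

def reconstruct(copies, section_checksums, section_size):
    """copies: redundant byte-string images; section_checksums: expected digests."""
    image = bytearray()
    for i, expected in enumerate(section_checksums):
        lo, hi = i * section_size, (i + 1) * section_size
        for copy in copies:                      # try each redundant copy
            section = copy[lo:hi]
            if checksum(section) == expected:    # first valid section wins
                image.extend(section)
                break
        else:
            raise ValueError(f"section {i} unrecoverable in all copies")
    return bytes(image)
```

Note that as long as every section is intact in at least one copy, a fully valid image is recovered even when no single copy is valid end to end, which is exactly the failure mode the bootloader is designed to survive.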
Reconfigurable Image Generator and Database Generation System
The Reconfigurable Image Generator (RIG) consists of software, a hardware configuration, and a Synthetic Environment Database Generation System (RIG-DBGS). This innovative Image Generator (IG) uses Commercial-Off-The-Shelf (COTS) technologies and is capable of supporting virtually any display system. The DBGS software leverages high-fidelity real-world data, including aerial imagery, elevation datasets, and vector data. Through a combination of COTS tools and in-house applications, the semi-automated system can process large amounts of data in days rather than the weeks or months required for manual database generation. A major benefit of the RIG technology is that existing simulation users can leverage their investment in existing real-time 3D databases (such as OpenFlight) as part of the RIG system.
Intelligent Conversational Research Assistant
Previous virtual digital assistants (e.g., Siri®, Google Assistant®, Cortana®, Alexa®) allow users to speak requests to computers or mobile phones, which then perform speech-to-text recognition and execute web searches based on the content of the user's request. However, such systems cannot provide answers that are geospatially and space-time aware, for example, what the weather was like in San Francisco three days ago. In addition, typical virtual assistants only allow community users to indirectly influence system behavior through their queries.

MATA provides a conversational assistant and associated computing to turn 40 years and hundreds of terabytes of NASA Earth science data into usable knowledge. The system engages with users in an integrated, conversational manner using natural language dialogue, and invokes external web services, when appropriate, to obtain information and perform various actions on a variety of satellite and geospatial data, providing spatio-temporally aware answers. MATA is conversational computing, not just a conversational assistant. Broadly, the technology takes a query and translates it into an intent; the intent invokes the right capability, which in turn invokes the right APIs for computations that take milliseconds. MATA does not simply retrieve an answer from a database; it intelligently answers a user's question within its specific context.
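The query-to-intent-to-capability pipeline can be sketched as below. The pattern, intent name, slot names, and handler are invented for illustration; the real MATA system uses natural-language understanding and live geospatial services, not a regular expression and a stub.

```python
# Hypothetical sketch of a query -> intent -> capability dispatch pipeline.
import re

INTENT_PATTERNS = [
    (re.compile(r"weather .* in (?P<place>[\w ]+?) (?P<days>\d+) days ago"),
     "historical_weather"),
]

def recognize_intent(query):
    """Map a natural-language query to an intent plus extracted slots."""
    for pattern, intent in INTENT_PATTERNS:
        m = pattern.search(query.lower())
        if m:
            return intent, m.groupdict()
    return "unknown", {}

def dispatch(intent, slots, handlers):
    """Invoke the capability registered for the intent with the slots."""
    return handlers.get(intent, lambda **kw: "Sorry, I can't answer that.")(**slots)

# Stand-in capability that would normally call a geospatial data API:
handlers = {"historical_weather":
            lambda place, days: f"Looking up weather in {place}, {days} days back"}

intent, slots = recognize_intent("What was the weather like in san francisco 3 days ago?")
answer = dispatch(intent, slots, handlers)
```

The key design point the paragraph makes is that the handler performs a computation over data (here stubbed out) rather than looking up a canned answer.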
Radiation Hardened 10BASE-T Ethernet Physical Interface
Currently there is no radiation-hardened Ethernet interface device or circuit available commercially. In this Ethernet solution, the portion of the PHY implemented in the FPGA is responsible for meeting the IEEE 802.3 protocol, decoding received packets and link pulses, and encoding data packets for transmission. The decoded payload data is sent to a user interface internal to the FPGA, which sends data for transmission back to the FPGA PHY. The transmit portion is composed of two AD844 op amps from Analog Devices with appropriate filtering. The receive portion is composed of a transformer, an Aeroflex Low-Voltage Differential Multi-drop device, and appropriate filtering.
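The encode/decode duty of the FPGA portion of the PHY centers on 10BASE-T's Manchester line code: per IEEE 802.3, a logic '1' is a low-to-high transition at mid-bit and a logic '0' a high-to-low transition. A minimal software model of that logic (for illustration; the actual device implements this in FPGA fabric, not Python):

```python
# Software model of 10BASE-T Manchester encoding/decoding (IEEE 802.3:
# '1' = low-to-high mid-bit transition, '0' = high-to-low).

def manchester_encode(bits):
    """Each data bit becomes two half-bit symbols with a guaranteed transition."""
    out = []
    for b in bits:
        out += [0, 1] if b else [1, 0]
    return out

def manchester_decode(symbols):
    """Recover data bits from half-bit symbol pairs; reject invalid pairs."""
    if len(symbols) % 2:
        raise ValueError("odd number of half-bit symbols")
    bits = []
    for i in range(0, len(symbols), 2):
        pair = (symbols[i], symbols[i + 1])
        if pair == (0, 1):
            bits.append(1)
        elif pair == (1, 0):
            bits.append(0)
        else:
            raise ValueError(f"missing mid-bit transition at symbol {i}")
    return bits
```

The guaranteed mid-bit transition is what lets the receiver recover the clock from the line signal, which is why the scheme suits a transformer-coupled link like the one described above.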