
PATENT PORTFOLIO
Information Technology and Software
NASA develops information technology and software to support a wide range of activities, including mission planning and operations, data analysis and visualization, and communication and collaboration. These technologies also play a key role in the development of advanced scientific instruments and spacecraft, as well as in the management of NASA's complex organizational and technical systems. By leveraging NASA's advances in IT and software, your business can gain a competitive edge.
The Yellow Sea
MERRA/AS and Climate Analytics-as-a-Service (CAaaS)
NASA Goddard Space Flight Center now offers a new capability for meeting the Big Data challenge: MERRA Analytic Services (MERRA/AS). MERRA/AS combines the power of high-performance computing, storage-side analytics, and web APIs to dramatically improve customer access to MERRA data. It represents NASA's first effort to provide Climate Analytics-as-a-Service.

Retrospective analyses (or reanalyses) such as MERRA have long been important to scientists doing climate change research. MERRA is produced by NASA's Global Modeling and Assimilation Office (GMAO), a component of the Earth Sciences Division in Goddard's Sciences and Exploration Directorate. GMAO's research and development activities aim to maximize the impact of satellite observations in climate, weather, atmospheric, and land prediction using global models and data assimilation. These products are becoming increasingly important to application areas beyond traditional climate science.

MERRA/AS provides a new cloud-based approach to storing and accessing the MERRA dataset. By combining high-performance computing, MapReduce analytics, and NASA's Climate Data Services API (CDS API), MERRA/AS moves much of the work traditionally done on the client side to the server side, close to the data and close to large compute power. This reduces the need for large data transfers and provides a platform to support complex server-side data analyses: it enables Climate Analytics-as-a-Service.

MERRA/AS currently implements a set of commonly used operations (such as avg, min, and max) over all the MERRA variables. Of particular interest to many applications is a core collection of about two dozen MERRA land variables (such as humidity, precipitation, evaporation, and temperature). Using the RESTful services of the Climate Data Services API, it is now easy to extract basic historical climatology information about places and time spans of interest anywhere in the world. Because the CDS API is extensible, the community can participate in MERRA/AS's development by contributing new and more complex analytics to the MERRA/AS service. MERRA/AS demonstrates the power of CAaaS and advances NASA's ability to connect data, science, computational resources, and expertise to the many customers and applications it serves.
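To illustrate the style of server-side request this enables, the Python sketch below asks a climate analytics service for an averaged variable over a region and time span. The endpoint URL, parameter names, and variable name are illustrative assumptions, not the actual Climate Data Services API specification.

# Minimal sketch of a RESTful climatology request; names and endpoint are assumptions.
import requests

CDS_ENDPOINT = "https://example.nasa.gov/cds/mas"  # hypothetical service URL

def get_climatology(variable, operation, bbox, start, end):
    """Request a server-side reduction (avg/min/max) over a region and time span."""
    params = {
        "operation": operation,               # e.g. "avg", "min", "max"
        "variable": variable,                 # e.g. a MERRA land variable such as "T2M"
        "bbox": ",".join(map(str, bbox)),     # west,south,east,north in degrees
        "start": start,                       # e.g. "1980-01"
        "end": end,                           # e.g. "2010-12"
    }
    resp = requests.get(CDS_ENDPOINT, params=params, timeout=60)
    resp.raise_for_status()
    return resp.json()

# Example: average near-surface temperature climatology over a region of interest.
# result = get_climatology("T2M", "avg", (-10.0, 35.0, 5.0, 45.0), "1980-01", "2010-12")

The point of the pattern is that only the small reduced result crosses the network; the full MERRA subsetting and averaging happen next to the data.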
Tropical Cyclone Ita Off-Shore Queensland, Australia; Credit: NASA/NOAA via NOAA Environmental Visualization Laboratory
The Hilbert-Huang Transform Real-Time Data Processing System
The present innovation is an engineering tool known as the HHT Data Processing System (HHTDPS). The HHTDPS allows applying the Transform, or 'T,' to a data vector in a fashion similar to the heritage FFT. It is a generic, low-cost, high-performance personal computer (PC)-based system that implements the HHT computational algorithms in a user-friendly, file-driven environment.

Unlike other signal processing techniques such as the Fast Fourier Transform (FFT), which assume signal linearity and stationarity, the Hilbert-Huang Transform (HHT) uses relationships between arbitrary signals and their local extrema to find the signal's instantaneous spectral representation. Using Empirical Mode Decomposition (EMD) followed by the Hilbert Transform of the decomposition data, the HHT allows spectrum analysis of nonlinear and nonstationary data through an a posteriori, data-driven process based on the EMD algorithm. The result is an unconstrained decomposition of a real-valued source data vector into a finite set of Intrinsic Mode Functions (IMFs) that can be further analyzed for spectrum interpretation by the classical Hilbert Transform.

The HHTDPS has a large variety of applications and has been used in several NASA science missions. NASA cosmology science missions, such as the Joint Dark Energy Mission (JDEM/WFIRST), carry instruments with multiple focal planes populated with many large sensor detector arrays whose readout electronics must perform at extremely low noise levels. A new methodology and implementation platform using the HHTDPS for readout noise reduction in large IR/CMOS hybrid sensors was developed at NASA Goddard Space Flight Center (GSFC). Scientists at NASA GSFC have also used the algorithm to produce the first known Hilbert-Transform-based wide-field broadband data cube constructed from actual interferometric data. Furthermore, the HHT has been used to improve signal reception in radio frequency (RF) communications. This NASA technology is currently available to the medical community to help in the diagnosis and prediction of syndromes that affect the brain, such as stroke, dementia, and traumatic brain injury. The HHTDPS is available for non-exclusive and partial field-of-use licenses.
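The Python sketch below shows the general HHT pipeline described above in toy form: a sifting loop that extracts IMFs, followed by a Hilbert transform of each IMF to obtain instantaneous amplitude and frequency. It is an illustrative simplification under assumed stopping criteria, not the HHTDPS implementation, which handles convergence tests and end effects far more carefully.

# Toy EMD + Hilbert spectrum sketch; sift counts and stopping rules are assumptions.
import numpy as np
from scipy.signal import hilbert, argrelextrema
from scipy.interpolate import CubicSpline

def sift(x, t, n_sift=10):
    """Extract one candidate IMF by repeatedly removing the mean of the envelopes."""
    h = x.copy()
    for _ in range(n_sift):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if len(maxima) < 3 or len(minima) < 3:
            break
        upper = CubicSpline(t[maxima], h[maxima])(t)   # upper envelope
        lower = CubicSpline(t[minima], h[minima])(t)   # lower envelope
        h = h - (upper + lower) / 2.0                  # subtract local mean
    return h

def emd(x, t, max_imfs=6):
    """Decompose x into a finite set of IMFs plus a residual trend."""
    imfs, residual = [], x.copy()
    for _ in range(max_imfs):
        imf = sift(residual, t)
        imfs.append(imf)
        residual = residual - imf
        if len(argrelextrema(residual, np.greater)[0]) < 3:
            break
    return imfs, residual

def hilbert_spectrum(imf, dt):
    """Instantaneous amplitude and frequency of one IMF via the Hilbert transform."""
    analytic = hilbert(imf)
    amplitude = np.abs(analytic)
    phase = np.unwrap(np.angle(analytic))
    frequency = np.gradient(phase, dt) / (2.0 * np.pi)
    return amplitude, frequency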
Enhanced Project Management Tool
A searchable skill set module lists the name of each worker employed by the company and/or by one or more companies that contract services for the company, along with the skills each such worker possesses. When the system receives a description of a skill set needed for a project, the skill set module is queried, and the name and relevant skill(s) of each worker who has at least one skill in the requested skill set are displayed in a visually perceptible format (see the sketch below).

Provisions are included for customizing and linking, where feasible, a subset of reports and accompanying illustrations for a particular user, and for adding or deleting other reports as needed. This allows a user to focus on the reports of immediate concern and to avoid sorting through reports and related information that is not of concern. A separate-storage option allows most or all users who have review access to a document to write, edit, and otherwise modify the original version, with the modified version stored only in the user's own memory space. Where a user who does not have at least review access to a report explicitly requests that report, the system optionally informs the user of the lack of review access and recommends contacting the system administrator. The system optionally stores preceding versions of a present report for the preceding N periods for historical purposes, and its comparative analysis can retrieve and reformat numerical data for a contemplated comparison.
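The Python sketch below illustrates the skill set query just described: given a requested skill set, return each worker holding at least one of the requested skills together with the matching skills. The in-memory table, names, and field layout are illustrative assumptions; the patented system would back this with its own database and display layer.

# Illustrative skill set query; data and field names are assumptions.
WORKERS = [
    {"name": "A. Rivera", "company": "Prime", "skills": {"systems engineering", "risk analysis"}},
    {"name": "B. Chen", "company": "Contractor X", "skills": {"thermal analysis", "risk analysis"}},
]

def find_workers(required_skills):
    """Return each worker with at least one requested skill, plus the relevant skills."""
    required = set(required_skills)
    matches = []
    for worker in WORKERS:
        relevant = worker["skills"] & required
        if relevant:
            matches.append({"name": worker["name"], "relevant_skills": sorted(relevant)})
    return matches

# Example: staff a project that needs risk analysis.
# print(find_workers(["risk analysis"]))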
India-Pakistan Border at Night
Integrated Genomic and Proteomic Information Security Protocol
NASA GSFC has developed a cybersecurity protocol consisting of message exchanges that use message authentication codes and encryption keys derived from the genetic encoding system: genes, proteins, and the processes of transcription and translation of DNA and RNA into proteins are used for key generation. These are used in conjunction with the existing principles of a public key infrastructure and with traditional encryption and authentication algorithms and processes. This security protocol requires a cryptanalysis infrastructure not available to most attackers. By using the processes of transcription and translation of actual genes (referred to as biogenes) in conjunction with a genomic- and proteomic-based encryption and authentication approach, security is achieved in two simultaneous domains; an attacker has to breach both domains simultaneously for a successful network attack.
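As a hedged illustration of the idea only, the Python sketch below derives a message authentication key from the protein product of a shared "biogene" sequence via transcription and translation. The sample sequence, the codon-table excerpt, and the use of SHA-256/HMAC are assumptions chosen for clarity; they are not the protocol's actual key-derivation construction.

# Illustrative biogene-based key derivation; all specifics are assumptions.
import hashlib
import hmac

# Small excerpt of the standard RNA codon table (codon -> amino acid letter).
CODON_TABLE = {
    "AUG": "M", "UUU": "F", "UUC": "F", "GGU": "G", "GGC": "G",
    "GCU": "A", "GCC": "A", "UAA": "*", "UAG": "*", "UGA": "*",
}

def transcribe(dna):
    """Transcription: DNA coding strand -> messenger RNA (T -> U)."""
    return dna.upper().replace("T", "U")

def translate(rna):
    """Translation: read codons until a stop codon, yielding a protein string."""
    protein = []
    for i in range(0, len(rna) - 2, 3):
        aa = CODON_TABLE.get(rna[i:i + 3], "X")
        if aa == "*":
            break
        protein.append(aa)
    return "".join(protein)

def derive_mac_key(biogene_dna):
    """Derive a symmetric key from the protein product of a shared biogene."""
    protein = translate(transcribe(biogene_dna))
    return hashlib.sha256(protein.encode()).digest()

def authenticate(message_bytes, biogene_dna):
    """Attach a message authentication code computed with the derived key."""
    key = derive_mac_key(biogene_dna)
    return hmac.new(key, message_bytes, hashlib.sha256).hexdigest()

# Example: tag = authenticate(b"telemetry frame", "ATGTTTGGTGCC")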
The heart of the NASA Center for Climate Simulation (NCCS) is the Discover supercomputer. In 2009, NCCS added more than 8,000 computer processors to Discover, for a total of nearly 15,000 processors.
First Stage Bootloader
The First Stage Bootloader reads software images from flash memory, utilizing redundant copies of the images, and launches the operating system. The bootloader finds a valid copy of the OS image and the RAM filesystem image in flash memory. If none of the copies are fully valid, the bootloader attempts to construct a fully valid image by validating the images in small sections and piecing together validated sections from multiple copies. Periodically throughout this process, the First Stage Bootloader restarts the watchdog timer.

The First Stage Bootloader reads a boot table from a default location in flash memory. This boot table describes where to find the OS image and its supporting RAM filesystem image in flash. It uses header information and a checksum to validate the table. If the table is corrupt, it reads the next copy until it finds a valid table. There can be many copies of the table in flash, and all will be read if necessary.

The First Stage Bootloader reads the RAM filesystem image into memory and validates its contents. Similar to the boot table, if one copy of the image is corrupt, it reads the remaining copies until it finds one with a valid header. If it does not find a valid copy, it breaks the image down into smaller sections; for each section, it checks each copy until it finds a valid copy of that section and copies it into a new copy of the image.

The First Stage Bootloader then reads the OS image and interprets it. If anything in the image is corrupt, it reads the remaining copies until it finds a fully valid copy. If no copy is fully valid, it uses individual valid records from multiple copies to create a fully valid image.
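The sketch below (written in Python for readability rather than the flight code's language) shows the recovery strategy described above: try each whole redundant copy first, then rebuild the image section by section from whichever copies validate. The section size, CRC32 checksum, and in-memory layout are assumptions.

# Illustrative redundant-copy image recovery; section size and checksum are assumptions.
import zlib

SECTION_SIZE = 4096  # assumed granularity for piecewise validation

def section_valid(data, expected_crc):
    return zlib.crc32(data) == expected_crc

def load_image(copies, section_crcs):
    """copies: redundant byte-string copies of one image read from flash.
    section_crcs: expected CRC32 of each SECTION_SIZE section of the image."""
    # First pass: look for a copy that is fully valid.
    for copy in copies:
        sections = [copy[i:i + SECTION_SIZE] for i in range(0, len(copy), SECTION_SIZE)]
        if all(section_valid(s, c) for s, c in zip(sections, section_crcs)):
            return copy

    # Second pass: piece together a valid image from valid sections of any copy.
    rebuilt = []
    for idx, crc in enumerate(section_crcs):
        start = idx * SECTION_SIZE
        for copy in copies:
            section = copy[start:start + SECTION_SIZE]
            if section_valid(section, crc):
                rebuilt.append(section)
                break
        else:
            raise IOError("no valid copy of section %d found" % idx)
    return b"".join(rebuilt)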
Hubble Spies Charming Spiral Galaxy Bursting with Stars
Radiation Hardened 10BASE-T Ethernet Physical Interface
Currently, no radiation-hardened Ethernet interface device or circuit is available commercially. In this Ethernet solution, the portion of the PHY implemented in the FPGA is responsible for meeting the IEEE 802.3 protocol, decoding received packets and link pulses, and encoding transmitted data packets. The decoded payload data is sent to a user interface internal to the FPGA, which sends data for transmission back to the FPGA PHY. The transmit portion is composed of two AD844 op amps from Analog Devices with appropriate filtering. The receive portion is composed of a transformer, an Aeroflex Low-Voltage Differential Multi-drop device, and appropriate filtering.
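Since 10BASE-T signaling uses Manchester encoding, the sketch below illustrates the kind of encode/decode step the FPGA portion of the PHY performs. It is a behavioral illustration in Python only; the actual PHY logic, including link pulses, preambles, and full IEEE 802.3 framing, is implemented in FPGA hardware.

# Behavioral Manchester encode/decode sketch for 10BASE-T-style signaling.
def manchester_encode(bits):
    """IEEE 802.3 convention: '0' is a high-to-low transition, '1' is low-to-high."""
    halves = []
    for bit in bits:
        halves.extend((0, 1) if bit else (1, 0))
    return halves

def manchester_decode(halves):
    """Recover bits from half-bit pairs; every pair must contain a mid-bit transition."""
    bits = []
    for first, second in zip(halves[0::2], halves[1::2]):
        if first == second:
            raise ValueError("missing mid-bit transition: corrupted symbol")
        bits.append(1 if (first, second) == (0, 1) else 0)
    return bits

# Round trip example:
# assert manchester_decode(manchester_encode([1, 0, 1, 1])) == [1, 0, 1, 1]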
Big Data Analysis
Context Based Configuration Management System
Context Based Configuration Management (CBCM) is a hybrid tool suite that directly supports a dynamic, distributed strategic planning and decision-making environment. The CBCM system marries Decision Map technology with a Commercial Off-the-Shelf (COTS) configuration management workflow (Xerox DocuShare) and embedded component models (event models, configuration item models, and feedback models), all on top of a web-based online collaboration technology (e.g., the NASA/Xerox Netmark middleware engine). CBCM drives an enterprise configuration management system with built-in analysis functions that is tightly interoperable with other enterprise management systems (through middleware connections); it delivers integrated reports, enables enterprise-wide input on decisions, actions, and events, and presents context-based, query-driven views of configuration management information.

The theory of operation flows from the most senior level of decision making, which creates a master Configuration Decision Map. This map tracks events, configuration objects, and contexts. Additional Configuration Decision Maps can be created independently and/or from the Master Configuration Decision Map by successive organizations and/or individuals across the enterprise. The integrated icon-objects have intelligent triggers embedded within them, configurable by users to provide automatic analysis, forecasts, and reports. All information is stored in an object-relational database that provides robust query and reporting tools to help analyze and support past and current decisions, as well as track the enterprise baseline and potential future vectors.
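The Python sketch below suggests one way the Decision Map structure described above could be modeled: configuration objects wrapped in icon-objects that carry user-configurable triggers, organized into maps that can spawn subordinate maps and roll up reports. The class and field names are illustrative assumptions, not the CBCM implementation.

# Illustrative Decision Map data model; names and structure are assumptions.
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional

@dataclass
class ConfigurationItem:
    name: str
    version: str
    attributes: Dict[str, str] = field(default_factory=dict)

@dataclass
class IconObject:
    item: ConfigurationItem
    # Each trigger inspects the item and returns a report line when it fires, else None.
    triggers: List[Callable[[ConfigurationItem], Optional[str]]] = field(default_factory=list)

    def run_triggers(self):
        return [msg for trig in self.triggers if (msg := trig(self.item))]

@dataclass
class DecisionMap:
    title: str
    objects: List[IconObject] = field(default_factory=list)
    children: List["DecisionMap"] = field(default_factory=list)  # maps spawned by this one

    def report(self):
        """Roll up trigger output from this map and every subordinate map."""
        lines = [f"[{self.title}]"]
        for obj in self.objects:
            lines.extend(obj.run_triggers())
        for child in self.children:
            lines.extend(child.report())
        return lines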
Heaven's Carousel premiere; Credit: NASA, ESA, and Pam Jeffries (STScI)
Otoacoustic Protection In Biologically-Inspired Systems
This innovation is an autonomic method capable of transmitting a neutralizing data signal to counteract a potentially harmful signal. This otoacoustic component of an autonomic unit can render a potentially harmful incoming signal inert. For self-managing systems, the technology can offer a self-defense capability that brings new levels of automation and dependability to systems.
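As a minimal sketch of the neutralizing-signal idea only, the Python fragment below assumes the potentially harmful signal can be sampled and counteracted with an equal and opposite (anti-phase) waveform, in the spirit of active cancellation; the detection threshold and rule are assumptions, not the patented method.

# Illustrative anti-phase neutralization; the threshold and detection rule are assumptions.
import numpy as np

HARM_THRESHOLD = 0.8  # assumed amplitude above which an incoming signal is flagged

def neutralizing_signal(incoming):
    """If the incoming signal looks harmful, emit its inverse so the sum is near zero."""
    incoming = np.asarray(incoming, dtype=float)
    if np.max(np.abs(incoming)) < HARM_THRESHOLD:
        return np.zeros_like(incoming)     # benign: no countermeasure needed
    return -incoming                       # anti-phase counter-signal

# Residual after the countermeasure is applied:
# t = np.linspace(0, 1, 1000)
# spike = 2.0 * np.sin(2 * np.pi * 50 * t)
# residual = spike + neutralizing_signal(spike)   # ~0 everywhere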
Hierarchical Image Segmentation (HSEG)
Currently, HSEG software is being used by Bartron Medical Imaging as a diagnostic tool to enhance medical imagery. Bartron Medical Imaging licensed the HSEG technology from NASA Goddard, adding color enhancement and developing MED-SEG, an FDA-approved tool to help specialists interpret medical images. HSEG is available for licensing outside of the medical field (specifically for soft-tissue analysis).