New Memory Storage Scheme


Artificial intelligence is at the forefront of every field today, from space and defense to medicine, transportation, and marketing. Yet data centers are enormous and consume vast amounts of energy, and the results are problematic: systems that can be easily fooled by a fifth grader. True AI requires vastly more data and far lower latency, by many orders of magnitude. Even with the still-unrealized addition of quantum computing, the industry is not even in the realm of actual intelligence.

We observe that nature stores data in very small, very energy-efficient structures. The data needed to create an animal is stored in a few chromosomes. Those chromosomes also contain a control program that tells the growing cells when and what to change into and grow. Operational data is stored there as well: a mother cat knows how to hunt and care for her young even though she has never been around other cats. Even if we count every molecule, every atom in every molecule, and every sub-atomic aspect of each atom in a collection of chromosomes, there are not nearly enough memory cells to store that vast amount of data.

Fundamentally, to understand what we see in nature, we must realize that memory is re-stored (overwritten without destruction of the previous data). This is counter to the way memory systems are designed today.
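The entry does not specify a mechanism for overwriting without destruction. One standard way to illustrate the idea is with reversible write operations: if each write is applied as an XOR against the cell's current contents and the applied values are recorded in a side table, every earlier state of the cell can be reconstructed by replaying the log. This is a toy model only, not the author's method; the class and its names are hypothetical, and the side table here stands in for whatever operator structure the actual scheme would use.

```python
class ReversibleCell:
    """Toy model of a single memory cell overwritten by XOR.

    Because XOR is its own inverse, no write destroys information:
    any earlier state can be recovered by replaying the write log.
    Illustrative sketch only; all names are hypothetical.
    """

    def __init__(self) -> None:
        self.value = 0   # current contents of the cell
        self.log = []    # record of applied writes (the "operator" table)

    def write(self, data: int) -> None:
        # Apply the write reversibly instead of replacing the contents.
        self.log.append(data)
        self.value ^= data

    def read_at(self, step: int) -> int:
        # Reconstruct the cell's contents after `step` writes
        # by replaying the log from the initial state.
        v = 0
        for d in self.log[:step]:
            v ^= d
        return v


cell = ReversibleCell()
for word in (0b1010, 0b0110, 0b1111):
    cell.write(word)

print(bin(cell.value))       # current contents: 0b11
print(bin(cell.read_at(1)))  # first value ever written: 0b1010
```

The obvious caveat, which the toy model makes visible, is that the log itself occupies storage; the entry's claim is that a FAT-like operator table can play this role with much less overhead than conventional one-value-per-cell storage.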

Understanding this, we developed a method that allows a single data bit to be entered into a memory cell and changed again and again, in the same memory cell, without losing any of the previous entries. The problem with current computer memory systems is that a single memory cell can only be used once: it is thought that overwriting the cell destroys the previous information, but that is not true. Current designs need vast numbers of memory locations to store vast amounts of data, which increases silicon area, power, and heat, and greatly increases access time. If memory cells can be re-used many times without destroying the previous data, then silicon area, power, heat, and latency are all greatly reduced.

Our method allows this by operating the FAT against the cells as a truth table, rather than simply as a register. Using the FAT as an operator on the existing data multiplies the available memory space by a factorial function. We have successfully programmed a spreadsheet to show that this works.

A proper implementation of this paradigm would require a new operating system, and potentially new silicon configured into a crystalline-type format: it would seem as though every memory location were networked with every other memory location in a fabric pattern. The system does require some (stepped, decreasing) amount of memory to remain fixed as the factorial process progresses, which means the number of bytes that can be stored (re-stored/overwritten) is limited by a factorial function. The theoretical limit for a memory array of 100 Mbytes is on the order of 99M factorial (minus the fixed overhead of the FAT/operator truth table).
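The spreadsheet itself is not published, but factorial scaling of this kind does appear in a well-known construction: the ordering of n distinguishable memory locations can itself carry data, since there are n! orderings, encoding any integer from 0 to n! - 1 via the factorial number system (Lehmer code). The sketch below illustrates that standard construction under the assumption that it is analogous to what the entry describes; it is not the author's FAT-based method, and the function names are hypothetical.

```python
from math import factorial


def encode_permutation(value: int, n: int) -> list:
    """Encode an integer in [0, n!) as an ordering of n slot indices,
    using the factorial number system (Lehmer code)."""
    assert 0 <= value < factorial(n)
    slots = list(range(n))
    perm = []
    for i in range(n, 0, -1):
        # Peel off the most significant factoradic digit.
        idx, value = divmod(value, factorial(i - 1))
        perm.append(slots.pop(idx))
    return perm


def decode_permutation(perm: list) -> int:
    """Invert encode_permutation: recover the integer from the ordering."""
    slots = sorted(perm)
    value = 0
    for i, p in enumerate(perm):
        idx = slots.index(p)
        value += idx * factorial(len(perm) - 1 - i)
        slots.pop(idx)
    return value


# Ten locations give log2(10!) ~ 21.8 bits of capacity in the ordering alone,
# on top of whatever the locations themselves hold.
message = 123456
perm = encode_permutation(message, 10)
assert decode_permutation(perm) == message
```

Note that this construction grows the *usable states* factorially while the bit capacity grows only as log2(n!), and reading or rewriting the ordering requires access to all n locations at once, consistent with the fabric-like interconnection the entry describes.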
www.industrico.net


  • ABOUT THE ENTRANT

  • Name:
    Richard Helms
  • Type of entry:
    individual
  • Profession:
    Engineer/Designer
  • Number of times previously entering contest:
    2
  • Richard is inspired by:
I've worked with computers since the days of punched data cards and am frustrated at the slow rate of progress. The future of computer development, and of what computers can do, has been disappointing. I use computational analysis and have had models run on the Sequoia computer at Lawrence Livermore, with disappointing results due to the available memory.
  • Software used for this entry:
    Microsoft Excel
  • Patent status:
    none