06-06-2015, 04:44 PM

(06-06-2015, 11:08 AM)Drashner1 Wrote: To be honest, I'm not entirely clear on what you're describing here.

I'm somewhat familiar with the concept of reversible computing, but not with 'bit clearing' or why you would need to perform irreversible computing if you were doing reversible. Can you provide some references that describe this in more detail and/or an example of what this might look like with actual hardware, please? I think that would help everyone get a better grasp of what is being described/proposed so that we could consider it properly and offer a more cogent response.

Thanks in advance

Todd

http://en.wikipedia.org/wiki/Reversible_computing is spot-on. I'm talking about physical reversible computing. The laws of nature are reversible. Many processes can be reversed, like a ball bouncing off the floor, and other processes return to their original state on their own, like a planet spinning or an ion looping in a magnetic field. You try to minimize friction and uncertainty. Reversible physical computing is just using reversible physical processes to compute some function. Like atoms bouncing around in a gas, a completely reversible computer could in principle run forever with no additional energy input, but your uncertainty about its state tends to compound over time.

Physically reversible computing can only perform logically reversible operations, also known as permutations, bijections, or one-to-one mappings. Logically reversible computations are easy to reason about, and they always have an equal number of input and output bits. For example, given the variables (x,y), you can reversibly do (x,y) -> (x+y, y). It's reversible because you could then do (x+y, y) -> (x+y-y, y) = (x,y), so you get (x,y) back. And (x) -> (-x) is reversible because applying it twice gets you (x) back: (x) -> (-x) -> (--x) = (x). You could even swap values given only reversible operations for addition, subtraction, and negation: (x,y) -> (x+y, y) -> (x+y, x+y-y) = (x+y, x) -> (x+y-x, x) = (y,x).
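To make that concrete, here's a small Python sketch of the swap-via-addition chain above. The function names are mine, just for illustration; each step has an inverse, so the whole pipeline is reversible.

```python
def add_b_to_a(a, b):
    # (a, b) -> (a+b, b); undone by sub_b_from_a
    return (a + b, b)

def sub_b_from_a(a, b):
    # (a, b) -> (a-b, b); undone by add_b_to_a
    return (a - b, b)

def set_b_to_a_minus_b(a, b):
    # (a, b) -> (a, a-b); its own inverse, since a-(a-b) = b
    return (a, a - b)

def swap(a, b):
    # (x,y) -> (x+y, y) -> (x+y, x) -> (y, x), exactly as in the text
    a, b = add_b_to_a(a, b)
    a, b = set_b_to_a_minus_b(a, b)
    a, b = sub_b_from_a(a, b)
    return (a, b)

# Round trip: the inverse recovers the original pair.
assert sub_b_from_a(*add_b_to_a(3, 5)) == (3, 5)
print(swap(3, 5))  # (5, 3)
```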

Most computations we do today are irreversible. For example, (x,y) -> (x+y) isn't reversible because it doesn't let you recover both x and y. You can turn any deterministic irreversible computation into a logically reversible one by saving all the inputs: (x,y,z,0) -> (x,y,z,f(x,y,z)). If you view f(x,y,z) as 0 xor f(x,y,z), then that operation is its own reverse: (x,y,z,0) -> (x,y,z,0 xor f(x,y,z)) -> (x,y,z,0 xor f(x,y,z) xor f(x,y,z)) = (x,y,z,0). Often reversible computations need temporary working memory, which starts at zero and ends at zero but holds something else in the middle.
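The XOR trick above can be sketched in a few lines of Python. The function f here is an arbitrary stand-in, not anything specific from the post; the point is that XOR-ing its result into a zeroed register is a self-inverse step.

```python
def f(x, y, z):
    # Any deterministic (and by itself irreversible) function of the inputs.
    return (x + y) ^ z

def forward(x, y, z, out):
    # (x, y, z, out) -> (x, y, z, out ^ f(x,y,z)); self-inverse,
    # because XOR-ing the same value twice cancels out.
    return (x, y, z, out ^ f(x, y, z))

state = forward(2, 7, 4, 0)   # inputs preserved, result lands in the 0 slot
print(state)                  # (2, 7, 4, 13) since (2+7)^4 = 13
back = forward(*state)        # applying it again restores the zero
print(back)                   # (2, 7, 4, 0)
```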

Quantum computing is a strange thing I don't understand well, but it's related to reversible computing in that all quantum computations are required to be reversible. You start with some uncertainty about the initial state, the computation goes unobserved while the wave function goes through all possible paths, and interference among those paths makes some outputs more likely than others. Reversible computing doesn't have to be quantum computing, though.

Being a child of my age, I'm thinking of a physically reversible computer as a non-quantum digital computer: it knows how to do a set of logically reversible operations, its input is the program plus the input data, and those inputs are completely known beforehand. At the end of the computation you add the output to some zeros, then reverse everything except that final add, leaving you with the input plus the output instead of the input plus zeros. Then you copy the output somewhere useful, replace it in the computer with zeros, and swap the input program and data for the next program and data to work on.
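That whole cycle (compute, copy the output out, then run everything in reverse to restore the zeros) can be sketched in Python. This is a toy model only, assuming the XOR-into-zeros convention from earlier; f is an illustrative function, not anything from the post.

```python
def f(x, y):
    # Illustrative function to compute; irreversible on its own.
    return x * y + 1

def xor_f_into(x, y, out):
    # Reversible step: (x, y, out) -> (x, y, out ^ f(x,y)); self-inverse.
    return (x, y, out ^ f(x, y))

def run_job(x, y):
    # Forward pass: scratch register starts at 0 and ends holding f(x, y).
    x, y, scratch = xor_f_into(x, y, 0)
    # Copy the answer out by XOR-ing into a fresh zeroed register.
    result = 0 ^ scratch
    # Uncompute: rerun the self-inverse step, clearing scratch back to 0,
    # so the machine ends holding the input plus zeros again.
    x, y, scratch = xor_f_into(x, y, scratch)
    assert scratch == 0
    return result

print(run_job(6, 7))  # 43
```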

The only thing that has to cost energy is taking bits you no longer want and setting them to zero regardless of their original values (http://en.wikipedia.org/wiki/Landauer%27s_principle ). That's bit erasure. Everything else can be reversible.
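For a sense of scale, Landauer's bound says erasing one bit at temperature T dissipates at least k_B * T * ln 2. A quick back-of-the-envelope calculation at room temperature:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # roughly room temperature, in kelvin

# Minimum energy dissipated per bit erased, per Landauer's principle.
e_bit = k_B * T * math.log(2)
print(f"{e_bit:.2e} J per bit erased")  # ~2.87e-21 J
```

Tiny per bit, but it's a hard floor, and it scales with how cold the erasure environment is, which is why where you do the erasing matters.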

The point of my original post was that, with physically reversible computing, setting garbage bits to zero doesn't have to happen near the computation. You can keep the computation cold by doing the bit erasure somewhere else.