\$\begingroup\$

I've been studying (non-ideal) memristors and some related circuits that rely on the pinched hysteresis of the I-V curve for use as memory. I'm still puzzled by the lack of a general framework for what constitutes memory and what features one needs to use a circuit as volatile or non-volatile memory.

It seems you certainly need some nonlinearity that preserves the history of the input, as in hysteresis (effectively a non-locality in physical quantities such as current or voltage). For example, I recently found out that one of the criteria set by a well-cited article is merely a mathematical constraint and has nothing to do with the practical use of the circuit as memory or with engineering purposes.

What is an operative definition of memory in circuits and what framework is used in defining such memory and protocols for reading and writing information at the circuit level?

Thanks in advance!

asked Jan 20, 2021 at 12:20
\$\endgroup\$
  • \$\begingroup\$ I'm wondering if anyone is aware... - this isn't really a valid question because someone, somewhere will be aware; hence the answer is "yes". Also, requests for information and where to find it are basically shopping, and the site rules state this about shopping questions: Questions seeking recommendations for specific products or places to purchase them are off-topic as they are rarely useful to others and quickly obsolete. Instead, describe your situation and the specific problem you're trying to solve. \$\endgroup\$ Commented Jan 20, 2021 at 12:24
  • \$\begingroup\$ @Andyaka Thanks for pointing that out; I will edit the question to be more explicit and less vague. \$\endgroup\$ Commented Jan 20, 2021 at 12:33
  • \$\begingroup\$ You don't need nonlinearity. Consider a capacitor. They're great storage elements (DRAM) but they're also linear. \$\endgroup\$ Commented Jan 20, 2021 at 15:00
  • \$\begingroup\$ Please quote the article you were reading, I am interested. \$\endgroup\$ Commented Jan 27, 2021 at 13:55

2 Answers

\$\begingroup\$

The question is a bit vague, but as a general formal definition, memory is equivalent to internal state.

For example, a combinational circuit like an AND gate or a binary adder has no state: the output depends only on what the inputs are at that very moment and has no relation to any past inputs.

In contrast, a sequential circuit like a counter has an internal state, namely a value it is "remembering". Its output therefore depends on past inputs as well as the current input.

Mathematically, you can imagine a memoryless circuit to be only a function of the current input: $$f: X \to Y$$

A circuit with memory, by contrast, also takes an internal state and potentially returns a modified state. Let S be an abstract state space, like the set of values a counter can hold or the magnetic state of an inductor. Then $$f: (X, S) \to (Y, S)$$

It takes its current input and state, and returns a new output and a new state.
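This distinction can be sketched in a few lines of Python (a hypothetical illustration, not from the original answer; the up-counter is chosen arbitrarily as the stateful example):

```python
# Combinational vs. sequential, modeled as functions.
# A combinational circuit is a pure function of its inputs: f: X -> Y.
# A sequential circuit threads a state through every call: f: (X, S) -> (Y, S).

def and_gate(a: int, b: int) -> int:
    """Combinational: output depends only on the current inputs."""
    return a & b

def counter_step(enable: int, state: int) -> tuple[int, int]:
    """Sequential: takes (input, state), returns (output, new state)."""
    new_state = state + 1 if enable else state
    return new_state, new_state  # here the output happens to equal the state

# The same input value produces different outputs depending on history:
state = 0
outputs = []
for enable in [1, 1, 0, 1]:
    out, state = counter_step(enable, state)
    outputs.append(out)
print(outputs)  # [1, 2, 2, 3] - the two enable=1 steps near the end differ
```

Note how the third and fourth calls receive inputs the AND gate would treat identically, yet produce different outputs because the counter's state encodes the history.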

Another example is a Markov chain, which has the Markov / memoryless property: Let's say the Markov chain has random states $$X_1, X_2, \cdots, X_t, \cdots $$

The Markov property says $$P(X_{t+1} | X_{t}, X_{t-1}, \cdots) = P(X_{t+1} | X_t) $$

That is, the next random state (represented as a random variable, so with an associated probability distribution) is a function only of the immediately preceding state, and not of anything before that.
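A minimal simulation makes the memoryless property concrete (a hypothetical two-state chain with arbitrarily chosen transition probabilities, purely for illustration):

```python
# A two-state Markov chain: the next state is sampled from a distribution
# that depends only on the current state, never on the earlier trajectory.
import random

# Transition probabilities P(next | current) for states "A" and "B".
P = {
    "A": {"A": 0.9, "B": 0.1},
    "B": {"A": 0.5, "B": 0.5},
}

def step(current: str, rng: random.Random) -> str:
    """Sample the next state given only the current one (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[current].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

rng = random.Random(0)  # seeded for repeatability
trajectory = ["A"]
for _ in range(5):
    trajectory.append(step(trajectory[-1], rng))
print(trajectory)
```

Notice that `step` receives only the current state, so the chain literally cannot depend on anything further back: the "memory" of a Markov chain is exactly one state deep.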

answered Sep 23 at 20:41
\$\endgroup\$
  • \$\begingroup\$ Even this is hiding many simplifications; for example, an ideal AND gate might be stateless as claimed, but a real one has finite propagation delay and therefore a chain of them could be used as a memory element. Indeed such (transmission line) structures have been used for memory before (not so much as chains of gates, but wiring, or acoustic delay line, etc.). (Or chains of (clocked) flip-flops (shift register, bucket brigade), but those are explicit logic memory elements.) The practical side of it is basically, how much effort do you want to put into using some stateful phenomenon. \$\endgroup\$ Commented Sep 23 at 23:05
  • \$\begingroup\$ @TimWilliams Yes, this is an abstract overview. And using propagation delay involves tricky timing; clocked memory elements are much easier to work with, both conceptually and in practice. \$\endgroup\$ Commented Sep 23 at 23:50
\$\begingroup\$

I think you are assuming that there must be some sort of formal, mathematically consistent definition of memory.

Consider that a piece of cardboard with holes punched in it is memory. Electrical connections that are either completed or left open are memory. A mechanical switch can be memory.

Memory is whatever engineers find useful to store information. Creating an arbitrary "framework" to define it, or the protocols to access it, would constrain innovation.

answered Jan 20, 2021 at 13:36
\$\endgroup\$
