After Software, What's Next?


Charles Moeller

---- snip ----
>The word/concept you're looking for is
>called a semaphore.

---- snip ----

>What's your point?

Yes, I've implemented the semaphore concept, as well.

My point is that most industrial and consumer appliances would be better served with a parallel-concurrent mode of operation, rather than Turing-type machines and data-processing with linear-sequential software.

Best regards,
CharlieM
 
<< Charlie said :: "If the hardware did not need software, it could be more autonomous and simpler.">>

> William Sturm said: While I totally agree that simpler systems are exponentially more reliable,

OK .. then let's do distributed processing by a network of small devices.

> William Sturm said: I do not agree that hardware will be simpler without software. I think that hardware would need to be added to do the logic that was previously done in software. Then all you have done is change the programming techniques from software to hardware. >

There is no hardware without software ... but there is always a decision to make about which logic should be done in hardware and which in software (with an MCU).

> William Sturm said: Hardware is much less likely to change during operation, that I could agree with. <

FPGAs can be incrementally updated ...

> William Sturm said: I also think that software is frequently too complex. Since it is "soft", extra features frequently get added. There is a Peter Principle for software, I suppose. Will that not happen with hardware design? <

Yes ... hardware design is more and more based on software which programs programmable hardware.

The only difference between hardware and software is how many parallel operations can be done physically.

Armin Steinhoff
 

Vladimir E. Zyubin

> We can do better with an alternate technology but if we could, who would be
> its champion?

Everything turns on description and conceptual means. Descriptions are created by human beings for human beings. So the implementation question (FPGA, or RAM/ROM with code) makes no sense. Concepts rule. Concepts must fit both the specifics of the domain and the limits of human beings in processing information.

The domain specifics are: an event-driven nature, synchronism (the necessity to work with timeouts, latencies, delays...), and concurrency (which reflects the physical independence of processes in the controlled object).

The human limits are a matter of psychology; in short, the necessity to structure information (divide and rule) and the linear-sequential form of reading and writing.

So the future (if there is one) is just to find a domain-specific language (aka a 4th GL) for automation that allows the programmer to express the domain-specific aspects in structured form with linear-sequential writing/reading.
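
For instance, a minimal sketch in Python (purely illustrative; the process names, I/O, and timings are invented, and this is not a proposal for the actual language) of events, timeouts, and concurrency, each written down in linear-sequential form:

    # Two independent "processes" of a hypothetical controller, each written as a
    # plain linear sequence, running concurrently.  Names and timings are made up.
    import threading, time

    sensor_high = threading.Event()      # event-driven input: a level sensor

    def fill_process():
        # synchronism: a bounded wait on an external event
        print("fill: waiting for high-level sensor (timeout 5 s)")
        if sensor_high.wait(timeout=5.0):
            print("fill: high level reached, closing valve")
        else:
            print("fill: timeout expired, raising alarm")

    def stir_process():
        # a second, physically independent activity
        for _ in range(3):
            print("stir: one revolution")
            time.sleep(1.0)              # a delay expressed directly

    # concurrency: each process gets its own thread; the text of each stays linear
    threads = [threading.Thread(target=fill_process),
               threading.Thread(target=stir_process)]
    for t in threads:
        t.start()
    time.sleep(2.0)
    sensor_high.set()                    # simulate the external event arriving
    for t in threads:
        t.join()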

Something like that :)
 

William Sturm

< Charles said: "My point is that most industrial and consumer appliances would be better served with a parallel-concurrent mode of operation, rather than Turing-type machines and data-processing with linear-sequential software." >

Now that I can agree with.  Hopefully you are planning to find ways to get us there...
 Bill Sturm
 

Charles Moeller

Vladimir:

>So future (if it will be) is just to
>find domain specific language (aka 4th
>GL) for automation that allows
>programmer to reflect domain-specific
>aspects in structurised form with
>linear-sequential writing/reading
>
>Something like that :)

Yes! I read your interesting paper, Hyper-automation.

My view:
One of the major difficulties faced by software designers is that although they live and work in three-dimensional space and multi-threaded time, they are constrained to create systems that reside completely within the space-domains of computers. These systems must sense and react to real-world temporal effects. The designers are therefore required to repeatedly translate input information from time to space, and to translate from space to time for relevant output. Any temporal operations in between must be performed through space-only transformations. It is no wonder these unfortunate software designers often make mistakes.

Agreed that a new language must be formulated to correct this.

Best regards,
CharlieM
 

Charles Moeller

Bill:

>>Charles said: "My point is that most
>>industrial and consumer appliances would
>>be better served with a
>>parallel-concurrent mode of operation,
>>rather than Turing-type machines and
>>data-processing with linear-sequential
>>software. 

>Bill said: Now that I can agree with.  Hopefully
>you are planning to find ways to get us
>there...

'Been working on it for some time.

It turns out that TMs with shared-resource hardware and linear-sequential software can't be improved very much no matter what you do. There are certain impediments (I have a list) that are inherent in that type of machine. The current trend of wider words and higher speed does nothing to cure the fundamental problems of the method: more and more things can be done in a given unit of time, but at greater and greater expense.

I have looked at it afresh and done a bit of rethinking on the general problem of control from the ground up. From first principles, we could say that:

IF spatially-bound Turing machines using shared-resource hardware and linear-sequential software don't always result in the best control systems,

THEN perhaps spatio-temporal machines using dedicated parallel-concurrent hardware will provide a good alternative for some of the control systems.

The trouble with this approach: humans dislike change, especially after being strongly schooled in an accepted method.

Best regards,
CharlieM
 
There is not much new in computing except for the increased availability of multicore processors in hardware. Unfortunately, there does not seem to be much in the way of software to really support true parallel processing. In the IEC 61131-3 standard, one of the "languages" listed is SFC, or Sequential Function Chart, which is taken directly from Grafcet. SFC specifies the sequential nature of events and the parallel operations characteristic of most real manufacturing processes. SFC is the language of batch control since it has both serial and parallel processes. PLCs that implement SFC actually simulate the parallel operations on conventional Turing-machine microprocessors.

Conventional microprocessors are now available with up to 8 cores, while advanced processors from IBM and others have up to 32 cores. Today, a supercomputer is defined by having many cores, possibly 1024 or even more. There are applications for some of these huge parallel-processing supercomputers in seismic analysis and atmospheric weather forecasting, but there are no languages developed for their programming. The parallelism for these problems is typically one main program thread used for all cores, while each core operates on a different segment of a database.

I would like to see SFC used as the base programming method for multicore parallel processing where each core operates on an independent thread until they join. I don't know if anyone is doing this, but it seems to be a trend for the future of computing. I see that future more as a strongly hardware-assisted platform for software.
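
A rough sketch of that fork/join pattern, with Python threads merely standing in for cores (the step names are invented for illustration):

    # SFC-style simultaneous branch: fork independent steps, then join before the
    # next step.  Threads stand in for cores; the step contents are invented.
    import threading

    def step(name, actions):
        def run():
            for a in actions:
                print(name + ": " + a)
        return threading.Thread(target=run, name=name)

    # Fork: both branches of the simultaneous divergence start together...
    branch_a = step("fill_tank",   ["open inlet valve", "wait for level", "close inlet valve"])
    branch_b = step("heat_jacket", ["enable heater", "wait for temperature", "disable heater"])
    branch_a.start()
    branch_b.start()

    # ...Join: the convergence waits until both branches are complete.
    branch_a.join()
    branch_b.join()
    print("transfer: both branches done, continue with the next step")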

Dick Caro
 

Vladimir E. Zyubin

CharlieM> There are certain impediments (I have a list) that are inherent in that type of machine.

It would be very interesting to talk about. Can you show the list? AFAIU, TM was designed for calculation tasks ("the faster the better" principle). So, adding temporal features (I call it synchronism) qualitatively changes the Turing model.
 

Vladimir E. Zyubin

Dick Caro> Unfortunately, there does not seem to be much in the way of software to really support true parallel processing.

There is a lot of it... it is just hard to program. Humans do not need parallelism; we need independence (to simplify). BTW, the multicore architecture is a result of technological limits on element size. As to SFC with its Petri-net roots -- "Some sources [1] state that Petri nets were invented in August 1939 by Carl Adam Petri - at the age of 13 - for the purpose of describing chemical processes."

[1] Carl Adam Petri and Wolfgang Reisig (2008) Petri net. Scholarpedia, 3(4):6477

http://en.wikipedia.org/wiki/Petri_net#cite_ref-0
 

Charles Moeller

>> CharlieM said: There are certain impediments (I have a list) that are inherent in
>> that type of machine.

> Vladimir said: It would be very interesting to talk about. Can you show the list? AFAIU, TM
> was designed for calculation tasks ("the faster the better" principle). So,
> adding temporal features (I call it synchronism) qualitatively changes the
> Turing model.

The TM was specifically designed for decrypting encoded messages. Technically: to modify, by substitution algorithms, an encoded character string until intelligible words appeared.

The following are a few of the characteristics of computing technology that I consider to be impediments:

1. NO TEMPORAL LOGIC: There are no verbs, dynamic operators, or temporal logic in the fundamental computer logic that has now completely pervaded our daily lives.

2. SMALL NUMBER OF OPERATORS: The number of accepted fundamental operators and corresponding logic elements is small, limited to Boolean AND, NOT, and their combinations, plus STORE, the memory operator. The Boolean operators in combination can describe or perform 16 different functions between and upon two operands (a quick enumeration appears just after this list). The set can also perform binary arithmetic. Imagine writing a six-page paper (or a process-control scheme) while limited to such a small number of letters, words, or concepts.

3. NO ON-GOING TIME: When performed by physical logic elements, the operations are considered to be executed in a null-time zone, as the evaluations are ready at the next live moment (usually at the next clock pulse or instruction), which is designed to occur after any contributing settling times or gate-delays have run to completion.

4. ALL OPERATIONS ARE IN THE SPACE-DOMAIN: All higher-level computer languages (i.e., in software) are ultimately decomposable to, hence built up from, combinations of the Boolean operations and STORE. In machine language, those operations are used to determine explicitly: a) the locations from which to acquire the numerical or conditional operands, b) what Boolean operations to perform, c) where to put the results, and d) what to do next. Every step must be predetermined.

5. SPACE-DOMAIN RESULTANTS: Boolean logic used in such a manner is static, is unobservant of change, and can be said to inhabit the space-domain. The time-domain is an untapped resource.
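
The quick enumeration promised under item 2 (a tiny Python check, nothing more): with two operands there are 2^(2^2) = 16 possible Boolean functions, and OR and XOR can indeed be rebuilt from AND and NOT.

    # Sanity check for item 2: there are 2**(2**2) = 16 distinct Boolean functions
    # of two operands, and AND plus NOT are enough to construct the others.
    from itertools import product

    inputs = list(product([0, 1], repeat=2))        # (a, b) = 00, 01, 10, 11
    tables = set(product([0, 1], repeat=4))         # every possible output column
    print(len(tables), "distinct two-operand functions")   # prints 16

    AND = lambda a, b: a & b
    NOT = lambda a: 1 - a
    OR  = lambda a, b: NOT(AND(NOT(a), NOT(b)))     # OR via De Morgan
    XOR = lambda a, b: AND(OR(a, b), NOT(AND(a, b)))

    for name, f in [("AND", AND), ("OR", OR), ("XOR", XOR)]:
        print(name, [f(a, b) for a, b in inputs])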

One of the major difficulties faced by software designers is that although they live and work in three-dimensional space and multi-threaded time, they are constrained to create systems that reside completely within the space-domains of computers. These systems must sense and react to real-world temporal effects. The designers are therefore required to repeatedly translate input information from time to space, and to translate from space to time for relevant output. Any temporal operations in between must be performed through space-only transformations. It is no wonder these unfortunate software designers often make mistakes. (Please excuse the repetition of my theme.)

Best regards,
CharlieM
 
>> Vladimir said: It would be very interesting to talk about. Can you show the list? AFAIU, TM
>> was designed for calculation tasks ("the faster the better" principle). So,
>> adding temporal features (I call it synchronism) qualitatively changes the
>> Turing model.

> CharlieM said: The TM was specifically designed for decrypting encoded messages.

IMHO ... the Turing Machine is a mathematical model defined by Alan Turing. It makes statements about decision problems and the theory of computability. (Every problem is decidable if it is computable by a TM ...)

From that TM, different classes of TMs are derived ... von Neumann and so on.

> CharlieM said: Technically: to modify, by substitution algorithms, an encoded character string until intelligible words appeared.

> The following are a few of the characteristics of computing technology that I consider to be impediments:

> 1. NO TEMPORAL LOGIC: There are no verbs, dynamic operators, or temporal logic in the fundamental computer logic that has now completely pervaded our daily lives.

Why is it an impediment? Temporal logic has a much higher complexity than Boolean logic.

Best Regards
Armin Steinhoff
 
Zyubin > As to SFC with the Petri-net roots...

Yes indeed, SFC was created at Telemechanique in France. The developers cited it as an implementation of Petri-net. SFC has been adopted by ISA88 as the "preferred language" for programming batch phase logic, the primary element of a batch control program.

Dick Caro
 

Charles Moeller

Armin:

>>> Vladimir said: It would be very interesting to talk about. Can you show the list? AFAIU, TM
>>> was designed for calculation tasks ("the faster the better" principle). So,
>>> adding temporal features (I call it synchronism) qualitatively changes the
>>> Turing model.

>> CharlieM said: The TM was specifically designed for decrypting encoded messages.

Armin Steinhoff said:
> IMHO ... the Turing Machine is a mathematical model defined by Alan
> Turing. It makes statements about decision problems and the theory of
> computability. (Every problem is decidable if it is computable by a TM ...)

>From that TM, different classes of TMs are derived ... von Neumann and so on.

>> CharlieM said: Technically: to modify, by substitution algorithms, an encoded
>> character string until intelligible words appeared.

>> The following are a few of the characteristics of computing technology
>>that I consider to be impediments:

>> 1. NO TEMPORAL LOGIC: There are no verbs, dynamic operators, or temporal
>> logic in the fundamental computer logic that has now completely pervaded our
>> daily lives.

Armin Steinhoff said:
> Why is it an impediment? Temporal logic has a much higher
> complexity than Boolean logic.

The fact that there is no temporal logic native to the time-domain in the list of operations for computation has the effect of requiring any and all desired temporal operations to be carried out via the space-domain. This requires transformation from real-world inputs to computer internal space-domain locations (data in numbered or labeled memory spaces). Furthermore, the accepted temporal logics that are able to be used via computer programming (see J.F. Allen or Amir Pnueli) are subject to that same constraint. All of the so-called temporal logic systems can't be operated in the real time domain (as the events and conditions change) but must be performed on static space-domain data after conversion by sampling and storing.

Yes, the impediment is that real-world temporal data must be changed into static space-domain data before computer data-processing, and then the static results must be translated from the artificial space-domain to the real-world outputs to join ongoing real time.

The conversions to the space-domain and back to the time-domain after processing are complications that take time to do and may result in conversion errors, lost data, or improper interpretation during several of the many steps necessary. A much better scenario would be to be able to use the sensor information in real time as it occurs.
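
For concreteness, a bare-bones sketch of that sample, store, compute, and actuate cycle (Python; read_sensor and write_contactor are hypothetical stand-ins for the real I/O):

    # The conventional pattern: sample the real-world signal, store it in memory
    # (time -> space), compute on the stored copy, then push a result back out
    # (space -> time).  The I/O functions below are invented stand-ins.
    import time, random

    def read_sensor():               # stand-in for an ADC read
        return 20.0 + random.random()

    def write_contactor(on):         # stand-in for a digital output
        print("contactor", "ON" if on else "OFF")

    SETPOINT = 20.5
    samples = []                     # the space-domain copy of the signal

    for _ in range(5):               # fixed-rate scan loop
        samples.append((time.time(), read_sensor()))   # sample and store
        _, latest = samples[-1]
        write_contactor(latest < SETPOINT)   # result re-enters the time domain
        time.sleep(0.1)              # whatever happens between scans is missed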

Best regards,
CharlieM
 

Vladimir E. Zyubin

Dick Caro >Zyubin > As to SFC with the Petri-net roots...

> Yes indeed, SFC was created at Telemechanique in France. The developers
> cited it as an implementation of Petri-net.

I agree; at the moment SFC is the most powerful IEC 61131-3 language. But SFC has the same problem as Petri nets... a problem with control-flow convergence (poorly controlled markings), as well as poor synchronism and structuring. So I am personally not sure which would be the better solution: try to solve the current SFC problems, or just enhance the ST syntax/semantics to add the necessary abilities. The latter is not difficult. The former can be impossible.

Best regards, Vladimir
 
> [clip]
> Armin Steinhoff said:
>> Why is it an impediment? Temporal logic has a much higher
>> complexity than Boolean logic.

> CharlieM said: The fact that there is no temporal logic native to the time-domain in the list of operations for computation has the effect of requiring any and all desired temporal operations to be carried out via the space-domain. This requires transformation from real-world inputs to computer internal space-domain locations (data in numbered or labeled memory spaces). Furthermore, the accepted temporal logics that are able to be used via computer programming (see J.F. Allen or Amir Pnueli) are subject to that same constraint. All of the so-called temporal logic systems can't be operated in the real time domain (as the events and conditions change) but must be performed on static space-domain data after conversion by sampling and storing. <

> Yes, the impediment is that real-world temporal data must be changed into static space-domain data before computer data-processing, and then the static results must be translated from the artificial space-domain to the real-world outputs to join ongoing real time. <

> The conversions to the space-domain and back to the time-domain after processing are complications that take time to do and may result in conversion errors, lost data, or improper interpretation during several of the many steps necessary. A much better scenario would be to be able to use the sensor information in real time as it occurs. <

Yes ... and the best answer I could find today is here:
http://www.mnbtech.com/index.php?id=164

Software is the solution.

Best Regards
Armin Steinhoff
 

Charles Moeller

Armin:

CharlieM said:
>> The conversions to the space-domain
>>and back to the time-domain after
>>processing are complications that take
>>time to do and may result in conversion
>>errors, lost data, or improper
>>interpretation during several of the
>>many steps necessary. A much better
>>scenario would be to be able to use the
>>sensor information in real time as it
>>occurs.

Armin Steinhoff said:
>Yes ... and the best answer I could
>find today is here:
>http://www.mnbtech.com/index.php?id=164
>
>Software is the solution.

Your reference: http://www.mnbtech.com/index.php?id=164 describes "Mixed Technologies," which is the addition of hardware solutions (FPGAs and Graphics Processing Units) to General Processing Units (common computer processing).

So, hardware is at least some of the solution.

There is a distinction between:
Computation: the modification of input character strings (data) to produce displayable information, and
Process Control: activities taken to ensure a process is predictable, stable, and consistently operates at the target level of performance with only normal variation.

Computation can be used for process control, but it is not necessarily the best use for that technology. Appropriate uses of computation are cryptography, weather- and topological-map generation and updating, and art authentication, although we use it for most any task.

My point is that there is a more appropriate technology for process control than computing--and it is mostly hardware.

Best regards,
CharlieM
 

James Ingraham

CharlieM: "Simpler is better."

I think we'd all agree with that... up to a point. It can be difficult to define "simpler." For example, imagine a large machine (packaging, web press, doesn't matter) with a mechanical drive train. In some ways this is very simple. Turn the crank, the machine moves. On the other hand there are lots of problems with this setup. The designer had to be very careful with his gear sizes and (mechanical) power requirements. He had to put in a very robust transmission system, since the torque for the entire machine will transfer through it. Now consider multiple servo drives using electronic line shafting. In some ways this is more complicated. Servos have to be sized, wired, configured, tuned, etc. But by getting rid of that line shaft the machine becomes much more modular, and one part of the machine no longer directly influences the rest of it. Problems are easier to spot because they are narrowed down. If the line shaft jams you have to check the entire machine to find the jam. If a servo jams it puts an alarm up on an HMI and you walk right to it. So which is simpler?

CharlieM: "The applications I am thinking of are the toasters, home security, vehicle subsystems, factory automation, etc."

Ah, now I see the problem. We've been talking at cross-purposes because you equate factory automation with toasters. I suppose there are a handful of tasks that are as simple as toasters; a simple zero-pressure chain-driven live roller accumulation conveyor, for example. Guess what? Most zero-pressure CDLR accumulation conveyors ARE done in hardware. This is the tiniest fraction of factory automation. A high-speed sortation conveyor would be a nightmare to do in hardware. And your earlier point that we had factories before we had software is correct, but we did not have 6-axis articulated arm robots.

I actually had to take a few deep breaths when I read "toasters" and "factory automation" in the same sentence. I don't think you actually meant to insult my life's work, or imply that the software I write has the difficulty level of programming a toaster. Nevertheless, you clearly don't have a good picture of what's hard vs easy in software. I'm glad that several other people have essentially agreed with my point that moving the complexity from the software to the hardware doesn't make the system more reliable.

-James Ingraham
Sage Automation, Inc.

P.S. I actually don't think programming a toaster is as easy as it sounds, either. Your point about the complexity of software is valid, and when making a toaster you have a lot of considerations to take into account. Not least is price; when I do a million-dollar automation job I can throw in processors wherever I want. If you're trying to make money off $20 toasters you have to REALLY work at getting the cost out.
 

James Ingraham

CharlieM: "The Bhopal disaster had nothing to do with electronic/electrical hardware, but was attributed to shoddy maintenance."

True, it was not electronic hardware that failed, but PHYSICAL hardware. Which means that Bhopal would have happened regardless of which way you controlled it.

CharlieM: "I am not familiar with the 737 rudder problem. Was it electronic logic hardware that was the problem?"

Again, mechanical in nature. The hydraulic system could get stuck at a point where the fluid went the OPPOSITE way from what the pilot intended, so the rudder would go left when they meant right. The pilot would of course react by going more to the right... which meant the rudder would go more to the left, and eventually the plane would get sideways and flip over and then crash.

CharlieM: "The Tacoma Narrows, to my knowledge, was a case of unpredicted harmonic oscillation caused by wind-shear strumming of the suspension cables. Hardly an electronic hardware problem."

Not electronic hardware, no, but hardware nonetheless. (Also, it was an entirely predictable harmonic oscillation, even then. They didn't predict it, but they SHOULD have.) I brought these examples up to counter your "hardware can be tested to perfection" argument. No, it can't.

CharlieM: "Software disasters..."

Well aware of the Therac-25 debacle. I've even posted about it a few times here on control.com. And I know about the Patriot Missile problem. That one is particularly galling, because they KNEW about the issue and simply put in the manual to reboot the system every few hours. Don't get me wrong, I'm not saying software is a panacea. But I don't see how you'd have solved those two problems with hardware.

The only well-known purely electronic hardware problem I can think of is the Pentium floating-point bug.

CharlieM: "IMHO the input sensors and output contactor or triac or thyratron hardware has to be there anyway. The software is yet another complicating set of factors..."

Okay, but how do you get the contactor or triac or whatever to do what it's supposed to at the right time? SOMETHING has to let it know.

CharlieM: "Software requires a TM and turning all information into data, voluminous amounts of which must be minded and stored or thrown away."

Okay. So what do I do instead? How am I going to get my dual 7-axis articulated arm robot to pour the right drink at the right time?

-James Ingraham
Sage Automation, Inc.
 

Charles Moeller

Human psychology and physiology are able to handle situational dynamics in a parallel-concurrent manner. The winning tennis professional's returns are never quite the same and sometimes occur at blinding speed, coordinating position on the court, racquet placement, angle of attack (for spin), and force. The highly practiced symphony violinist coordinates bowing and finger placement in exact phase with the orchestra, no matter the tempo. These kinds of activities take place in concurrent fashion.

Great authors draw their readers into the scenes, painting them in parallel, although the words, sentences, paragraphs, and chapters are set down in serial fashion. Our thought processes and languages allow and support such parallel-concurrent processes described serially.

Computer programs, at first glance, appear to be lists of line-by-line activities, the arrays of which might (possibly) be read from left to right a page at a time. But no: the lines of code are read and executed one at a time, starting at the top of page one and proceeding downward, in generally the same order in which they were written.

Going back to a well-written book, a human can read about flowers and mentally build a garden based on those words as the description proceeds.

If something changes in the description, the mental image is updated immediately and automatically. That is precisely how a parallel-concurrent control scheme can be developed: each sub-process runs by itself, and all run concurrently, each contributing to the overall process.

Such controller activities performed on a time-shared basis are what we have now with TMs. What is needed is the parallel-concurrent alternative.
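
A compact way to see the contrast (Python; the sub-process bodies are invented placeholders, and threads here merely stand in for truly dedicated hardware):

    # (a) Time-shared, as on a single TM: one engine visits each sub-process in turn.
    # (b) Parallel-concurrent: every sub-process runs on its own worker, all at once.
    import threading

    def door_monitor():   print("door: watching for open/close")
    def temp_regulator(): print("temp: adjusting the heater")
    def alarm_logic():    print("alarm: checking conditions")

    subprocesses = [door_monitor, temp_regulator, alarm_logic]

    # (a) one shared engine, one slice at a time
    for _ in range(2):
        for sp in subprocesses:
            sp()

    # (b) dedicated workers, all concurrent
    workers = [threading.Thread(target=sp) for sp in subprocesses]
    for w in workers:
        w.start()
    for w in workers:
        w.join()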

Best regards,
CharlieM
 

Vladimir E. Zyubin

CharlieM > My point is that there is a more appropriate technology for process
> control than computing--and it is mostly hardware.

What about a set of physical relays? :) History tells us relays can be used for process control... as hardware (physical relays) and as software (LD IEC 61131-3).

So, I believe the first-order question is the question of an appropriate formal description... The question "hardware or software" is a second-order question.

BTW, I think process control cannot be described in terms of a Turing Machine, if only because of the timers.
 