Hard Light Productions Forums

Off-Topic Discussion => General Discussion => Topic started by: The E on June 02, 2012, 05:53:22 pm

Title: Layers of Complexity
Post by: The E on June 02, 2012, 05:53:22 pm
Via the @CompSciFact (https://twitter.com/#!/CompSciFact) Twitter feed comes this interesting little post by Jean-Baptiste Queru (https://plus.google.com/112218872649456413744/posts/dfydM2Cnepe):

Quote
Dizzying but invisible depth

You just went to the Google home page.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how browsers work, it's not quite that simple. You've just put into play HTTP, HTML, CSS, ECMAscript, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.
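
A minimal sketch of just the HTTP layer, using nothing but Python's standard library (everything below HTTP still happens invisibly inside it):

Code:
import http.client

# Speak bare HTTP/1.1 to the host from the example; DNS, TCP, and IP
# are all handled out of sight by HTTPConnection.
conn = http.client.HTTPConnection("www.google.com", 80, timeout=5)
conn.request("GET", "/")
resp = conn.getresponse()
print(resp.status, resp.reason)         # e.g. 200 OK, or a redirect to HTTPS
print(resp.getheader("Content-Type"))
conn.close()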

Let's simplify.

You just connected your computer to www.google.com.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how networks work, it's not quite that simple. You've just put into play DNS, TCP, UDP, IP, Wifi, Ethernet, DOCSIS, OC, SONET, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.
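
The first two of those layers fit in a few lines; a minimal sketch using Python's standard library, where DNS happens inside getaddrinfo() and the TCP three-way handshake inside create_connection(), while everything from IP downwards stays hidden in the kernel:

Code:
import socket

host = "www.google.com"

# DNS: turn the name into one or more addresses.
for *_, sockaddr in socket.getaddrinfo(host, 80, proto=socket.IPPROTO_TCP):
    print("resolved:", sockaddr[0])

# TCP: the three-way handshake happens inside connect().
with socket.create_connection((host, 80), timeout=5) as sock:
    print("connected:", sock.getsockname(), "->", sock.getpeername())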

Let's simplify.

You just typed www.google.com in the location bar of your browser.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how operating systems work, it's not quite that simple. You've just put into play a kernel, a USB host stack, an input dispatcher, an event handler, a font hinter, a sub-pixel rasterizer, a windowing system, a graphics driver, and more, all of those written in high-level languages that get processed by compilers, linkers, optimizers, interpreters, and more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.
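
The bottom of that stack can even be watched directly; a minimal sketch, assuming a Linux machine with an evdev keyboard device (the device path is a placeholder, and reading it usually requires root):

Code:
import struct

# struct input_event on 64-bit Linux: two 8-byte timestamp fields, then
# type, code, value. EV_KEY events (type 1) are key presses and releases.
EVENT_FORMAT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

with open("/dev/input/event0", "rb") as device:  # placeholder device node
    while True:
        sec, usec, etype, code, value = struct.unpack(
            EVENT_FORMAT, device.read(EVENT_SIZE))
        if etype == 1:  # EV_KEY: value 1 = press, 0 = release
            print("key code", code, "down" if value else "up")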

Let's simplify.

You just pressed a key on your keyboard.

Simple, isn't it?

What just actually happened?

Well, when you know a bit about how input peripherals work, it's not quite that simple. You've just put into play a power regulator, a debouncer, an input multiplexer, a USB device stack, a USB hub stack, all of that implemented in a single chip. That chip is built around thinly sliced wafers of highly purified single-crystal silicon ingot, doped with minute quantities of other atoms that are blasted into the crystal structure, interconnected with multiple layers of aluminum or copper, that are deposited according to patterns of high-energy ultraviolet light that are focused to a precision of a fraction of a micron, connected to the outside world via thin gold wires, all inside a packaging made of a dimensionally and thermally stable resin. The doping patterns and the interconnects implement transistors, which are grouped together to create logic gates. In some parts of the chip, logic gates are combined to create arithmetic and bitwise functions, which are combined to create an ALU. In another part of the chip, logic gates are combined into bistable loops, which are lined up into rows, which are combined with selectors to create a register bank. In another part of the chip, logic gates are combined into bus controllers and instruction decoders and microcode to create an execution scheduler. In another part of the chip, they're combined into address and data multiplexers and timing circuitry to create a memory controller. There's even more. Those are actually such incredibly complex technologies that they'll make any engineer dizzy if they think about them too much, and such that no single company can deal with that entire complexity.
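
The "logic gates are combined to create arithmetic and bitwise functions" step is small enough to sketch in code; a toy one-bit full adder built from AND, OR, and XOR, chained into a ripple-carry adder (an illustration of the idea, not of any real chip):

Code:
def full_adder(a, b, carry_in):
    # Two XOR gates form the sum bit; AND and OR gates form the carry.
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def ripple_carry_add(x, y, width=8):
    carry, total = 0, 0
    for i in range(width):              # chain the carry from bit to bit
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        total |= bit << i
    return total, carry                 # final carry is the overflow bit

assert ripple_carry_add(100, 55) == (155, 0)
assert ripple_carry_add(200, 100) == (44, 1)    # 300 wraps to 44 in 8 bits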

Can we simplify further?

In fact, very scarily, no, we can't. We can barely comprehend the complexity of a single chip in a computer keyboard, and yet there's no simpler level. The next step takes us to the software that is used to design the chip's logic, and that software itself has a level of complexity that requires going back to the top of the loop.

Today's computers are so complex that they can only be designed and manufactured with slightly less complex computers. In turn the computers used for the design and manufacture are so complex that they themselves can only be designed and manufactured with slightly less complex computers. You'd have to go through many such loops to get back to a level that could possibly be re-built from scratch.

Once you start to understand how our modern devices work and how they're created, it's impossible to not be dizzy about the depth of everything that's involved, and to not be in awe about the fact that they work at all, when Murphy's law says that they simply shouldn't possibly work.

For non-technologists, this is all a black box. That is a great success of technology: all those layers of complexity are entirely hidden and people can use them without even knowing that they exist at all. That is the reason why many people can find computers so frustrating to use: there are so many things that can possibly go wrong that some of them inevitably will, but the complexity goes so deep that it's impossible for most users to be able to do anything about any error.

That is also why it's so hard for technologists and non-technologists to communicate together: technologists know too much about too many layers and non-technologists know too little about too few layers to be able to establish effective direct communication. The gap is so large that it's not even possible any more to have a single person be an intermediate between those two groups, and that's why e.g. we end up with those convoluted technical support call centers and their multiple tiers. Without such deep support structures, you end up with the frustrating situation that we see when end users have access to a bug database that is directly used by engineers: neither the end users nor the engineers get the information that they need to accomplish their goals.

That is why the mainstream press and the general population has talked so much about Steve Jobs' death and comparatively so little about Dennis Ritchie's: Steve's influence was at a layer that most people could see, while Dennis' was much deeper. On the one hand, I can imagine where the computing world would be without the work that Jobs did and the people he inspired: probably a bit less shiny, a bit more beige, a bit more square. Deep inside, though, our devices would still work the same way and do the same things. On the other hand, I literally can't imagine where the computing world would be without the work that Ritchie did and the people he inspired. By the mid 80s, Ritchie's influence had taken over, and even back then very little remained of the pre-Ritchie world.

Finally, last but not least, that is why our patent system is broken: technology has done such an amazing job at hiding its complexity that the people regulating and running the patent system are barely even aware of the complexity of what they're regulating and running. That's the ultimate bikeshedding: just like the proverbial discussions in the town hall about a nuclear power plant end up being about the paint color for the plant's bike shed, the patent discussions about modern computing systems end up being about screen sizes and icon ordering, because in both cases those are the only aspects that the people involved in the discussion are capable of discussing, even though they are irrelevant to the actual function of the overall system being discussed.

CC BY 3.0
Title: Re: Layers of Complexity
Post by: headdie on June 02, 2012, 06:03:09 pm
:headsplode:

A very insightful text which by its nature can tell us very little. Still, an interesting read; thanks for highlighting that, The E.
Title: Re: Layers of Complexity
Post by: Nuke on June 02, 2012, 08:34:25 pm
i've been aware of the complexity of technology for some time. i like to call them "layers of bull****", because a lot of the systems involved are more complex than they need to be. i'm all for abolishing patents. they were meant so that people with no resources could have an idea for some gizmo and buy themselves time to drum up investment, develop a product, and turn a profit. i don't think they were ever meant to let mega-corporations, which have the resources to turn a profit on any idea that comes across them, use a patent to hold a monopoly on an idea so that no one else can use it. i hate closed and proprietary technologies, not because they cost money but because they are easily lost when the corporation goes tits up, gets nuked, destroyed by asteroids, etc.
Title: Re: Layers of Complexity
Post by: Flipside on June 02, 2012, 08:45:58 pm
I don't think it's so much 'Layers of Bull****' as 'Layers of Technobabble', to be honest. Though I'll admit, some things are carry-overs from older systems (anyone who's worked with networking knows all about redundant stuff in several protocols). Part of the barrier to getting into technology is often not the complexity of the tech itself, but the vast quantity of 'catchy names' and acronyms that litter the entire field and make it sound more complex than it is.
Title: Re: Layers of Complexity
Post by: z64555 on June 02, 2012, 11:51:06 pm
Some layers of complexity are needed; other layers exist for the convenience of one party or another.

As an example, SDL.
Title: Re: Layers of Complexity
Post by: redsniper on June 03, 2012, 12:04:54 am
Quote
single-crystal silicon ingot

Oh holy ****, really? I didn't realize this.

Like.... it's all one uniform lattice, right?
Title: Re: Layers of Complexity
Post by: Nuke on June 03, 2012, 05:36:49 am
Quote
single-crystal silicon ingot

Oh holy ****, really? I didn't realize this.

Like.... it's all one uniform lattice, right?

it absolutely needs to be, especially as the size of the process decreases. a tiny flaw might ruin a transistor and cause a logic gate not to function properly. one bad transistor can **** up the entire chip.
Title: Re: Layers of Complexity
Post by: Aardwolf on June 03, 2012, 01:07:56 pm
(http://imgs.xkcd.com/comics/abstraction.png)
Title: Re: Layers of Complexity
Post by: Nuke on June 03, 2012, 01:57:00 pm
^ this
Title: Re: Layers of Complexity
Post by: headdie on June 03, 2012, 04:18:56 pm
(http://imgs.xkcd.com/comics/abstraction.png)
lol
Title: Re: Layers of Complexity
Post by: General Battuta on June 03, 2012, 04:54:05 pm
better hope randall munroe never learns any biology because then he'd really freak out
Title: Re: Layers of Complexity
Post by: Mongoose on June 03, 2012, 04:54:15 pm
<3 Maru
Title: Re: Layers of Complexity
Post by: Ravenholme on June 03, 2012, 07:42:39 pm
better hope randall munroe never learns any biology because then he'd really freak out

Weirdly enough, that was my exact thought.
Title: Re: Layers of Complexity
Post by: Rodo on June 03, 2012, 10:42:26 pm
Interesting. I'm certainly not unaware of the complexity of today's PCs, thanks to a somewhat technical education; it was fun recognising most of the things in there.

Now, I've got a question...

If I spill some milk all over my keyboard, would that be considered like... adding another layer of complexity to it?
Title: Re: Layers of Complexity
Post by: z64555 on June 03, 2012, 10:49:48 pm
If I spill some milk all over my keyboard, would that be considered like... adding another layer of complexity to it?

No, that just removes several layers of complexity and replaces them with a short-circuit.
Title: Re: Layers of Complexity
Post by: Nuke on June 03, 2012, 11:46:21 pm
at least till you stick it in the dishwasher
Title: Re: Layers of Complexity
Post by: X3N0-Life-Form on June 04, 2012, 02:57:34 am
Interesting article. It's good to be reminded every once in a while just how abyssally complex our technology actually is and, in a way, how Lovecraftianly incomprehensible it is to virtually everyone.

If I spill some milk all over my keyboard, would that be considered like... adding another layer of complexity to it?

No, that just removes several layers of complexity and replaces them with a short-circuit.
Depends on the kind of computer you're using; if it's a desktop, it depends on the amount of milk. Between me, my brother and my friends we've spilled all sorts of stuff on our keyboards and I have yet to see a keyboard short-circuit because of it. If it's a laptop, on the other hand, it might be a problem.
Title: Re: Layers of Complexity
Post by: Nuke on June 04, 2012, 03:32:33 am
it's only 5 volts and a few hundred (sometimes tens of) milliamps. that's not really enough to cause any catastrophic arcing. you might get a short, but at most it will just cause erroneous operation, not catastrophic failure. microcontrollers are tough. especially the old skool 40-pin dip packages that a lot of keyboards still use for some reason.
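
for scale: at 5 V and 100 mA that's P = V × I = 0.5 W, so there just isn't much energy available to do damage.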
Title: Re: Layers of Complexity
Post by: jr2 on June 04, 2012, 04:12:09 am
I'm aware of the layers... and I know that I'm not aware of plenty.  But, knowing that, I can research the layers on teh internetz if I need or want to learn more about what might be causing a specific problem.  I generally have a feel for which layers are responsible for what and the symptoms when they go wrong.  However, I've been totally stumped before.

You know what I'm talking about, right? When you spend hours and hours trying to fix a problem, only to have it disappear for no reason, leaving you to wonder whether a) you accidentally fixed it, b) it was an intermittent problem, or c) it was a cascade failure brought on by multiple sub-problems, enough of which you finally fixed to get the system working properly again; in which case the failure won't be back until a) it gets broken again, b) the gremlins decide that it's time for the flip side of intermittent again, or c) enough sub-problems accumulate again to re-create the circumstances that originated the problem.    :banghead:

EDIT: And if it's software related, this tends to make me lean more towards the Nuke all the things approach to bugfixing.
Title: Re: Layers of Complexity
Post by: Nuke on June 04, 2012, 04:56:41 am
say what you want about it, it works. hardware causes fewer headaches than software, i think. you know the hardware is fine when the system posts, or when you pass memtest86. seems to me there are more layers in software than in hardware, especially on high-end systems. i wonder how much cpu power gets squandered just on making every software subsystem play nice with every other software subsystem. hardware, on the other hand, is just vast arrays of the same old stuff. same old logic gates over and over again. increase the word size? no problem, just stick 2 32-bit adders together and you've got a 64-bit adder. you're just making the bus wider. you might get some proprietary hardware (most gpus), but it's all made out of the same old transistors, making the same old logic gates on the same old silicon die.

like with fpgas, a single generic logic cell can be connected to large numbers of other completely identical cells in varying ways, allowing you to spin a custom cpu out of a generic assembly-line part. someone even spun a feature-compatible cray-1 on an fpga which could run native cray code. the same chip could easily be reconfigured to be feature-compatible with a completely different processor architecture, like x86 or arm (granted, i'm not sure these things have enough cells to be configured as cutting-edge chips). and this is all from an array of completely identical parts. it's a tech i want to get into eventually.

with software you have an os, which by nature of trying to do everything is always gonna be bloatware. you have to support hardware with drivers, you need a framework in which everything operates, then you need to provide a ui, multitasking, the ability to load software, and the ability to talk on the net. for some reason they bundle an os with a bunch of applications that you could have obtained yourself. why use wordpad when you can get notepad++? when all this is bundled into an operating system, you don't really have control of the layers, you just trust the software to deal with it all for you. you're not allowed to get rid of layers you don't want. just by sheer brute force of processing power all this bull**** is instantaneous, and you don't notice most of the time that it's happening. you just have a machine patiently waiting an eternity for input, occasionally entertaining itself with the bull**** as usual. you only ever notice it when you absolutely need the whole computer's resources to solve a single complex problem (like encoding a video) and it takes multiple times longer than should be theoretically possible given the speed of the hardware.
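
that word-size trick fits in a sketch, where add32 stands in for a 32-bit hardware adder block (a hypothetical helper, not any real part):

Code:
MASK32 = (1 << 32) - 1

def add32(a, b, carry_in):
    # Stand-in for a 32-bit hardware adder: 32-bit sum plus a carry-out.
    total = a + b + carry_in
    return total & MASK32, total >> 32

def add64(x, y):
    # Chain two 32-bit adders through the carry line to get a 64-bit adder.
    lo, carry = add32(x & MASK32, y & MASK32, 0)
    hi, carry = add32(x >> 32, y >> 32, carry)
    return (hi << 32) | lo, carry

assert add64(2**40 + 7, 2**40 + 5)[0] == 2**41 + 12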
Title: Re: Layers of Complexity
Post by: pecenipicek on June 04, 2012, 05:45:56 am
Quote from: Nuke
with software you have an os, which by nature of trying to do everything is always gonna be bloatware. [...]
Have fun :D (http://www.linuxfromscratch.org/)
Title: Re: Layers of Complexity
Post by: Ghostavo on June 04, 2012, 06:09:09 am
Quote from: Nuke
hardware, on the other hand, is just vast arrays of the same old stuff. same old logic gates over and over again. [...]

Two words: cache coherence.
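
A toy sketch of what those two words hide: the skeleton of a MESI-style coherence protocol for two caches, reduced to its state transitions (real protocols also deal with write-backs, memory ordering, and more states):

Code:
from enum import Enum

State = Enum("State", ["MODIFIED", "EXCLUSIVE", "SHARED", "INVALID"])

def read(caches, who):
    if caches[who] == State.INVALID:
        holders = [c for c in caches if c != who and caches[c] != State.INVALID]
        for c in holders:
            caches[c] = State.SHARED   # M/E holders downgrade (M writes back)
        caches[who] = State.SHARED if holders else State.EXCLUSIVE

def write(caches, who):
    for c in caches:
        if c != who:
            caches[c] = State.INVALID  # invalidate every other copy
    caches[who] = State.MODIFIED

caches = {"cpu0": State.INVALID, "cpu1": State.INVALID}
read(caches, "cpu0")    # cpu0 -> EXCLUSIVE
read(caches, "cpu1")    # both -> SHARED
write(caches, "cpu1")   # cpu1 -> MODIFIED, cpu0 -> INVALID
print(caches)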
Title: Re: Layers of Complexity
Post by: Polpolion on June 04, 2012, 10:42:16 am
Quote from: Nuke
say what you want about it, it works. hardware causes fewer headaches than software, i think. [...]

I'd say they're pretty similar, to be honest. I don't need to worry about operating temperature in software, but I need to worry about memory usage (oh wait, you need to worry about that in hardware too). I don't need to worry about self-inductance in software, but I need to worry about parallelism (oh wait). Anyway, the list goes on. What it boils down to is which one you're more comfortable with.

ed: no, actually I'd say that hardware is more headache-prone, just because if you're dealing with general-purpose computer hardware you really need to make sure you're making a viable target for software.
Title: Re: Layers of Complexity
Post by: z64555 on June 04, 2012, 12:12:29 pm
especially the old skool 40-pin dip packages that a lot of keyboards still use for some reason.

It's just like with the 555 timer chip: why design a new chip when you've already got one that works perfectly fine?
Title: Re: Layers of Complexity
Post by: Mongoose on June 04, 2012, 12:25:33 pm
What I take away from this is that, if civilization goes tits-up and we wind up in a post-apocalyptic wasteland, we're all screwed, because no one will be able to figure out how the hell everything used to work.
Title: Re: Layers of Complexity
Post by: Nuke on June 04, 2012, 12:30:15 pm
especially the old skool 40-pin dip packages that a lot of keyboards still use for some reason.

It's just like with the 555 timer chip: why design a new chip when you've already got one that works perfectly fine?

well, my point was: why not use a much smaller/cheaper qfp instead? actually, i think the reason is that a lot of peripherals are still manufactured by manual labor, hence through-hole parts as opposed to smt.
Title: Re: Layers of Complexity
Post by: redsniper on June 04, 2012, 12:58:18 pm
What I take away from this is that, if civilization goes tits-up and we wind up in a post-apocalyptic wasteland, we're all screwed, because no one will be able to figure out how the hell everything used to work.

We figured it out once. We can figure it out again.

Plus now we'll have all kinds of cool stuff lying around to reverse engineer. :D
Title: Re: Layers of Complexity
Post by: Flipside on June 04, 2012, 01:03:35 pm
The thing about modern society is that it works on a 'no man is an island' mentality. Almost all high level tech solutions require several experts in several fields.

There was a book called 'Strata' by Pratchett that was quite interesting with regard to this subject, stating that something like a simple space rocket is built on a huge pyramid of agriculture, mining, chemistry, physics, communication, etc. You need that foundation in place first.

Yes, if we lost everything, we could get it back, but it wouldn't take much less time than it did the first time round.
Title: Re: Layers of Complexity
Post by: The E on June 04, 2012, 01:15:38 pm
Mostly because we would have to rediscover intermediate steps we have long since discarded from active memory while at the same time not losing sight of the eventual goal. There may be a few shortcuts we can take (or rather, dead ends we can avoid). For example, getting back to the steam age and electricity might not take as long as it did the first time around, but rebuilding the infrastructure needed to make integrated circuits will probably not be that much faster.
Title: Re: Layers of Complexity
Post by: Mongoose on June 04, 2012, 02:32:54 pm
Yeah, the jump from 4000 B.C. to 1870 or so presumably wouldn't be all that hard, but we'd have our work cut out for us to move through the 20th century again. :p
Title: Re: Layers of Complexity
Post by: MP-Ryan on June 04, 2012, 03:59:16 pm
Yeah, the jump from 4000 B.C. to 1870 or so presumably wouldn't be all that hard, but we'd have our work cut out for us to move through the 20th century again. :p

I beg to differ =)  The knowledge derived from the advances of the 5870 years you've mentioned, in agriculture, biology, chemistry, physics, and mathematics, is all prerequisite to the advances of the 20th century.  If you wiped out a large chunk of the human race and most of our infrastructure today, there are some major elementary hurdles we'd have to reconquer.
Title: Re: Layers of Complexity
Post by: The E on June 04, 2012, 04:05:20 pm
Yes, but unlike our distant ancestors, we know that certain things are possible already. And personally, I believe that a catastrophe that manages to wipe out such a large portion of the human race AND our printed knowledge base is pretty unlikely.....
Title: Re: Layers of Complexity
Post by: redsniper on June 04, 2012, 04:14:33 pm
Yeah, I didn't mean to say that recovering from some cataclysmic disaster would be a walk in the park or anything. I just think it would go a little easier than the first time around when you actually have examples of what could be done lying around. It's marginally easier to look at a thing and think "how can we build this again?" rather than "how can I invent and build something that has never existed before ever?" :p
Title: Re: Layers of Complexity
Post by: Mongoose on June 04, 2012, 04:22:34 pm
Yeah, the jump from 4000 B.C. to 1870 or so presumably wouldn't be all that hard, but we'd have our work cut out for us to move through the 20th century again. :p

I beg to differ =)  The knowledge derived from the advances of the 5870 years you've mentioned, in agriculture, biology, chemistry, physics, and mathematics, is all prerequisite to the advances of the 20th century.  If you wiped out a large chunk of the human race and most of our infrastructure today, there are some major elementary hurdles we'd have to reconquer.
Well, I was thinking more along the lines of the complexities required to potentially reverse-engineer a machine or process.  Like, I'm a complete mechanical layman, but I at least know the basics of how a simple steam engine works: heat up water, use the steam to push against a piston, do work.  It wouldn't be very sophisticated, but I think a lot of people would be able to jury-rig something basic along those lines.  In the same way, while your average person might not know much about genetics, there are a lot of do-it-yourself gardeners who understand the basics of selecting good crops.  Most people could tell you that boiling water before drinking it kills off germs, or that letting cattle do their business upstream of a town's water supply is a bad idea.  There's a lot of basic knowledge ingrained in people today that would still be there if we had to start over, and I think we'd be able to reconstruct a lot of the basics along those lines without massive difficulty.  It's when you get into the more recent technical advances, things that required many experts in specific fields to develop, that you'd run into problems.
Title: Re: Layers of Complexity
Post by: Nuke on June 04, 2012, 04:40:28 pm
nuclear warheads tend to point at cities and other missile bases; i figure the rural areas will mostly be unaffected by anything, maybe a little fallout. plenty of small towns and agricultural areas will remain to keep essential knowledge going. you don't really need to worry about the stuff that's well documented. for example, there have been people who could produce diy semiconductors. it's the closed, proprietary technologies that will be lost. any tech that is documented from theory to production will probably survive without needing to be recreated from scratch.

there will be a lot of book burning, not to destroy knowledge but to make fire for heat and cooking. there will be some desire to save books deemed essential for immediate survival: stuff on agriculture, trade manuals, **** like that. i very much doubt a manual on c++ will last very long. for that preservation of knowledge to be ongoing, literacy must be maintained, because when it's not, you get indiscriminate book burning.

there will also be a number of working computers left lying around. a computer really isn't essential to survival, so they might be salvaged for parts or just used for their raw materials, like making arrow points out of bits of case metal, or hammering the sheet metal flat with a rock to make building material. surface-mount technology is a ***** when it comes to component-level work, so most of the boards will probably be thrown into rubbish heaps. if computers do manage to survive, it's unlikely they will be used for anything, due to lack of power. of course you can fuel a genny with wood, or build a wind turbine with junk, so it's possible to have a few places where the technology will be preserved.

engine technology will survive for a couple of reasons, the first being that engines can be found everywhere. they will be extremely useful for survival, as they can do work humans cannot and allow for a degree of industrialization, albeit limited by the ability to find fuel (though wood gasifiers are easy to build). what ultimately determines whether a technology is preserved is its usefulness for survival.