Has Little Endian won?
When teaching recently about the Big vs. Little Endian battle, a student asked whether it had been settled, and I realized I didn’t know. Looking at the Wikipedia article, it seems that the most popular current OS/architecture pairs use Little Endian but that Internet Protocol specifies Big Endian for transferring numeric values in packet headers. Would that be a good summary of the current status? Do current network cards or CPUs provide hardware support for switching byte order?
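One way to see both points on a given machine is to print the bytes of an integer directly and compare them with the network (big-endian) order produced by the standard POSIX htonl() conversion. A minimal sketch in C, assuming a POSIX system with <arpa/inet.h>:

```c
#include <stdio.h>
#include <stdint.h>
#include <arpa/inet.h>  /* htonl(): host-to-network byte order, POSIX */

int main(void) {
    uint32_t x = 0x01020304;
    uint8_t *p = (uint8_t *)&x;

    /* On a little-endian machine the least significant byte (0x04) is
       stored first; on a big-endian machine 0x01 comes first. */
    printf("host order:    %02x %02x %02x %02x\n", p[0], p[1], p[2], p[3]);

    uint32_t n = htonl(x);  /* convert to network (big-endian) order */
    p = (uint8_t *)&n;
    printf("network order: %02x %02x %02x %02x\n", p[0], p[1], p[2], p[3]);
    return 0;
}
```

As for hardware support: many CPUs do provide byte-swap instructions (BSWAP on x86, REV on ARM), and compilers typically emit them to implement conversions like htonl on little-endian hosts, so the swap is cheap even though it happens on the CPU rather than the network card.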
How can I tell whether my computer is Harvard or von Neumann architecture?
I understand that the difference between the two architectures is the separation of instructions from data in the Harvard architecture. But how do I know which type of system I’m on? Is it possible to write a program that determines whether the system is von Neumann or Harvard? Could there be another architecture, or are these the only ones known?
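One commonly cited probe is self-generated code: on a von Neumann machine whose OS permits it, bytes written as data can afterwards be executed as instructions, which a strict Harvard machine forbids by construction. A minimal sketch, assuming x86-64 Linux; note that a failure here proves little, since modern OSes often disallow writable-and-executable pages (W^X) even on von Neumann hardware:

```c
/* Rough probe: write machine code into a data buffer, then jump to it.
   Possible on a von Neumann machine (if the OS allows W+X pages);
   impossible by construction on a strict Harvard machine. */
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    /* x86-64 machine code for: mov eax, 42; ret */
    unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

    void *buf = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) { perror("mmap"); return 1; }

    memcpy(buf, code, sizeof code);        /* write instructions as data  */
    int (*fn)(void) = (int (*)(void))buf;  /* ...then execute those bytes */
    printf("executed data as code, got %d\n", fn());
    return 0;
}
```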
How many address bits are required for a computer with n bytes of memory?
How many address bits are required (in the program counter, for example) in a byte-addressed computer with 512 MByte of RAM?
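A worked version of the arithmetic, assuming MByte means $2^{20}$ bytes:

$$512\ \text{MByte} = 2^{9}\cdot 2^{20}\ \text{bytes} = 2^{29}\ \text{bytes} \;\Longrightarrow\; \lceil \log_2 2^{29} \rceil = 29\ \text{address bits}.$$

In general, a byte-addressed machine with $n$ bytes of memory needs $\lceil \log_2 n \rceil$ address bits.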
Instruction vs data cache usage
Say I’ve got a split cache, where instructions and data have separate cache memories (“Harvard architecture”). Which cache, instruction or data, is used most often? By “most often” I mean frequency of access over time, not volume: the data cache might move more bytes in total, while the instruction cache might be accessed more frequently, depending on the program.
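One way to answer this empirically on Linux is to read the hardware cache counters via perf_event_open(2). The sketch below counts L1 data-cache and L1 instruction-cache accesses around a toy workload; which cache/op/result combinations are actually supported varies by CPU, and the call may require relaxing /proc/sys/kernel/perf_event_paranoid, so treat it as an illustration rather than a portable benchmark:

```c
#define _GNU_SOURCE
#include <stdio.h>
#include <stdint.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/syscall.h>
#include <linux/perf_event.h>

/* Open one hardware cache counter for the calling thread. */
static int open_counter(uint64_t cache, uint64_t op, uint64_t result) {
    struct perf_event_attr attr;
    memset(&attr, 0, sizeof attr);
    attr.type = PERF_TYPE_HW_CACHE;
    attr.size = sizeof attr;
    attr.config = cache | (op << 8) | (result << 16);
    attr.disabled = 1;
    attr.exclude_kernel = 1;
    return (int)syscall(SYS_perf_event_open, &attr, 0, -1, -1, 0);
}

int main(void) {
    int d = open_counter(PERF_COUNT_HW_CACHE_L1D,
                         PERF_COUNT_HW_CACHE_OP_READ,
                         PERF_COUNT_HW_CACHE_RESULT_ACCESS);
    int i = open_counter(PERF_COUNT_HW_CACHE_L1I,
                         PERF_COUNT_HW_CACHE_OP_READ,
                         PERF_COUNT_HW_CACHE_RESULT_ACCESS);
    if (d < 0 || i < 0) { perror("perf_event_open"); return 1; }

    ioctl(d, PERF_EVENT_IOC_ENABLE, 0);
    ioctl(i, PERF_EVENT_IOC_ENABLE, 0);

    /* --- the workload being measured: arbitrary data churn --- */
    volatile long sum = 0;
    for (long k = 0; k < 10000000; k++) sum += k;
    /* ---------------------------------------------------------- */

    ioctl(d, PERF_EVENT_IOC_DISABLE, 0);
    ioctl(i, PERF_EVENT_IOC_DISABLE, 0);

    long long dcount = 0, icount = 0;
    read(d, &dcount, sizeof dcount);
    read(i, &icount, sizeof icount);
    printf("L1D accesses: %lld\nL1I accesses: %lld\n", dcount, icount);
    return 0;
}
```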
Why isn’t DSM (distributed shared memory) for unstructured memory used today?
Edit: Comments suggested that DSM simply faded out of use. What were the reasons for this, and what are DSM’s drawbacks? The literature lists many positive aspects, such as programs being easy to port and the absence of marshalling (and therefore higher speed), but also negative aspects, such as being usable only in homogeneous environments because of endianness and word-size issues. So why is all data synchronization done by databases rather than DSM these days? Was there a historical comparison or study at the time when both approaches existed concurrently?
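As a small illustration of the homogeneity problem: even the raw bytes of a trivial C struct are not portable between unlike machines, because field sizes, alignment padding, and byte order are all ABI- and host-specific, and page-granularity DSM would ship exactly these raw bytes around:

```c
#include <stdio.h>

/* A struct whose in-memory image differs across platforms: the size of
   `long` and the padding inserted for alignment are ABI-specific, and
   multi-byte fields are stored in host byte order. Sharing raw pages of
   this between unlike machines yields garbage without marshalling. */
struct record {
    char tag;
    long value;
};

int main(void) {
    printf("sizeof(long)          = %zu\n", sizeof(long));  /* 4 on ILP32, 8 on LP64 */
    printf("sizeof(struct record) = %zu\n", sizeof(struct record));
    return 0;
}
```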
How Do Computers Process Conditional/Input/Event-Based Code? [duplicate]
I understand that computers are basically a complex system of electrical signals that can calculate based on logic boards and some sort of gate mechanism, but how do computers process something like if the number produced by the […]
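The short mechanical answer is that a conditional compiles to a compare instruction that sets condition flags, followed by a conditional branch that the CPU takes or falls through depending on those flags. A tiny C sketch; the assembly in the comment is an illustrative x86-style rendering, not the exact output of any particular compiler:

```c
#include <stdio.h>

int main(void) {
    int x = 7;

    /* Conceptually the CPU executes something like:
         cmp  x, 5     ; subtract and set condition flags
         jle  skip     ; branch over the body if x <= 5
         ...body...
       skip:                                              */
    if (x > 5) {
        puts("x is greater than 5");
    }
    return 0;
}
```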
How are operating systems compiled and booted the very first time?
I’m interested in how operating systems work. I’ve been reading some articles about Linux and seem to understand how it all generally comes together, but I feel like there’s a chicken and egg dilemma when it comes to constructing an operating system.
How does understanding computer architecture help a programmer? [duplicate]
Students need to understand computer architecture in order to structure a program so that it runs more efficiently on a real machine (see the sketch below for a concrete example).
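A classic concrete instance of that point: traversing a two-dimensional C array in row-major order matches how the language lays it out in memory and how caches fetch contiguous lines, while column-major traversal of the same data is typically much slower. A sketch (the size of the gap is machine-dependent):

```c
#include <stdio.h>
#include <time.h>

#define N 4096
static float a[N][N];  /* C stores this row by row */

int main(void) {
    clock_t t0 = clock();
    /* Row-major traversal: consecutive accesses stay within a cache line. */
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            a[i][j] += 1.0f;
    clock_t t1 = clock();

    /* Column-major traversal: each access jumps N*sizeof(float) bytes,
       touching a different cache line almost every time. */
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            a[i][j] += 1.0f;
    clock_t t2 = clock();

    printf("row-major:    %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("column-major: %.3f s\n", (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}
```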
Is there genetic relationship between ARM and PDP-11 architectures?
Reading about the ARM architecture, I found many similarities to the PDP-11 architecture that do not exist between ARM and x86.
Why did Aiken decide to separate data and instructions in the Harvard Mark I?
When Aiken devised the Mark I, why did he decide to separate data and instructions? Neither Wikipedia nor any other source I’ve searched mentions how or why Aiken separated data and instructions.