Interesting detail I didn't know about, and different to my initial interpretation too.
It doesn't seem too unreasonable to call the program counter, memory registers, etc. "special registers" since they have specific functions beyond simply storing data. Therefore I could imagine calling the aggregate of such registers in a CPU a "special register group". Perhaps those still using the term are following this line of thought.
The definition doesn't mention any kind of register other than special register groups: it asserts either that there exist groups of nothing but special registers, or else that there exist nothing but special groups of registers.
"special" is meaningless without a the presence of contrasting subjects that are "ordinary" or "regular".
E.g, "The human hands have two special finger groups".
In that they're not the GPRs (General Purpose Registers) this kinda sorta makes sense. Mostly the problem is that this terminology isn't actually used this way. Notice how the CPU (which we're thinking of as one small component of e.g. a laptop or phone, but which the originators imagined as a furniture-sized piece of equipment responsible for the actual computing) is supposed to be composed of these particular elements but not GPRs, not an FPU, not an MMU, nor any other elements which are in fact typical today.
So this sort of "quiz" is bogus because it's basically checking whether you've rote-memorized some particular words. People who know how a CPU works are confused and likely don't pass, while those who've memorized the words but lack all understanding do pass. This is not a useful education.
This was my thought as well. Any sufficiently complex modern CPU contains some register where it expects a particular bit to be set to enable something like a power-saving mode, with an interrupt mask in the lower bits to turn it off. Or something equally esoteric, but purposeful once you consider the application.
My recollection is that even the Atmega 8-bit microcontrollers have tons of special register groups around timers and interrupts.
Given the relative scarcity of "general purpose" registers on x86 32-bit CPUs, you could actually argue those are the special purpose registers.
> My recollection is that even the Atmega 8-bit microcontrollers have tons of special register groups around timers and interrupts.
Not precisely. AVR, like most embedded architectures, has a bunch of I/O registers which control CPU and peripheral behavior - but those are distinct from the CPU's general-purpose registers. The I/O registers exist in the same address space as main memory, and can only be accessed as memory. You can write an instruction like "increment general-purpose register R3", but you can't use that same syntax for e.g. "increment the UART baud rate register"; you have to load a value from that register to a GPR, increment it there, and store it back to the I/O register.
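To make that concrete, here's a minimal C sketch of the same point, assuming avr-gcc with avr-libc on an ATmega328P-style part (UBRR0L being that part's UART baud-rate low byte; the part choice and register name are just for illustration). The local variable can live in a GPR and be incremented in place, while the I/O register has to round-trip through one:

    #include <avr/io.h>      /* avr-libc register definitions; assumes ATmega328P */
    #include <stdint.h>

    uint8_t bump(uint8_t counter)
    {
        counter++;           /* lives in a GPR: compiles to a single "inc rN" */

        /* There is no "increment UBRR0L" instruction: the compiler emits a
           load of the I/O register into a GPR, an increment there, and a
           store back out. */
        UBRR0L = UBRR0L + 1;

        return counter;
    }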
AVR is a bit weird in that the CPU's general-purpose registers are also mapped to main memory and can be accessed as if they were memory - but that functionality is rarely used in practice.
Getting back to your original point, x86 does have special-purpose registers - lots of them, in fact. They're accessed using the privileged rdmsr/wrmsr instructions.
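For anyone who hasn't poked at MSRs, reading one looks roughly like this. A minimal sketch assuming GCC-style inline assembly and ring-0 execution (in user space the instruction faults); the read_msr wrapper and the choice of IA32_APIC_BASE (index 0x1B) are just for illustration:

    #include <stdint.h>

    /* rdmsr takes the MSR index in ECX and returns the 64-bit value split
       across EDX:EAX. Privileged: must run at CPL 0, e.g. in a kernel
       module, otherwise it raises a general-protection fault. */
    static inline uint64_t read_msr(uint32_t msr)
    {
        uint32_t lo, hi;
        __asm__ volatile("rdmsr" : "=a"(lo), "=d"(hi) : "c"(msr));
        return ((uint64_t)hi << 32) | lo;
    }

    /* Example: uint64_t apic_base = read_msr(0x1B);   IA32_APIC_BASE */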
> The Honeywell 800 allowed eight programs to run on a single processor, switching between programs after every instruction[...]its own separate group of 32 registers (program counter, general-purpose registers, index registers, etc.)
I'm surprised that the analogy to hyperthreading wasn't made here.
Author here if anyone has questions. This was discussed on HN a while ago: https://news.ycombinator.com/item?id=21333245
Did register groups evolve into what would become register files, register windows, and register banks?
The Honeywell's "special register groups" were rather unusual, since the processor switched tasks and registers after every instruction. (This is more commonly known as a barrel processor.) I don't see an evolutionary path to register files, windows, banks, etc, although the concept is a bit similar. Rather, I think the concept of having separate sets of registers developed independently several times.
That hardware multiprogramming concept showed up a few times. The CDC 6600 pretended to have 10 peripheral processors for I/O using that approach. National Semiconductor (I think) once made a part which had two BASIC interpreters on one CPU with two sets of registers.
TI had some IC where the registers were in main memory, with a hardware register to point to them. Context switching didn't require a register save, just pointing the register pointer somewhere else. There were stack machines such as the B5500 where the top items on the stack were in hardware registers but the rest of the stack was in memory. Floating point in the early years of floating point co-processor chips had lots of rather special registers. Getting data in and out of them was sometimes quite slow, so it was desirable to get data into the math co-processor and work on it there without bringing it back into memory if at all possible.
Lots of variations on this theme. There's a lot going on with special-purpose registers today inside superscalar processors, but the programmer can't see most of it.
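The registers-in-memory idea above is easy to sketch in C: all the "registers" are an ordinary block of RAM, the only real CPU state is a pointer to that block, and a context switch is just a pointer assignment rather than a save/restore loop. Purely a conceptual illustration (the names are made up, and this is not TI's actual instruction set):

    #include <stdint.h>

    /* Conceptual sketch of a TMS9900-style "workspace pointer": each task's
       sixteen 16-bit registers are just a block of main memory, and the CPU
       holds only a pointer to the current block. */
    typedef struct {
        uint16_t r[16];               /* the task's "registers", in RAM */
    } workspace_t;

    static workspace_t task_a, task_b;
    static workspace_t *wp = &task_a; /* the one real piece of CPU state */

    /* "Context switch": nothing to save, just repoint the workspace. */
    void switch_to(workspace_t *next) { wp = next; }

    void demo(void)
    {
        wp->r[3] += 1;          /* increments "R3" of the current task */
        switch_to(&task_b);     /* instant switch; task_a's R3 stays in RAM */
        wp->r[3] += 1;          /* now increments task_b's "R3" */
    }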
Are you thinking of TI's TMS9900 processor, which had registers in main memory? This processor was used in TI's TI-99/4 home computer. (Both the processor and the home computer were big failures.)
https://spectrum.ieee.org/the-inside-story-of-texas-instrume...
Yes. I've used a Marinchip 9900, from John Walker's company before Autodesk.
In case anybody else was curious: https://www.fourmilab.ch/documents/marinchip/boards/
Tera/Cray used to make the MTA (multi-threaded architecture) that had 128 register sets so it could run 128 threads in a round-robin fashion. The purpose was to cover the latency of main-memory access.
I wonder if your article will revive the popularity of the phrase. https://books.google.com/ngrams/graph?content=special+regist...
For all the complaints about our corpus being contaminated by AI slop, there's been plenty of human-generated slop being regurgitated over the years as well. Lazy textbook writers copy and paste from older books, lazy test makers quiz on arbitrary phrases from the textbooks, nobody ever does any fact checking to see if any of it makes sense.
My favorite example is the "tongue map" - decades of schoolchildren have been taught that different parts of the tongue are responsible for tasting sweetness and saltiness and so on. This is contrary to everyone's experience, and it turns out to have been a mistranslation of some random foreign journal article on taste buds, but it's stuck around in the primary school curriculum because it's easy to make it into a "fill in the map" activity. As long as the kids can regurgitate what they're told, who cares if anything they're learning is true?
Discussed at the time (of the article):
“Special register groups” invaded computer dictionaries for decades - https://news.ycombinator.com/item?id=21333245 - Oct 2019 (60 comments)
I love weird jargon. There's plenty of it in the terms we normally use, like how, despite ROM being a type of RAM, everyone knows that when you say RAM you're excluding ROM.
One of my favorites is BLDC, which stands for BrushLess Direct Current and is used to describe motors that are actually driven by alternating current at a variable frequency. Not to be confused with VFD, which in this case stands for Variable Frequency Drive and describes how a BLDC actually functions. The two terms are on rare occasions used interchangeably, but VFD is most often used for analog or open-loop controllers, while BLDC is used for VFDs that are digitally controlled. BLDCs are Brushless, not Brush Less, and AC, not DC. I like to think the acronym stands for Bitwise Logic Driven Current, which follows the acronym, correctly describes how it works, and differentiates it from non-BLDC VFDs. Also, while VFD can mean Variable Frequency Drive, it is also used to mean Vacuum Fluorescent Display, and for Lemony Snicket fans it can mean a whole lot more. (https://snicket.fandom.com/wiki/V.F.D._%28disambiguation%29)
Uncommon jargon is even better, especially when it comes out of marketing departments and describes something ordinary. I have a rice cooker that has terms like "micom" and "fuzzy logic" on the labels. They describe microcontrollers and variables, respectively, which are found in all but the simplest electromechanical appliances.
It's normal for companies to trademark their names for common technology, like Nvidia and AMD trademarking their implementations of variable refresh rate as G-Sync and FreeSync, respectively, and some companies are more aggressive with their trademarks than others, such as Intel trademarking their simultaneous multithreading as Hyper-Threading, while AMD just calls it simultaneous multithreading.
Cessna used to have a bunch of random trademarked jargon in their ads in the 70's, from trademarking flaps to back seats.
Your comment reminds me of a couple things. On the topic of RAM, I was recently researching the vintage IBM 650 computer (1953). You could get it with Random Access Memory (RAM), but this RAM was actually a hard disk. I guess IBM considered the disk to be more random access than the rotating drum that provided the 650's main storage. One strange thing is that the disk had three independent read/write arms. Unlike modern disks, the arm could only access one platter at a time and had to physically retract to move to a different platter. Having three independent arms let you overlap reads and seeks.
As far as "fuzzy logic", there's a whole lot of history behind how that ended up in your rice cooker. In the 1990s, fuzzy logic was a big academic research area that was supposed to revolutionize control systems. The basic idea was to extend Boolean logic to support in-between values, and there was a lot of mathematics behind this, with books, journals, conferences, and everything. Fuzzy logic didn't live up to the hype, and mostly disappeared. But fuzzy logic did end up being used in rice cookers, adjusting the temperature based on how cooked the rice was, instead of just being on or off.
[1] https://bitsavers.org/pdf/ibm/650/22-6270-1_RAM.pdf
[2] https://en.wikipedia.org/wiki/Fuzzy_logic
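As a toy illustration of that "in-between truth values" idea (nothing to do with any real rice-cooker firmware; the names here are made up): instead of a thermostat's hard on/off, a rule like "if the rice is undercooked, apply heat" gets a degree of truth between 0 and 1, and the heater power just follows it:

    #include <stdio.h>

    /* Toy fuzzy controller: membership in "undercooked" ramps smoothly from
       1.0 (raw) down to 0.0 (done) instead of flipping at a threshold. */
    static double undercooked(double doneness)        /* doneness in [0, 1] */
    {
        if (doneness <= 0.2) return 1.0;
        if (doneness >= 0.9) return 0.0;
        return (0.9 - doneness) / (0.9 - 0.2);        /* linear ramp between */
    }

    int main(void)
    {
        for (double d = 0.0; d <= 1.0; d += 0.25) {
            double truth = undercooked(d);            /* degree of truth */
            printf("doneness %.2f -> heater %.0f%%\n", d, 100.0 * truth);
        }
        return 0;
    }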
> On the topic of RAM, I was recently researching the vintage IBM 650 computer (1953) ... but this RAM was actually a hard disk.
Actually it was a drum. No moving arms at all, but some tracks had multiple read/write heads to reduce access time.
Knuth's high school science fair project was to write an angular optimizing assembler for the IBM 650, called SOAP III.
Actually, no :-) The 650 had a drum for main storage. The "355 Random Access Memory" was something different, an optional hard drive peripheral, which IBM literally called "RAM" (see the link above). This became the RAMAC system.
Right, you could add a RAMAC disk, but that was a very expensive option. Many universities had IBM 650s, but usually without the RAMAC.
The 650 main memory was a drum; but what IBM called Random Access Memory (and RAM) for this machine was a hard drive. As described in the Manual of Operation linked above. Here are a few quotes:
"Records in the IBM Random Access Memory Unit are stored on the faces of magnetic disks."
"The stored data in the Random Access Memory Unit are read and written by access arms."
"The IBM 355 RAM units provide extemely large storage capacity for data... Up to four RAM units can be attached to the 650 to provide 24,000,000 digits of RAM storage."
The main memory on the other hand: "The 20,000 digits of storage, arranged as 2000 words of memory on the magnetic drum..."
The rice cooker audibly turns the heating element completely on or off with a relay or thermostat or similar. It does have some kind of PWM, but so does a purely electromechanical rice cooker. There are different settings for brown and white rice, as well as different keep-warm levels, so I presume it can electronically modify the set point for the temperature.
I had looked up the history of fuzzy logic when I originally saw the term. Everything I came across was from the mid-60's, during the minicomputer era (e.g. the DEC PDP and IBM System/3). I just did a Google ngram search for it, and it does look like it peaked highest in the mid-90's. (https://books.google.com/ngrams/graph?content=fuzzy+logic&ye...) It was especially redundant then, because by that point 32-bit floating-point processors were already extremely common. It made sense during the 60's to have two-bit values in ladder logic control systems, because minicomputers were prohibitively expensive, but by the 90's control systems were already using personal computer components to emulate old relay logic, while newer control systems were being written in languages that are still common today.
You just sent me down a rabbit hole of mid-90's buzzwords and jargon, and I found something especially great: neuro-fuzzy logic (https://link.springer.com/article/10.1007/s44196-024-00709-z)
There's also quantum logic (https://en.wikipedia.org/wiki/Quantum_logic) and someone wrote a paper relating it with fuzzy logic: (https://link.springer.com/chapter/10.1007/978-3-540-93802-6_...)
Once academia gets involved, you get fractal (https://xkcd.com/1095/) sub-niches of sub-niches, and it goes further off the rails than marketing jargon.
I swear there's a computing buzzword cycle of embedded systems (e.g. fuzzy logic, smart, IoT, etc.), artificial intelligence (e.g. genetic algorithms, machine vision, neural networks, etc.), and quantum (just quantum, it's never been practical enough to have sub-niches). Right now we're on the trailing end of AI and working our way to quantum. There's a tech company called C3 that has changed their name from the original C3 Energy to C3 IoT and now C3 AI. When they change it to C3 Quantum, we'll know we're in the next phase of the buzzword cycle.
> I have a rice cooker that has terms like "micom"
I'm going to guess it's a Japanese rice cooker, because "micom" is a Japanese shortening of "microcomputer", which is what they've been calling microcontrollers. Using it for marketing most certainly dates from the time when computerised control was considered a desirable novel feature, much like when "solid state" was used in English.
When a language gets a word from another language, linguists call it either a loanword or a borrowed word. They're odd terms, because loaning and borrowing imply it'll be given back, which isn't expected when a language adopts a word from a different language.
Japan loves to give words back to English though, with cosplay, anime, emoticon, and now micom all originally being from English, then used in Japanese, and brought back to English from their Japanese form.
Right. For a few years, 3-phase motors used as controlled servos were called "brushless DC" below about 1 HP, and "variable frequency drive" above 1 HP. Now everybody admits they're really all 3-phase synchronous motors.
Can of worms. My understanding is that VFD refers to any control system capable of varying speed and torque, usually through varying the supply frequency and voltage to the coils of an asynchronous AC induction motor. However, it is important to note that a VFD can also control synchronous motors such as BLDC and Permanent Magnet Synchronous Motors (PMSM), although in practice the term is usually applied to control systems for high power industrial AC asynchronous induction motors. It would therefore be incorrect to state "they're really all 3-phase synchronous motors", although some VFD control systems could be seen to emulate synchronous motors with asynchronous motors.
I think it really boils down to being jargon from two different groups, so which term gets used isn't a matter of how the device works as much as it is what the person talking about it does for a living.
Since the jargon we've invented in technology derives from natural language, it often repurposes common terms as terms of art. In my opinion this leads to ambiguity, and I sometimes pine for the abstruse but more precise jargon from classical languages that you can use in medicine (for example).
For example, how many things does "link" mean? "Process"? "Type"? "Local"? It makes people (e.g., non-technical people) think that they understand what I mean when I talk about these things but sometimes they do and sometimes they don't. Sometimes we use it in a colloquial sense, but sometimes we'd like to use it in a strict technical sense. Sometimes we can invent a new, precise term like "hyperlink" or "codec" but as often as not it fails to gain traction ("hyperlink" is outdated).
That's one reason we get a lot of acronyms, too. They're unconversational, but they can at least signal we're talking about something specific and rigorous rather than loose.
Medical jargon (or at least biology jargon) can still conflict with common language. For example: thorn, spine, and prickle all have different meanings in biology, and the term thorn doesn't cover anything native to England, where the word originates and was used in Shakespeare's plays.