Timeline for Is it practically possible to shorten computer Bits?
Current License: CC BY-SA 4.0
17 events
when | what | by | license | comment | |
---|---|---|---|---|---|
Nov 11, 2021 at 0:54 | comment | added | Frank Thomas | How would you express 10101010? Also, note that unless you can create hardware that can process more than a low/high signal (which is where 0 and 1 come from), you would just have to translate that data into binary to process it anyway, which would be a big drag on efficiency at processing time. | |
Nov 10, 2021 at 17:04 | comment | added | Keltari | Here is a bit: 0 - Now I am shortening it to o - see, it's much shorter now. :D | |
Nov 10, 2021 at 16:50 | comment | added | StainlessSteelRat | Data compression uses a form of this. Nothing new. A binary bit takes up 1 bit, but could occupy 32 bits (depending on processor and compiler). | |
Nov 10, 2021 at 16:50 | review | Close votes | | completed Nov 26, 2021 at 3:01 | |
Nov 10, 2021 at 16:21 | history | edited | Parking Master | CC BY-SA 4.0 | deleted 36 characters in body |
Nov 10, 2021 at 16:16 | comment | added | Tetsujin | These 7 bits shall henceforth be known as - up, left, odd, quirk, cheese, gromit & wheelbarrow. | |
Nov 10, 2021 at 16:13 | vote | accept | Parking Master | ||
Nov 10, 2021 at 16:13 | answer | added | Ben Voigt | timeline score: 4 | |
Nov 10, 2021 at 16:06 | comment | added | Parking Master | There's not much to it; I just thought it would be a good idea for other things such as storage, etc. I just came here to ask if it was possible. | |
Nov 10, 2021 at 16:05 | comment | added | Yorik | This is off-topic. I'd suggest it would be better suited to the computer science ("computing") or math stacks, but frankly they will just shut it down as it is cranky. | |
Nov 10, 2021 at 16:04 | comment | added | Parking Master | I did do it, I made it up when I was doing testing in an editor, and shortened it by 2 million bits. | |
Nov 10, 2021 at 16:03 | comment | added | spikey_richie | retrocomputing.stackexchange.com/questions/15512/… No you didn't. | |
Nov 10, 2021 at 16:03 | comment | added | Parking Master | I made it up myself, as a way of shortening 7-digit binary computer bits. But all computers can read is 1s and 0s, so I'm asking if there is a way of shortening bits, or if my idea can be used for something else. | |
Nov 10, 2021 at 16:02 | comment | added | spikey_richie | Reducing 8-bit bytes to 7-bit bytes is hardly original. | |
Nov 10, 2021 at 16:01 | comment | added | Ramhound | Where exactly are you getting the notation you are using? Your original idea is extremely confusing and honestly does not make sense; there is already hexadecimal notation that is widely used. This is coming from a person with multiple college degrees in engineering and computer science. | |
S Nov 10, 2021 at 15:58 | review | First questions | | completed Nov 10, 2021 at 16:16 | |
S Nov 10, 2021 at 15:58 | history | asked | Parking Master | CC BY-SA 4.0 |
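The thread above debates whether trimming 8-bit bytes down to 7 bits saves anything in practice. As an aside not taken from the thread itself, here is a minimal Python sketch of the idea: packing values that each fit in 7 bits tightly across byte boundaries, so eight 7-bit values occupy seven bytes instead of eight. The function names `pack7` and `unpack7` are illustrative, not anything from the original question or answer.

```python
def pack7(values):
    """Pack integers (each 0..127) into bytes using 7 bits apiece."""
    bits = 0       # bit accumulator
    nbits = 0      # number of valid bits currently in the accumulator
    out = bytearray()
    for v in values:
        if not 0 <= v < 128:
            raise ValueError("each value must fit in 7 bits")
        bits = (bits << 7) | v
        nbits += 7
        while nbits >= 8:          # emit full bytes as they become available
            nbits -= 8
            out.append((bits >> nbits) & 0xFF)
    if nbits:                      # pad any leftover bits with zeros
        out.append((bits << (8 - nbits)) & 0xFF)
    return bytes(out)

def unpack7(data, count):
    """Recover `count` 7-bit values previously packed by pack7."""
    bits = 0
    nbits = 0
    out = []
    for byte in data:
        bits = (bits << 8) | byte
        nbits += 8
        while nbits >= 7 and len(out) < count:
            nbits -= 7
            out.append((bits >> nbits) & 0x7F)
    return out

msg = [ord(c) for c in "shortbit"]   # ASCII code points fit in 7 bits
packed = pack7(msg)
print(len(msg), "bytes packed into", len(packed))  # 8 bytes packed into 7
assert unpack7(packed, len(msg)) == msg
```

This is essentially what Ben Voigt's answer and the compression comment are gesturing at: the 12.5% saving is real but only for data already confined to 7-bit symbols (like ASCII), and the shifting/masking at pack and unpack time is exactly the translation overhead Frank Thomas describes.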