Sir, this is a Lemmy.
And/or are forced to use it at work
FOCK FOCK FOCK!
So what games are we talking about in the terminal?
You told me i32 was a default int, which it isn't though, no more than i16 is, right?
But the discussion has become a bit sterile, I feel.
Let's see if it's the new miracle language or not in the coming years :-)
C/C++ compilers. Not FPGA.
Anyway, int is s32 very, very, very often, that's all my point was. C, GPGPU, C#, Java, …
In C/C++ the default int definitely exists, and it is a signed 32-bit in the overwhelming majority of cases, what are you talking about?
Default unsigned?
I think you misunderstood.
Downvote all you want, even things you don't understand lol
In 99% of C/C++ compilers, if you write "int" then it is treated as a signed 32-bit data type. If you want something else you need to specify it. Like unsigned char, for example, is an unsigned 8-bit data type (again, on 99% of C/C++ compilers). Thus int defaults to a signed 32-bit data type, which makes signed 32-bit the default for 'int'. I don't know how to explain that better to a developer. If you don't understand, please do tell.
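Quick sanity check you can run yourself (just a sketch; it prints whatever your compiler actually uses, since the widths are platform-dependent):

```c
#include <stdio.h>
#include <limits.h>

int main(void) {
    /* On most modern platforms: sizeof(int) == 4 and INT_MIN < 0,
       i.e. plain "int" is a signed 32-bit type. */
    printf("sizeof(int) = %zu bytes\n", sizeof(int));
    printf("INT_MIN = %d, INT_MAX = %d\n", INT_MIN, INT_MAX);
    printf("int is %s\n", INT_MIN < 0 ? "signed" : "unsigned");
    return 0;
}
```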
So not default??
For the packing/unpacking, sure, it all depends on where you want to pay the cost.
Unsigned?
I mean the Nintendo DS math library was 16-bit too, it happens, but unsigned? Never heard of it.
Well, where are those unsigned default 'int's I said don't exist, and that everyone seems to think I'm wrong about?
Don't get me wrong, unsigned integers are useful (that's why we hate Java, btw, it has none), but that wasn't really the question.
Also, if you're using like 16-bit ints because you have memory constraints, then you are doing it wrong. All modern compilers can handle any bit width up to the base type's size, so you can have a 3-bit int, another 3-bit int, and two 1-bit ints, nicely packed into a byte. You use the `:3` bit-field syntax.
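Something like this, for example (a minimal sketch with made-up field names; how bit-fields are laid out is implementation-defined, but mainstream compilers will pack these into one byte of payload):

```c
#include <stdio.h>

/* Two 3-bit fields and two 1-bit fields: 8 bits of payload total.
   Layout and padding are implementation-defined; the struct's total
   size also depends on the base type's alignment (often 4 bytes with
   unsigned int; GCC/Clang give 1 byte if you use unsigned char as
   the base type, a common extension). */
struct packed_flags {
    unsigned int a : 3;  /* holds 0..7 */
    unsigned int b : 3;  /* holds 0..7 */
    unsigned int c : 1;  /* holds 0..1 */
    unsigned int d : 1;  /* holds 0..1 */
};

int main(void) {
    struct packed_flags f = { .a = 5, .b = 2, .c = 1, .d = 0 };
    printf("a=%u b=%u c=%u d=%u (sizeof struct = %zu)\n",
           f.a, f.b, f.c, f.d, sizeof f);
    return 0;
}
```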
Duh, I know, it's just that it's now the standard that your 'int' is signed, you know, as I said: 'show me otherwise'. It's also by far most commonly 32 bits.
Thanks!
Is there an "int" keyword that gives you that default int?
I have done low-level dev, putting stuff in WAN frames (576 bytes IIRC) and meddling with Ethernet frames (1500 bytes); now you tell me where on earth you find a not-totally-obscure int that's unsigned by default.
That was my statement you know.
For Rust, it feels like someone fell in love with C++ template metaprogramming (you know, the … variadic templates and all that) and wanted a better language for that. Good for them if they like it.
Fixing an SSAO bug where indices overflowed the 32-bit int on the GPU, I had to use 64 GB. Since then I have never needed more than 32 GB, and at home 24 is way more than I need.
Well, I just remembered, actually I did need more once for an fftv bug (same story, 32-bit overflow), but I borrowed a 192 GB PC for that.
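For the curious, that's the kind of arithmetic that blows up (a sketch with made-up dimensions, not the actual bug):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Hypothetical buffer: 70000 x 70000 elements = 4.9e9 indices,
       which doesn't fit in 32 bits (2^32 is about 4.29e9). */
    uint32_t w = 70000u, h = 70000u;
    uint64_t correct = (uint64_t)w * h;  /* 4900000000 */
    uint32_t wrapped = w * h;            /* wraps modulo 2^32 */
    printf("64-bit index count: %llu\n", (unsigned long long)correct);
    printf("32-bit index count: %u (overflowed)\n", wrapped);
    return 0;
}
```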