Hey guys, been a while...
Quick status update
Moved out of my parents' house
Now a lead backend engineer at a crypto exchange.
Getting offers from startups without even applying, just referrals
Still underpaid by global standards but very comfortable locally.
1+ year of financial stability
Lots of motivation from the lovely people here when I started out. I'm grateful.
-
In my case, the most unrealistic deadline was when I was put on a project for 30 person days in 2008. The project had been running for about 6 months at that point.
I spoke to the project manager about my tasks and she told me to finish the fat client. So I immersed myself in the sources. And I was horrified to realize that not only was it not even a POC, but the performance was lousy to say the least. It took about 70 (sic!) seconds to start the program, read in about 20 records from a database and display them as a hierarchical structure.
I asked the PM when I was supposed to have finished my work, and her response was, "Yesterday."
"Very funny," I replied.
"No, really," she said, "the deadline was yesterday."
It took me an afternoon to speed up the fat client startup to 6 seconds. And then it took us another two weeks or so to identify the processes in discussions with the technical project manager. Because that didn't exist yet either.
About 1.5 years after the deadline, the software system - consisting of the fat client, mainframe modules and purchased software - was stable enough to be rolled out.
-
Client said the images need to be responsive and go full width.
Told her that would make the images massive and that we should cap the height and enable cropping.
She told me it should look nice with full width images uncropped.
Fucking designers said the same fucking thing.
Client today: Hey images are huge. Make sure they look nice.
*sigh*
-
I genuinely don't know what I am doing here. I have no form of social media, so I started using my school laptop and found out this site isn't blocked, so here I am reading random-ass devs yapping and ranting. Honestly love it lmfao.
-
Why is it that companies feel the need to ask, “Why do you wanna work here?” Or “What made you apply?”
Ah, idk, just spitballing here, you’re hiring and I’m unemployed.
I need money, and I heard you pay ppl to do work for you. 🤷🏾♂️
I have bills to pay and u have the money to satisfy that need.
Good enough, or should I keep going?? 🙄
-
My boss keeps touching and poking my colleague's and my laptop screens. Our laptops are both Macs and are not company property. My colleague once told him to stop. He said sorry, but five minutes later he did it again. It's so annoying. Why can't people understand that if it's not a touchscreen, they should mind their manners and not touch it?
-
Current workload as dev lead:
- 1% actual development
- 2.5% waiting for SaaS to load
- 2.5% cursing company server network connectivity issues
- 5% switching VPNs
- 7,5% pkg management & deploys
- 10% writing JIRA and support tickets
- 12,5% filling in timesheets
- 15% coaching & reviewing a bot coworker
- 19% doing 2FA, refreshing expired passwords
- give up and spend the remaining 25% doing something meaningful
-
so Broadcom bought VMWare.... so now whenever you go to any community support page that used to be on VMWare's community, you're just always redirected to Broadcom's support homepage...
another billion-dollar company that has failed to understand the basics of HTTP and DNS
what do i expect...
i don't know, they probably only have like 2 devs, i shouldn't be so hard on them
🤡🤡🤡🤡🤡🤡🤡🤡
-
Fuck you AMD for being too lazy to implement VK_EXT_fragment_shader_interlock even though your hardware supports it [1]
It's literally *the* best way to implement any sort of order independent transparency ( https://web.archive.org/web/... )
But noo, not enough people are using it so too bad. Now you just have to render transparent objects all fucked up and bad looking on AMD hardware because "we don't feel like it"
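For context on why interlock matters here: the "over" compositing operator used for transparency is not commutative, so transparent fragments must be blended in a consistent order per pixel — which is exactly the ordered per-pixel critical section that fragment shader interlock provides. A toy Python model of that non-commutativity (illustrative only, not Vulkan code):

```python
# "Over" compositing with premultiplied alpha: out = src + dst * (1 - src_alpha).
# Blending the same two transparent layers in different orders gives
# different colors -- hence the need for ordering guarantees in OIT.

def over(src, dst):
    """Composite src over dst; colors are (r, g, b, a), premultiplied."""
    sr, sg, sb, sa = src
    dr, dg, db, da = dst
    return (sr + dr * (1 - sa),
            sg + dg * (1 - sa),
            sb + db * (1 - sa),
            sa + da * (1 - sa))

# Two half-transparent layers (premultiplied) over an opaque black background.
red = (0.5, 0.0, 0.0, 0.5)
blue = (0.0, 0.0, 0.5, 0.5)
background = (0.0, 0.0, 0.0, 1.0)

red_over_blue = over(red, over(blue, background))  # red nearest: reddish result
blue_over_red = over(blue, over(red, background))  # blue nearest: bluish result

assert red_over_blue != blue_over_red  # "over" is not commutative
```

Without ordered access to the per-pixel data, two fragments arriving in racey order can blend either way, which is why unordered fallbacks look wrong.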
[1] https://github.com/GPUOpen-Drivers/...
-
As a team lead, what would you do if one of your direct reports sent obscenely bad code for review? Like absolutely nonsensical, non-working, touching wrong parts of the project, doing wrong things… Terrible even by your company's standards.
Would you consider it an instance of stupidity? Tiredness? A resignation letter? An insult? A cry for help? A combination of those things?
-
1. I have to learn German (as a language).
2. I have undiagnosed and subclinical ADHD.
3. I have a job that partially needs my brain for 9 hours of the day.
4. I'm coming off of antidepressants. (Life has been hard lately. Needed a little help to cope.)
5. I need to finish learning German in about 2-3 months.
6. I don't enjoy interacting with people.
Any suggestions for what could help with the goal? Software, web apps, services, etc. Especially good non-violent and non-depressing TV series.
-
Sometimes I just don't know what to say anymore
I'm working on my engine and I really wanna push high triangle counts. I'm doing a pretty cool technique called visibility rendering and it's great because it kind of balances out some known causes of bad performance on GPUs (namely that pixels are always rasterized in quads, which is especially bad for small triangles)
So then I come across this post https://tellusim.com/compute-raster... which shows some fantastic results and just for the fun of it I implement it. Like not optimized or anything just a quick and dirty toy demo to see what sort of performance I can get
... I just don't know what to say. Using actual hardware accelerated rasterization, which GPUs are literally designed to be good at, I render about 37 million triangles in 3.6 ms. Eh, fine but not great. Then I implement this guys unoptimized(!) software rasterizer and I render the same scene in 0.5 ms?!
IT'S LITERALLY A COMPUTE SHADER. I rasterize the triangles manually IN SOFTWARE and write them out with 64-bit atomic image stores. HOW IS THIS FASTER THAN ACTUAL HARDWARE!???
AND BY LIKE AN ORDER OF MAGNITUDE AT THAT???
Like I even tried doing some optimizations like backface cone culling on the meshlets, but doing that makes it slower. HOW. I'm rendering 37 million triangles without ANY fancy tricks. No hi-z depth culling, which a GPU would normally do. No backface culling, which a GPU would normally do. Not even damn clipping of triangles. I render ALL of them ALL the time. At 0.5 ms.
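For anyone curious how a single 64-bit atomic per pixel can replace the depth test: the usual trick (an assumption about the common technique, not necessarily this engine's exact code) is to pack a depth key into the high 32 bits and the color into the low 32, so one atomic max both resolves visibility and stores the winning color. A toy Python model:

```python
# Toy model of software rasterization output via 64-bit atomic max
# (hypothetical sketch of the common packing trick, not real shader code):
# high 32 bits = depth key where larger means nearer (e.g. inverted depth),
# low 32 bits = packed RGBA8 color. The numerically largest packed value
# is then automatically the nearest fragment, color included.

def pack(depth_bits, color_bits):
    """depth_bits: 32-bit depth key (larger = nearer); color_bits: packed RGBA8."""
    return (depth_bits << 32) | color_bits

def atomic_max_store(image, pixel, value):
    """Models a 64-bit imageAtomicMax: keep the larger packed value."""
    image[pixel] = max(image.get(pixel, 0), value)

image = {}  # pixel -> packed 64-bit value

# Three fragments landing on the same pixel, in arbitrary order.
atomic_max_store(image, (10, 10), pack(depth_bits=100, color_bits=0xFF0000FF))  # far, red
atomic_max_store(image, (10, 10), pack(depth_bits=300, color_bits=0x00FF00FF))  # near, green
atomic_max_store(image, (10, 10), pack(depth_bits=200, color_bits=0x0000FFFF))  # mid, blue

winner = image[(10, 10)]
assert winner & 0xFFFFFFFF == 0x00FF00FF  # nearest fragment's color survived
```

Since the atomic resolves races regardless of submission order, no sorting, clipping, or per-fragment depth-buffer round trip is needed — which is part of why these compute rasterizers can be so fast on small triangles.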
“Hey what’s this issue? Can you tell me what’s going on?”
“Yeah sure, what is it?”
[login page alert displaying “your email or password is incorrect, please try again.”]
How do people like this live? How do they not forget how to breathe or eat?
-
Soviet “Altair”: a Nintendo Game&Watch with a built-in Geiger counter.
Right side, top to bottom: game 1, alarm, game 2, reset, time, µSv/hr x 100 µR/h
Top side: Altair
Bottom side: Dosimeter ✻ Watch ✻ Game