DOGE Catch-All

  • Thread starter: nycfan
  • Replies: 461
  • Views: 10K
  • Politics 
As someone who paid his way through law school as a professional database programmer, I can tell you with 100% certainty that this is normal.
It may have been normal in the earliest days of computer programming. Legacy programming languages are the reason we had to frantically prepare for the year 2000. By no measure is it normal in the year 2025.
 
There is something wrong with it. That's why the media is having to explain the inaccurate data by talking about an archaic programming language.

There will be no massive cuts to social security, Medicaid, etc.
If you are going to claim there is something wrong with the database, then tell me what is going wrong, with actual proof.
Are people who should get paid not getting paid? Are people being paid who shouldn't be? That is the question. If the answer is pretty much no, then there is nothing wrong with the system.
 
If you are going to claim there is something wrong with the database, then tell me what is going wrong, with actual proof.
Are people who should get paid not getting paid? Are people being paid who shouldn't be? That is the question. If the answer is pretty much no, then there is nothing wrong with the system.
Two separate topics. It can be true that the right people are getting paid and, at the same time, it can be true that the system/data/reporting has something wrong with it. When your system is reporting things that are impossible by today's medical standards, that would seem to be a problem.
 
It may have been normal in the earliest days of computer programming. Legacy programming languages are the reason we had to frantically prepare for the year 2000. By no measure is it normal in the year 2025.
1. It's not just legacy systems. The systems I designed had unused flags. The systems others designed had unused flags. My friend has been working with databases his whole life; he sees unused flags.

2. I thought I explained this already, but I'll take another stab. I'll use an analogy to help you out.

A. Suppose you're buying some land and you're planning to turn it into a farm. It costs money to acquire the land, erect fencing, get permits, etc. So, at the outset, you want to estimate the amount of land you think you will be farming and buy at least that much. But you will also want to buy a bit extra to allow yourself some expansion opportunities, because it's cheaper to buy a bit of extra land than to replace your fencing if you need more room.

In databases, the same principle is followed. You want to allocate the disk and memory space at the outset, and you give yourself a buffer in case you need more. Often these buffers take the form of unused data columns that might be used later -- or, as you've undoubtedly seen the phrase, "reserved for future use."
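
To make that concrete, here's a rough sketch in Python against SQLite. The table and the "reserved" columns are entirely made up for illustration, not anything from a real agency system:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical benefits table: the reserved_* columns are allocated up front
# but left empty -- the database equivalent of buying a bit of extra farmland.
conn.execute("""
    CREATE TABLE beneficiary (
        id             INTEGER PRIMARY KEY,
        full_name      TEXT NOT NULL,
        birth_offset   INTEGER,   -- days since some reference date
        status_flag    TEXT,
        reserved_flag1 TEXT,      -- "reserved for future use"
        reserved_flag2 TEXT,
        reserved_num1  INTEGER
    )
""")
conn.execute(
    "INSERT INTO beneficiary (full_name, birth_offset, status_flag) VALUES (?, ?, ?)",
    ("Jane Doe", 29000, "A"),
)

# The reserved columns simply stay NULL unless they're ever needed.
print(conn.execute("SELECT * FROM beneficiary").fetchone())
```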

B. Now, suppose on your farm, there's a pocket of land that isn't useful. Let's say you planted trees for tree fruit, but the market for tree fruit has collapsed in your area. Maybe the climate has changed and the fruit is now low quality. Anyway, you've got this hole in your farm. Do you: a) leave everything where it is? or b) uproot all of your other crops and "slide them down" to make sure that there isn't a gap in there?

In databases, it's the same. If you find that a column isn't helpful for whatever reason, you don't eliminate it. Eliminating a column from a database is extremely inefficient. In some database systems there isn't even a command to do it: you have to create a new table without the column, copy all the records from the original table into it, delete the original table, and then rename the new table to take the original's place. Nobody does this, because it makes no sense. It's a huge amount of work that risks errors, when there's nothing wrong with having a database column go unused.
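
If you want to see how clunky that is, here's the whole dance sketched in Python against SQLite, which until version 3.35 had no DROP COLUMN command at all (the table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, amount INTEGER, obsolete_flag TEXT)")
conn.execute("INSERT INTO payments (amount, obsolete_flag) VALUES (4000, 'X')")

# "Dropping" obsolete_flag the long way:
# 1. create a new table without the column
conn.execute("CREATE TABLE payments_new (id INTEGER PRIMARY KEY, amount INTEGER)")
# 2. copy every record across
conn.execute("INSERT INTO payments_new (id, amount) SELECT id, amount FROM payments")
# 3. delete the original table
conn.execute("DROP TABLE payments")
# 4. rename the new table to take the original's place
conn.execute("ALTER TABLE payments_new RENAME TO payments")

print(conn.execute("SELECT * FROM payments").fetchall())   # [(1, 4000)]
```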

Does this make sense to you? Or do you need me to explain it like I did my ten-year-old when I started teaching him databases last year?
 
Forget about databases, super (with all due respect): it is impossible for millions of people to still be receiving SS payments after they are dead. Trump is just looking for an excuse to dismantle the main basis of our social safety net.
Of course that's true. But I think there's another issue, which is the enthusiasm with which some people accept nonsense about how terrible and inefficient the government is.

These people think that the presence of unused database columns is evidence of incompetence. It's worth at least presenting to them the knowledge that it isn't incompetence at all, that you could go into any database in any big organization and find loads of unused columns. When I was working with SAP, we used maybe 1/3 of the data columns they provided.

Sure, they will make up bullshit excuses and invent new principles or distinctions despite knowing zero about the issue, but at least present the information to them. Maybe in some other forum or area of their life, someone will be talking about this and they can take the chance to show off their knowledge by correcting the person and explaining the issue.
 
Two separate topics. It can be true that the right people are getting paid and, at the same time, it can be true that the system/data/reporting has something wrong with it. When your system is reporting things that are impossible by today's medical standards, that would seem to be a problem.
It isn’t reporting these things. It is Elon reporting these things. Because he doesn’t understand it, or is lying.
 
It isn’t reporting these things. It is Elon reporting these things. Because he doesn’t understand it, or is lying.
It is reporting these things because SS's backend is written in a programming language that doesn't have a native date type.

Musk first made the claims during his Oval Office press conference last week, when he claimed that a “cursory examination of Social Security, and we got people in there that are 150 years old. Now, do you know anyone that's 150? I don't know. They should be in the Guinness Book of World Records … So that's a case where I think they're probably dead.”

While no evidence was produced to back up this claim, it was picked up by right-wing commentators online, primarily on Musk’s own X platform, as well as being reported credulously by pro-Trump media outlets.

Computer programmers quickly claimed that the 150 figure was not evidence of fraud but rather the result of a weird quirk of the Social Security Administration’s benefits system, which was largely written in COBOL, a 60-year-old programming language that undergirds SSA’s databases as well as systems from many other US government agencies.

COBOL is rarely used today, and as such, Musk’s cadre of young engineers may well be unfamiliar with it.

Because COBOL does not have a date type, some implementations rely instead on a system whereby all dates are coded to a reference point. The most commonly used is May 20, 1875, as this was the date of an international standards-setting conference held in Paris, known as the Convention du Mètre.
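
To make the "reference point" idea concrete, here's a toy Python sketch. The 1875-05-20 epoch comes from the article above; the field layout and function names are purely illustrative, not SSA's actual code. The point is that a record whose date field was never populated decodes to the epoch itself, which in 2025 reads as roughly 150 years old:

```python
from datetime import date, timedelta

# Reference date cited in the article above (used here purely for illustration).
EPOCH = date(1875, 5, 20)

def decode_birth_date(days_since_epoch):
    # Dates are stored as an integer offset from the reference point;
    # a missing/zero value decodes to the reference date itself.
    return EPOCH + timedelta(days=days_since_epoch or 0)

def age_in_years(birth, as_of=date(2025, 2, 20)):
    return (as_of - birth).days // 365

print(age_in_years(decode_birth_date(40_000)))   # ~40: a plausible beneficiary
print(age_in_years(decode_birth_date(None)))     # ~149: the "150-year-old", i.e. an empty date field
```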

 
It is reporting these things because SS's backend is written in a programming language that doesn't have a native date type.

Musk first made the claims during his Oval Office press conference last week, when he claimed that a “cursory examination of Social Security, and we got people in there that are 150 years old. Now, do you know anyone that's 150? I don't know. They should be in the Guinness Book of World Records … So that's a case where I think they're probably dead.”

While no evidence was produced to back up this claim, it was picked up by right-wing commentators online, primarily on Musk’s own X platform, as well as being reported credulously by pro-Trump media outlets.

Computer programmers quickly claimed that the 150 figure was not evidence of fraud but rather the result of a weird quirk of the Social Security Administration’s benefits system, which was largely written in COBOL, a 60-year-old programming language that undergirds SSA’s databases as well as systems from many other US government agencies.

COBOL is rarely used today, and as such, Musk’s cadre of young engineers may well be unfamiliar with it.

Because COBOL does not have a date type, some implementations rely instead on a system whereby all dates are coded to a reference point. The most commonly used is May 20, 1875, as this was the date of an international standards-setting conference held in Paris, known as the Convention du Mètre.

Your first sentence that you copied and pasted says exactly what I am saying…”Musk first made these claims…”

The only reason we are having this conversation is because Musk once again misrepresented the data, either because he didn’t understand it, didn’t care to understand it because that isn’t really the goal, or lied.
They originally used this so-called “quirk” to imply (and Trump and others later stated directly) that there is massive fraud happening with Social Security. They are now using these claims to threaten cuts.

And all your bolded text doesn’t show there is anything wrong or broken in the system. Just that it is a legacy system that has been around a long time.
 
1. It's not just legacy systems. The systems I designed had unused flags. The systems others designed had unused flags. My friend has been working with databases his whole life; he sees unused flags.

2. I thought I explained this already, but I'll take another stab. I'll use an analogy to help you out.

A. Suppose you're buying some land and you're planning to turn it into a farm. It costs money to acquire the land, erect fencing, get permits, etc. So, at the outset, you want to estimate the amount of land you think you will be farming and buy at least that much. But you will also want to buy a bit extra to allow yourself some expansion opportunities, because it's cheaper to buy a bit of extra land than to replace your fencing if you need more room.

In databases, the same principle is followed. You want to allocate the disk and memory space at the outset, and you give yourself a buffer in case you need more. Often these buffers take the form of unused data columns that might be used later -- or, as you've undoubtedly seen the phrase, "reserved for future use."

B. Now, suppose on your farm, there's a pocket of land that isn't useful. Let's say you planted trees for tree fruit, but the market for tree fruit has collapsed in your area. Maybe the climate has changed and the fruit is now low quality. Anyway, you've got this hole in your farm. Do you: a) leave everything where it is? or b) uproot all of your other crops and "slide them down" to make sure that there isn't a gap in there?

In databases, it's the same. If you find that a column isn't helpful for whatever reason, you don't eliminate it. Eliminating a column from a database is extremely inefficient. In some database systems there isn't even a command to do it: you have to create a new table without the column, copy all the records from the original table into it, delete the original table, and then rename the new table to take the original's place. Nobody does this, because it makes no sense. It's a huge amount of work that risks errors, when there's nothing wrong with having a database column go unused.

Does this make sense to you? Or do you need me to explain it like I did my ten-year-old when I started teaching him databases last year?
"Does this make sense to you? Or do you need me to explain it like I did my ten year old when I started teaching him databases last year?"

I see you graduated from the @finesse College of Being a Prick.

[GIF: Liam Hemsworth, Workaholics, via Comedy Central]
 
Your first sentence that you copied and pasted says exactly what I am saying…”Musk first made these claims…”

The only reason we are having this conversation is because Musk once again misrepresented the data, either because he didn’t understand it, didn’t care to understand it because that isn’t really the goal, or lied.
They originally used this so-called “quirk” to imply (and Trump and others later stated directly) that there is massive fraud happening with Social Security. They are now using these claims to threaten cuts.

And all your bolded text doesn’t show there is anything wrong or broken in the system. Just that it is a legacy system that has been around a long time.
Again, what Trump or Elon said is beside the point. There is something wrong with their system regardless of whether or not every person is correctly being paid.

If my employer's backend system had a flaw that showed an extra zero at the end of each payroll payment ($4,000 displayed as $40,000), but each person was actually paid the correct $4,000, that would not negate the clear issue with the system/data/reporting.

When your system shows you paying money to tens of millions of dead people, that is, in itself, an issue.
 
"Does this make sense to you? Or do you need me to explain it like I did my ten year old when I started teaching him databases last year?"

I see you graduated from the @finesse College of Being a Prick.

[GIF: Liam Hemsworth, Workaholics, via Comedy Central]
But seriously. It has been explained. Then you just repeated the same nonsense claim. Here's an idea: stop talking shit you know nothing about.

I won't apologize for calling out and correcting ignorance, misinformation and/or untruth. In fact, I'm quite proud of my ability to contribute to our collective understanding of the world around us. I don't play favorites: hell, sometimes I even have to correct myself because I make mistakes.

Anyway, regardless of my style, does the explanation make sense to you?
 
It is reporting these things because SS's backend is written in a programming language that doesn't have a native date type.
Because COBOL does not have a date type, some implementations rely instead on a system whereby all dates are coded to a reference point. The most commonly used is May 20, 1875, as this was the date of an international standards-setting conference held in Paris, known as the Convention du Mètre.

C also does not have a built-in date type. Neither does Python, outside its standard-library datetime module. Programming languages that do provide date types implement them the same way: as integers that get converted to a human-readable representation by the code.

Recording dates and times as the number of days/hours/seconds/milliseconds after some recognized reference time is the most efficient way of doing it. By far.
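
For anyone who wants to see what that means in practice, here's a quick Python illustration (nothing SSA-specific, just the general idea): what gets stored and compared is a plain integer, and the human-readable date is a presentation the code layers on top.

```python
import time
from datetime import datetime, timezone

# What actually gets stored: a plain integer -- here, seconds since the
# Unix epoch (1970-01-01 UTC), the reference point most systems use today.
raw = int(time.time())
print(raw)

# The "date" is just a different visual representation of that same integer.
print(datetime.fromtimestamp(raw, tz=timezone.utc).isoformat())

# Comparing and sorting dates then becomes ordinary integer arithmetic,
# which is where the efficiency comes from.
one_week_later = raw + 7 * 24 * 60 * 60
print(one_week_later > raw)   # True
```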
 
But seriously. It has been explained. Then you just repeated the same nonsense claim. Here's an idea: stop talking shit you know nothing about.

I won't apologize for calling out and correcting ignorance, misinformation and/or untruth. In fact, I'm quite proud of my ability to contribute to our collective understanding of the world around us. I don't play favorites: hell, sometimes I even have to correct myself because I make mistakes.

Anyway, regardless of my style, does the explanation make sense to you?
[GIF]
 