A variable _NULL was set to the value null in JavaScript. All null checks were then done against the _NULL variable, for reasons unknown.
I was investigating a server-side crash that occurred when users didn't fully fill out some data. The text logs I added (the only means of debugging in this case) indicated that a particular unset value was null, as expected. After being briefly baffled, I realized that what was being printed was the string "null" rather than the value null. Somewhere in the pipeline, null was being converted to the string "null" automatically. It's unclear to me whether this feature had ever worked, but I was advised to just check for "null" as well as null and move on.
While I've only noticed this once, I wouldn't be surprised if I've come across it more than once without realizing. A set of unit tests compared results against a dummy dictionary in Python, like so:
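The original snippet wasn't preserved, so here is a minimal reconstruction of the pattern (names are hypothetical). Iterating over a dict in Python yields its keys, not its values, so the "comparison" below only ever checks that the two dicts share key names:

```python
def naive_compare(expected, actual):
    """Buggy test helper: iterating a dict yields KEYS, not values,
    so this only verifies that both dicts have the same key names."""
    for e, a in zip(expected, actual):
        if e != a:
            return False
    return True


def correct_compare(expected, actual):
    """Fixed version: items() yields (key, value) pairs,
    so the values are actually checked."""
    return all(
        ek == ak and ev == av
        for (ek, ev), (ak, av) in zip(sorted(expected.items()),
                                      sorted(actual.items()))
    )


expected = {"name": "Alice", "age": 30}
actual = {"name": "Bob", "age": -1}  # wrong values, but the same keys

assert naive_compare(expected, actual)        # passes despite wrong values!
assert not correct_compare(expected, actual)  # the real check fails, as it should
```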
A seemingly straightforward test, except that iterating over a dict in Python (and in a bunch of other languages) yields keys unless you explicitly ask for (key, value) pairs via items(), so the comparison checks keys against keys and always matches. The actual data may as well be null; this test will always* pass.
* as long as it actually reaches the assert part