This is definitely a case of overload: transmitting too close to the gateway is creating phantom spurious signals. The facts that the packets overlap in time, that one is absurdly strong, and that the other is much weaker are all telling.
You typically want even the strongest RSSIs to be -40 dBm or lower; around -60 dBm is about the strongest you’d see in a typical real deployment.
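If you want to flag this condition in your logs, a trivial filter over the gateway’s uplink metadata is enough. The sketch below assumes the Semtech UDP packet forwarder’s rxpk JSON field names (freq in MHz, rssi in dBm); the threshold and the sample records are illustrative, not taken from this thread:

```python
# Flag uplinks whose RSSI suggests the gateway front end may be overloaded.
OVERLOAD_SUSPECT_DBM = -40  # stronger than this is rarely legitimate in a real deployment

def overload_suspects(rxpks):
    return [p for p in rxpks if p["rssi"] > OVERLOAD_SUSPECT_DBM]

# Illustrative records, not from this thread:
rxpks = [
    {"freq": 868.1, "rssi": -19},   # absurdly strong: node right next to the gateway
    {"freq": 867.1, "rssi": -93},   # the weak phantom copy
    {"freq": 868.3, "rssi": -97},   # a normal, distant node
]
for p in overload_suspects(rxpks):
    print(f"overload suspect: {p['freq']} MHz at {p['rssi']} dBm")
```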
All physically realizable radio components can be characterized by a maximum signal power they can handle before they suffer more than some specified tiny degree of distortion. In an RF system, “distortion” typically shows up as spurious fake signals that never existed on the air, but were invented by the non-linear distortion of a signal chain overdriven by too strong an input.
In the architecture of a typical LoRaWAN gateway, the antenna input is fed to two front-end radio chips that function as downconverters for receive. For the 868 MHz band, these are typically tuned to 867.5 MHz and 868.5 MHz. Each converts a useful range of about ±500 kHz around its center frequency down to an intermediate frequency, which is fed to a separate input of the single baseband processing chip. As this is an IQ design, the actual IF is 0 Hz, with signals falling anywhere from -500 kHz to +500 kHz.
For your specific example of a real signal at 868.1 MHz and a phantom at 867.1 MHz: in the typical global_config.json for 868 MHz, each of these channels sits at a -400 kHz offset from an IF center, but from the centers of the two different radio chips. So likely what is happening is that the absurdly strong IF signal from one radio into the baseband chip is leaking into the IF input from the other radio, making the baseband think that both radios are seeing a signal at that offset from their respective center frequencies. The coupling might be in the baseband chip itself, through free space, in the board’s power supply, or wherever - but it’s entirely expected when the signal level is so absurdly high.
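To make the frequency plan concrete, here is a sketch of the channel-to-radio mapping. The numbers match the common Semtech packet-forwarder default global_config.json for EU868 (radio_0 at 867.5 MHz, radio_1 at 868.5 MHz), but check your own file, since deployments do vary:

```python
# Typical EU868 frequency plan from a stock global_config.json:
# two radio (downconverter) center frequencies, and per-channel
# (radio, IF-offset) assignments feeding the baseband chip.
RADIO_FREQ_HZ = {0: 867_500_000, 1: 868_500_000}

CHANNELS = {  # chan_multiSF_n: (radio, if_offset_hz)
    0: (1, -400_000),  # 868.1 MHz
    1: (1, -200_000),  # 868.3 MHz
    2: (1,        0),  # 868.5 MHz
    3: (0, -400_000),  # 867.1 MHz
    4: (0, -200_000),  # 867.3 MHz
    5: (0,        0),  # 867.5 MHz
    6: (0,  200_000),  # 867.7 MHz
    7: (0,  400_000),  # 867.9 MHz
}

for chan, (radio, if_off) in CHANNELS.items():
    rf = RADIO_FREQ_HZ[radio] + if_off
    print(f"chan {chan}: radio {radio} {if_off/1e3:+5.0f} kHz -> {rf/1e6:.1f} MHz")

# Note: 868.1 (chan 0) and 867.1 (chan 3) are both at -400 kHz, just from
# different radios - so strong IF leakage from one radio's path into the
# other's shows up as a phantom at the same offset, i.e. exactly 1 MHz away.
```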
That’s actually not what I expected to find before I traced this particular case through the global_config.json. What I expected was something like the case in some US configs where there’s a channel defined at +100 kHz from an IF center and another at +300 kHz. If you imagine a sine wave overdriving an amplifier, what starts to happen is that near the peaks the amplifier goes into “compression”, slightly flattening them. And changing the shape of a sine wave causes harmonic distortion, especially at the 3rd multiple of the actual frequency. So a real signal at +100 suffering compression might start to show up as a weak spurious signal at +300 (the LoRa modulation would also be three times as wide, but it might still fool the processing). There are also ways that higher multiples can “fold back” (alias) across the sample frequency into the passband.
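Here is a minimal numpy sketch of that compression mechanism - a real tone pushed through a memoryless soft limiter, nothing gateway-specific. The clipped sine grows odd harmonics, most visibly at three times the tone frequency:

```python
import numpy as np

fs = 4_000_000                  # sample rate, Hz
n = 1 << 16
t = np.arange(n) / fs
f0 = 100e3                      # a real tone, e.g. +100 kHz from an IF center

clean = np.sin(2 * np.pi * f0 * t)
compressed = np.tanh(3 * clean)           # overdriven amplifier: peaks flatten out

spectrum = np.abs(np.fft.rfft(compressed * np.hanning(n))) / n
freqs = np.fft.rfftfreq(n, 1 / fs)

for f in (f0, 3 * f0, 5 * f0):            # a symmetric limiter produces odd harmonics
    level_db = 20 * np.log10(spectrum[np.argmin(np.abs(freqs - f))])
    print(f"{f / 1e3:5.0f} kHz: {level_db:6.1f} dB")
```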
Anyway, figuring out the exact mechanism of the distortion isn’t the important point - what matters is simply understanding that, like all radios, gateways have a maximum signal level they can see before they start behaving oddly. Not only can you get spurious signals, you can also get corruption of the real signals. Additionally, even when not creating fake signals, a node a few tens of meters away can still manage to “blank out” all channels on a gateway whenever it transmits, preventing signals from other, more distant nodes from being received, even when those are on distinct frequencies that should not interfere. And a gateway’s moderate ability to receive two signals with different spreading factors on the same frequency at the same time only works when the difference in their signal levels is moderate.
Keep the nodes far enough from the gateways not to overload them.
If you have to test closer, replace the node or gateway antenna with an appropriate non-inductive resistor / dummy load (never operate a potential transmitter with its antenna port left open).
Just for completeness, two other points which have nothing to do with what is actually going on here:
Although NbTrans is communicated in the LinkADRReq MAC command, that is a multi-purpose command used for far more than ADR. In particular, many network servers will have to send it even to non-ADR nodes in order to configure the channel map (or the NbTrans, if they wanted to). Unfortunately, all three things - channel map enables, datarate and power level, and NbTrans - are merged into a single MAC command, and it’s only possible to send all of them at once, even if the server doesn’t want to change, say, the node’s power. But the case described in this thread doesn’t involve by-the-spec LoRaWAN at all, so that’s all irrelevant.
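For reference, here is roughly how that merged payload packs together under LoRaWAN 1.0.x - a sketch for illustration; consult the spec for the regional ChMaskCntl semantics:

```python
def link_adr_req(data_rate: int, tx_power: int,
                 ch_mask: int, ch_mask_cntl: int, nb_trans: int) -> bytes:
    """Pack a LoRaWAN 1.0.x LinkADRReq (CID 0x03): datarate/power,
    channel mask, and NbTrans are all forced into one 4-byte payload."""
    return bytes([
        0x03,                                       # CID: LinkADRReq
        (data_rate & 0xF) << 4 | (tx_power & 0xF),  # DataRate (7:4) | TXPower (3:0)
        ch_mask & 0xFF, (ch_mask >> 8) & 0xFF,      # ChMask, little-endian
        (ch_mask_cntl & 0x7) << 4 | (nb_trans & 0xF),  # Redundancy byte
    ])

# e.g. enable channels 0-7, DR5, TXPower index 1, NbTrans 1:
print(link_adr_req(5, 1, 0b0000_0000_1111_1111, 0, 1).hex())
```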
Frequency error in the node’s radio is not likely to be an issue here. LoRa is a fairly wideband modulation, so the exact center frequency isn’t all that critical, and the spacing between even adjacent channel frequencies (which these are not) is wider than any reasonably expected frequency error. It would theoretically be possible to have close-in spurious outputs from a transmitter using an upconversion topology, however. If one looks at an FCC or similar test report, these may be visible, as they are regulated. But regulations typically only require that spurious outputs be a certain degree weaker than the intended signal, and the difference in signal levels seen here probably meets that easily. Spurious outputs weak enough to meet the regulatory rules wouldn’t typically show up at a gateway in a deployed system, because (to save battery power when transmitting, and to avoid blanking more distant nodes) the power of the main signal should be adjusted by the installer or by ADR so that the intended signal is received only at a level sufficient for confidence, rather than at the absurdly strong overload level seen here.
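To put numbers on the “frequency error isn’t the issue” point: even a cheap ±20 ppm crystal (an assumed worst case; check your node’s datasheet) only shifts an 868 MHz carrier by tens of kHz, well inside a 125 kHz LoRa channel’s tolerance and nowhere near the 1 MHz separation seen here:

```python
F_CARRIER_HZ = 868.1e6
XTAL_PPM = 20                    # assumed worst-case crystal tolerance
LORA_BW_HZ = 125e3               # typical EU868 uplink bandwidth
ADJACENT_SPACING_HZ = 200e3      # e.g. 868.1 -> 868.3
PHANTOM_OFFSET_HZ = 1e6          # 868.1 -> 867.1, the case in this thread

err_hz = F_CARRIER_HZ * XTAL_PPM * 1e-6
print(f"max crystal error:         +/-{err_hz / 1e3:.1f} kHz")
print(f"vs. LoRa bandwidth:        {LORA_BW_HZ / 1e3:.0f} kHz")
print(f"vs. adjacent channel step: {ADJACENT_SPACING_HZ / 1e3:.0f} kHz")
print(f"vs. this phantom's offset: {PHANTOM_OFFSET_HZ / 1e3:.0f} kHz")
```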