Detect and Locate GPS Jammers using a radar gun!

Can’t hide any more. GPS Jammer Locator and Detector! Image: Chronos Technology

The most infamous GPS jamming event in the history of GPS has to be the San Diego Airport disruption. A single event brought the flight control room at San Diego airport to its feet, panicking and wondering what was really going on. ATM machines refused customers, and the harbor traffic management system went haywire. All of this because of a GPS jamming event: a clear indication that the GPS system does not just run the sat nav in your car, it does a lot more than that.

GPS is, in a sense, a silent force that powers the modern communication world. Mobile network service providers use GPS time signals to coordinate how your phone talks to the cell towers. Electricity grids turn to GPS for synchronization when they are interconnected. Banks and stock exchanges use GPS/GNSS for time-stamping transactions, without which commerce would be rendered impossible.

The cause of the GPS jamming was eventually identified after three days of investigation: a navy exercise to test procedures when communications were down. The technicians had also jammed GPS signals unintentionally. Unfortunately, jamming capability is not confined to the navy or the military. In another infamous case, a truck driver was using a GPS jammer near an airport to avoid being tracked! GPS jamming devices are available online for under $30, yet buying or using one is illegal in only a few countries. Many across the world have not yet realized the danger and disruption that these devices can cause.

So isn’t there anything that can be done to find the Jammers?

Actually, until now it has been possible to detect the presence of a GPS jammer but not to locate it. Enter Chronos! The company claims that its £1,600 GPS Jammer Detector and Locator System can identify a jammer-using vehicle in a multi-storey car park, and can pinpoint portable devices in drivers' pockets after they have left their cars. Pretty impressive! Of course, they didn't explain how the system works, but some people have guessed that it must rely on signal triangulation of some sort. Sadly, it only works for L1 signals; I suppose they are working on covering L2 signals as well. Hope so!

Detect and Locate GPS Jammers – No more hiding!


…Your machine can think like you and work faster than you

For the last few weeks, I have been juggling a huge amount of spatial data (temporal extent: 20 years of weekly data; spatial extent: the whole world) from different sources (e.g., ESA, DLR, VITO, USGS, USDA), delivered in different formats (e.g., *.img, *.tif, *.txt, *.ascii, GRID) and in different projections. It was a complete mess. I was supposed to process all these data by bringing everything into one common format and projection system, calculate the long-term average and deliver the averaged data in spreadsheet format per administrative unit. Oops! That was a nightmare.

What I tried on the way to a solution:

1. First, I tried both commercial and open-source desktop GIS software (e.g., ArcGIS, QGIS). Trust me, that was 'THE' dumbest idea I have had for this kind of processing.

2. Secondly, I tried a Java library (GeoTools). That wasn't bad, but I still had to face the problem of converting the different formats to GeoTIFF and, of course, the usual projection and transformation problems (some providers always have a user-defined projection system, and that is something you end up spending more time on than you might expect).

3. Thirdly, I tried Python with the GDAL library. That was better, but not the best, because there were still some problems, such as a shortage of built-in functions (e.g., for dealing with the file system and network security, complex file naming conventions, etc.).

4. Fourthly, I tried to develop a procedure using Oracle Spatial inside the Oracle database together with Geomatica. That is effective and convenient, but not the fastest, because Geomatica has its own processing format (*.pix) and converting to it is time-consuming.

5. Finally, I found THE best solution, which is the simplest (about 70 lines of code), the fastest (the processing took only 15 minutes), convenient (you can save the *.sh file and run it any time on any of the data mentioned), time-saving (it stops you re-doing the task over and over again) and effective (my boss got exactly what he wanted, before the deadline…). Shell scripting is the solution: write a few dozen lines of basic code and it will do the whole task described above for you.


Shell Scripting for Geospatial Data Processing/Analysis:

Let me give you some examples of what I used to solve the above problem. Do you want to convert all the different formats and projections into one common format and projection, or clip a raster based on a *.shp file or a defined extent? Use the GDAL utilities inside the shell script (gdalwarp and gdal_translate); these utilities are really awesome because they take proper care of null/NoData values in the raster analysis and computation. Then, if you need to retrieve an averaged pixel value per boundary, one solution is the 'gdalinfo' utility: it can write an *.xml/*.info file with the necessary statistics, and you can then simply use the 'grep' or 'xpath' command to extract the required statistics from that file and print the output per boundary. If you need to do some mathematical operation, go for 'gdal_calc.py'. There are many other interesting utilities that will make your problem easier and let you go to bed early.
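Here is a minimal sketch of that workflow; the folder names, the target EPSG code, the clip shapefile and the NoData value are only placeholders, so adapt them to your own data and projection.

```bash
#!/bin/bash
# Minimal sketch of the conversion/statistics workflow described above.
# Folder names, EPSG code, clip shapefile and NoData value are placeholders.

SRC=./source            # raw rasters in mixed formats (*.img, *.tif, GRID, ...)
DST=./destination       # harmonised GeoTIFFs end up here
CLIP=./admin_units.shp  # example boundary used for clipping

mkdir -p "$DST"

for f in "$SRC"/*; do
    base=$(basename "$f")
    name="${base%.*}"

    # Reproject to a common CRS, clip to the boundary and write a GeoTIFF,
    # carrying the NoData value along so later computations respect it.
    gdalwarp -t_srs EPSG:4326 -cutline "$CLIP" -crop_to_cutline \
             -dstnodata -9999 -of GTiff "$f" "$DST/${name}.tif"

    # Compute per-band statistics and pull the mean out of the report.
    gdalinfo -stats "$DST/${name}.tif" | grep STATISTICS_MEAN >> "$DST/means.txt"
done

# A simple band calculation (here scaling by 0.1) with gdal_calc.py:
# gdal_calc.py -A "$DST/some_file.tif" --outfile="$DST/some_file_scaled.tif" \
#              --calc="A*0.1" --NoDataValue=-9999
```

Because the NoData value is carried into the output, the statistics reported by gdalinfo are not polluted by fill values.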

Then again, if you are dealing with temporal data, you might want to name the files with timestamps. If that's the case, you can easily do it with shell commands like 'rename' or 'sed'. Besides, the 'awk' command can be very useful for finding a matching name in a spreadsheet, extracting the corresponding value and using that value to rename a file or folder. I have also found the 'date' command very useful for converting dates between different formats.
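The snippet below sketches those renaming tricks; the file name pattern and the lookup spreadsheet (a CSV with the file name in column 1 and the date in column 2) are just examples.

```bash
# Sketch of the renaming ideas above; file pattern and CSV layout are examples.

# Convert a date such as 2004-07-15 into a compact stamp with GNU 'date'.
stamp=$(date -d "2004-07-15" +%Y%m%d)        # -> 20040715

# Rewrite part of a file name with 'sed'.
old="product_week29_2004.tif"
new=$(echo "$old" | sed "s/week29_2004/${stamp}/")
mv "$old" "$new"

# Look up the matching row in the spreadsheet (CSV) with 'awk' and use the
# extracted value to rename the file once more.
match=$(awk -F, -v f="$new" '$1 == f {print $2}' lookup.csv)
[ -n "$match" ] && mv "$new" "product_${match}.tif"
```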

Clipping the raster

All the tasks discussed above can be scripted separately, and once we are satisfied with each individual solution we can merge them into a single *.sh script. Next time you just put the retrieved data (from the different sources) in a $source folder and you get the expected data in a $destination folder. So, your machine can think like you and work faster than you, if you train and teach it.
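To give an idea of the merged script, here is a rough skeleton; the folder variables and the order of the steps are illustrative, and the real work happens in the snippets shown earlier.

```bash
#!/bin/bash
# Rough skeleton of the single *.sh driver: drop raw data into $source,
# collect the results from $destination. Step details live in the snippets above.

source=./source            # put the retrieved data (from different sources) here
destination=./destination  # harmonised, renamed and averaged output lands here
mkdir -p "$destination"

for f in "$source"/*; do
    echo "processing $(basename "$f")"
    # 1. harmonise format/projection      (gdalwarp / gdal_translate)
    # 2. rename with a proper time stamp  (date / sed / awk)
    # 3. collect per-boundary statistics  (gdalinfo -stats | grep)
done

echo "done - results are in $destination"
```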
