Karen sits in her Germantown office on a Tuesday morning, pulling up three browser tabs. One shows the Memphis Police Department’s crime dashboard. Another has the FBI’s latest Uniform Crime Report numbers. The third is a Bureau of Justice Statistics release from last month. All three claim to describe violent crime in Memphis. None of them agree.
This is not a hypothetical. This is the actual situation facing every operations director, property manager, and security company owner in Shelby County right now. And the CBS News report published on October 6 made it worse by putting the contradictions on national television without offering much guidance on what the numbers actually mean.
If you’re making security decisions based on crime data in Memphis, you need to understand exactly where these numbers come from, why they diverge, and which ones matter for your specific situation.
What CBS Got Right (and What They Missed)
The CBS story highlighted something that local security professionals have grumbled about for years: Memphis crime statistics from different agencies paint different pictures. MPD tracks incidents using FBI Uniform Crime Report definitions for Part 1 crimes, which include murder, rape, robbery, aggravated assault, burglary, larceny-theft, motor vehicle theft, and arson. That’s the standard classification system used by police departments across the country.
Here’s where it gets complicated. MPD also tracks subcategories that don’t fit neatly into the UCR framework. Non-fatal shootings are one example. If someone gets shot on Airways Boulevard and survives, that incident falls under aggravated assault in the UCR system. MPD counts it there, sure, and they also track it separately as a non-fatal shooting. Same incident, two different statistical categories. Neither one is wrong.
The city’s public-facing dashboard rolls some of these subcategories together in ways that can inflate or deflate totals depending on which tab you click. A property manager checking “violent crime” on the city site might get a number that includes categories the FBI wouldn’t count, or excludes ones the feds would add.
CBS noted the discrepancy. What they didn’t explain was the specific mechanism causing it.
The Federal Side of the Equation
The Bureau of Justice Statistics uses a different definition of violent crime than either MPD or the FBI’s UCR program. BJS includes simple assault in its violent crime totals. The UCR does not.
That single difference changes everything.
Simple assault means an attack without a weapon that doesn’t result in serious injury. A shoving match outside a bar on Beale Street. A slap during an argument at a gas station on Poplar. These incidents are real. They matter to the people involved. And in Memphis, where aggravated assaults already dominate the violent crime stats, adding simple assaults into the mix pushes the totals even higher.
When BJS releases a report saying violent crime in a city is X, and MPD says it’s Y, the gap isn’t evidence of dishonesty. It’s two agencies measuring slightly different things and calling them by the same name.
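The definitional gap is easy to see with a quick sketch. The counts below are entirely hypothetical, invented for illustration, not real Memphis data; the point is only that the same set of incidents yields two different "violent crime" totals depending on whether simple assault is included.

```python
# Hypothetical incident counts for one reporting period.
# These numbers are illustrative only, not real Memphis data.
incidents = {
    "murder": 200,
    "rape": 400,
    "robbery": 1500,
    "aggravated_assault": 8000,
    "simple_assault": 12000,
}

# FBI UCR-style violent crime total: excludes simple assault.
ucr_violent = sum(v for k, v in incidents.items() if k != "simple_assault")

# BJS-style violent crime total: includes simple assault.
bjs_violent = sum(incidents.values())

print(f"UCR-style total: {ucr_violent}")  # 10100
print(f"BJS-style total: {bjs_violent}")  # 22100
```

Same city, same incidents, and the BJS-style figure comes out more than double the UCR-style one, with no dishonesty anywhere in the pipeline.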
For Karen, sitting in her office trying to figure out whether her properties on Summer Avenue are safer this quarter than last, the distinction is not academic. If she's comparing MPD dashboard data from Q3 2025 against a BJS report from the same period, she's comparing apples to something that looks like an apple and tastes like an apple, but has an extra chunk of fruit glued to the side.
Jeff Asher’s October Question
Crime analyst Jeff Asher raised a pointed question on his Substack earlier this month. The city’s dashboard showed a notable crime drop in recent weeks, and Asher questioned whether the decline was as steep as the numbers suggested. His concern wasn’t that MPD was fabricating data. It was more specific than that: the way the dashboard presents rolling averages and week-over-week comparisons can smooth out spikes or dips depending on the comparison window.
Pick a bad week from last year as your baseline, and this year’s numbers look great. Pick an average week, and the improvement shrinks. Pick a particularly good week, and you might even show an increase.
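The baseline effect is pure arithmetic. Here is a minimal sketch using made-up weekly counts (not actual dashboard figures) showing how the same current week reads as a drop, a modest dip, or an increase depending on which week from last year you compare against.

```python
# Hypothetical weekly incident counts; illustrative only.
this_week = 90

# Three plausible baseline weeks from last year.
baselines = {
    "bad week last year": 130,      # unusually high baseline
    "average week last year": 100,  # typical baseline
    "good week last year": 80,      # unusually low baseline
}

for label, baseline in baselines.items():
    pct = (this_week - baseline) / baseline * 100
    print(f"vs {label}: {pct:+.1f}%")
```

Against the bad week, crime is down about 31 percent; against the average week, down 10 percent; against the good week, up 12.5 percent. Nothing in the current data changed between those three headlines.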
This is a standard statistical limitation, not a conspiracy. Every city dashboard in America has the same problem. The question is whether Memphis's dashboard makes it easy or hard for non-statisticians to understand what they're seeing.
Right now, it’s hard. The dashboard presents data clearly enough for someone who already understands UCR categories, rolling averages, and year-over-year methodology. For a property manager or small business owner on Winchester Road trying to figure out if their parking lot is safer this fall? The dashboard doesn’t meet them where they are.
What Security Professionals Actually Need
Here’s the practical problem. None of these data sources were designed for the people who use them most.
MPD’s dashboard was built for public transparency and media reporting. The FBI’s UCR was designed for national comparisons between jurisdictions. BJS surveys were built for academic research and federal policy analysis. All three serve their intended purpose reasonably well.
Nobody built a data source for the operations director at a Memphis property management company who needs to decide whether to extend a security contract at a strip mall on Getwell Road, add cameras at a warehouse in Whitehaven, or shift patrol hours at an office park in East Memphis.
Karen needs to know three things: Is crime going up or down in the specific areas where I have properties? What types of crime are most common near my sites? Are my current security measures working?
The city dashboard can partially answer the first question. MPD’s precinct-level data helps with the second. The third question can’t be answered by any public dataset.
Which Numbers Should You Trust?
Trust is the wrong word. The better question is: which numbers are useful for your decisions?
If you’re writing a report for corporate headquarters that compares Memphis to Nashville or Charlotte, use FBI UCR data. It’s the closest thing to an apples-to-apples comparison between cities. Just know that the most recent complete UCR dataset typically runs 12-18 months behind the current date, so you’re always looking at old numbers for cross-city comparisons.
If you’re tracking month-to-month trends within Memphis, the MPD dashboard is your best bet. It updates faster than anything federal. Accept that it includes some subcategories other agencies wouldn’t count, and use it for trend direction rather than precise totals. Is property crime going up or down in Raleigh? The dashboard tells you that, even if the exact number might differ from what BJS would report.
If you’re making staffing decisions for a specific site, neither source gives you what you need on its own. Combine the dashboard’s trend data with your own incident reports, talk to your security provider about what they’re seeing on the ground, and check Shelby County Sheriff’s Office reports for areas outside MPD jurisdiction.
One more thing. When someone sends you a news headline saying Memphis crime is up or down by some dramatic percentage, check the baseline period. Check whether they’re using MPD, FBI, or BJS definitions. Check whether they’re counting simple assault. A 30% drop using one methodology might be a 15% drop using another, and both numbers can be technically correct.
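To see how both numbers can be technically correct, consider a sketch with hypothetical year-over-year counts (again, not real Memphis figures): aggravated assaults fall sharply while simple assaults hold roughly steady, so the UCR-style measure drops far more than the BJS-style one.

```python
# Hypothetical year-over-year counts; illustrative only.
last_year = {"aggravated_assault": 10000, "simple_assault": 10000, "robbery": 2000}
this_year = {"aggravated_assault": 6500, "simple_assault": 10300, "robbery": 1900}

def violent_total(counts, include_simple):
    """Sum violent crime under a chosen definition."""
    keys = ["aggravated_assault", "robbery"]
    if include_simple:
        keys.append("simple_assault")
    return sum(counts[k] for k in keys)

def pct_change(include_simple):
    old = violent_total(last_year, include_simple)
    new = violent_total(this_year, include_simple)
    return (new - old) / old * 100

print(f"UCR-style (no simple assault): {pct_change(False):+.0f}%")   # -30%
print(f"BJS-style (with simple assault): {pct_change(True):+.0f}%")  # -15%
```

Same city, same year, one methodology reports a 30 percent drop and the other a 15 percent drop, and neither is wrong.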
The Real Gap in Memphis Crime Data
The deepest problem isn’t the difference between local and federal numbers. It’s the absence of granular, site-level data that security professionals can act on.
Memphis has better crime data transparency than most mid-size American cities. The dashboard exists. It gets updated. MPD publishes weekly reports. That puts Memphis ahead of dozens of comparable cities that release data quarterly or not at all.
Still, there’s a gap between “transparent” and “useful.” The data tells you what happened in a precinct. It doesn’t tell you what happened in the parking lot of your client’s pharmacy at the corner of Mendenhall and Park. For that level of detail, you still need to file public records requests, build relationships with precinct commanders, or pay for a commercial crime data service.
Security companies that invest in their own data collection (incident tracking, patrol logs, camera analytics) end up with better operational intelligence than anything the city publishes. That’s not a criticism of MPD. It’s a recognition that public crime data was never meant to replace the kind of specific, site-level information that drives good security decisions.
Karen’s best move isn’t choosing between the city dashboard and the FBI report. It’s using both, understanding the limitations of each, and filling in the gaps with information she controls.
The numbers will never perfectly align. That’s not a scandal. It’s statistics.