Solved

Login Auditing


I’m looking to build an analytics report for a Login Audit based on “Failed Authentication Count >= 3”. However, when I build it out and add the filter “Updated Date UTC Date is in the past 24 hours”, it returns the total of all failed login attempts for that user, not just the failed login attempts within the last 24 hours. What am I doing wrong here?

 

Does anyone have any better suggestions for auditing failed login attempts with email notification?

 

 


Best answer by Jaycee.Elliott 30 May 2023, 16:39


This topic has been closed for comments

14 replies

Userlevel 3

Good morning @daniel.bouck 

 

First let me say that I have not been successful using the User Sessions Data Explore (Analytics View). I am not sure the data in this view is actually valid; if you review the available fields, the data mappings are incorrect (e.g., User Full Name is stored in the User Badge Number field). Let me also direct you to Mark43 employees who may provide more precise information (@DavidSchwindt). Also, as far as I can tell, you are the only person to post on this forum in almost a year. I don’t think Mark43 is actually pushing this platform any longer…

That said, it is difficult to diagnose what is causing your query to return more records than you expect. A couple of things to confirm and review are listed below:

 

  • You mentioned “Updated Date UTC Date is in the past 24 hours.”
    • I am not sure how Looker handles dynamic timeframes (i.e., past 24 hours) on fields that are date type. The Updated Date UTC Date field is formatted as a parsed date and does not contain time values. I assume the software handles this elegantly, but to be sure, I’d recommend using the Created Date Utc Time field in your filter instead. That field is datetime format and should be more precise for your needs (see the sketch below this list).
       
  • The second component of your query is to return records (assuming grouped by user) that have three or more failed authentications in the filtered period. I was able to use the Failed Authentication Count field as created. It appears this field is grouped by user in the query used to create the analytics view, and it is unclear what else contributes to the grouped value. It does seem the data type is numeric (or at least Looker recognizes it as numeric). Mark43 would have to provide a definition of this field and how it is calculated before I’d feel comfortable using it.
     
  • Below I snipped an image of a quick table visual with the fields I referenced that should be used. Again, without knowing the script/query feeding the User Sessions explore, I would steer away from using it. I haven’t seen documentation for it, and it is not an exposed view in the data lake that I could leverage to reverse engineer some understanding.
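
To illustrate the datetime-based filter I mean, here is a minimal sketch (the table and column names are my assumptions based on the Looker field labels, not confirmed against the actual view):

-- Sketch: filter on a true datetime column for a rolling 24-hour window,
-- rather than a date-only column. Names are assumed, not confirmed.
SELECT sp.user_id,
       sp.failed_authentication_count,
       sp.created_date_utc
FROM rms.session_profiles sp
WHERE sp.created_date_utc >= DATEADD(hour, -24, GETUTCDATE())
  AND sp.failed_authentication_count >= 3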

     

 

 

I hope this helps (or sways you from using this source for sessions information). I apologize for not providing an alternative. This is a topic we have wanted more insight into for our own audits, but Mark43 has not supplied us with a solution as of yet.

Best,

Jaycee

Hi @Jaycee.Elliott 

 

Thanks for the reply and info. I saw this is a pretty inactive system but figured I’d give it a shot. I appreciate your feedback as well, and I agree the data doesn’t seem to be correctly mapped.

 

Also, as far as I can tell, you are the only person to post on this forum in almost a year. I don’t think Mark43 is actually pushing this platform any longer…

Oh, they still push it. David Schwindt was at our agency a few weeks ago pushing it, and it was pushed on their latest webinar, but you are correct, it doesn’t seem to be used by anyone really.

 

Can I ask, what do y’all use to monitor login audit logs as mandated by CJIS (that is, if you’re a US agency)? Our previous CAD/RMS vendor had a very straightforward auditing tool, but Mark43 does not seem to have this, and the closest thing I can find is this analytics explore. However, as you noted, it isn’t really accurate.

 

This is a topic we have wanted more insight into for our own audits, but Mark43 has not supplied us with a solution as of yet.

 

This seems to be the recurring case with Mark43 in many respects. Hopefully there is a solution soon.

Userlevel 3

@daniel.bouck 

That is a good question and outside of my scope of work. This is likely performed by our Records folks. 

I do know that repeated failed authentication attempts result in account locking in compliance with CJIS 5.5.3, but I don’t know what is provided, logged, or performed with regard to unsuccessful login attempts. 

Sorry to be less than helpful in that regard. I would appreciate you sharing any insights you identify on this process though!

Cheers, 

 

Jaycee

@daniel.bouck 

We gave up on analytics, and have built our own solution. I tried to see if I could write some SQL against the data lake to pull this for you.

 

The database field search query I have didn’t find a field containing %auth% or %fail% anywhere in the data lake. The data lake documentation I have didn’t provide any clues either, and I didn’t see any tables that looked like they would contain this information.
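
For anyone who wants to run a similar search, here is a minimal sketch of a column-name search against the catalog (assuming the data lake is SQL Server with INFORMATION_SCHEMA readable; not necessarily the exact query referenced above):

-- Sketch: look for any column name suggesting authentication/failure data.
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME LIKE '%auth%'
   OR COLUMN_NAME LIKE '%fail%'
ORDER BY TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME;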

 

If you could provide me the SQL output from your existing report (it's been a long time since I used analytics, and I've forgotten how to do that), maybe I can find out what's going on. But I suspect this information isn't replicated to the data lake, so I'm surprised that a "Failed Authentication Count" is available at all.

 

Kevin M.

Elk Grove Police Department

WITH rms_session_profiles AS (
    SELECT sp.*,
           up.agency_name,
           up.full_name, up.first_name, up.middle_name, up.last_name,
           up.badge_number,
           up.department_id
    FROM rms.session_profiles sp
    INNER JOIN analytics_rms.vw_user_profiles up
        ON sp.user_id = up.user_id
)
SELECT TOP (500)
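-- Note: full_name is selected below but aliased as badge_number (the mis-mapped field noted earlier in the thread)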
rms_session_profiles.full_name AS [rms_session_profiles.badge_number],
rms_session_profiles.operating_system AS [rms_session_profiles.operating_system],
rms_session_profiles.browser AS [rms_session_profiles.browser],
rms_session_profiles.browser_version AS [rms_session_profiles.browser_version],
rms_session_profiles.ip AS [rms_session_profiles.ip],
rms_session_profiles.successful_authentication_count AS [rms_session_profiles.successful_authentication_count],
rms_session_profiles.failed_authentication_count AS [rms_session_profiles.failed_authentication_count],
(CONVERT(VARCHAR(10),rms_session_profiles.updated_date_utc ,120)) AS [rms_session_profiles.updated_date_utc_date],
(CONVERT(VARCHAR(5), CAST(rms_session_profiles.updated_date_utc AS DATETIME), 108)) AS [rms_session_profiles.updated_date_utc_time_of_day]
FROM rms_session_profiles
-- The date filter below builds a rolling 24-hour window aligned to hour boundaries:
-- from 23 hours before the top of the current UTC hour up to (but not including) the top of the next hour.
WHERE ((((( rms_session_profiles.updated_date_utc )) >= ((DATEADD(hour,-23, CONVERT(datetime, CONVERT(VARCHAR(13), CAST(CONVERT(DATETIME2,SYSDATETIMEOFFSET() AT TIME ZONE 'Greenwich Standard Time') AS DATETIME), 120) + ':00:00', 120) )))
    AND (( rms_session_profiles.updated_date_utc )) < ((DATEADD(hour,24, DATEADD(hour,-23, CONVERT(datetime, CONVERT(VARCHAR(13), CAST(CONVERT(DATETIME2,SYSDATETIMEOFFSET() AT TIME ZONE 'Greenwich Standard Time') AS DATETIME), 120) + ':00:00', 120) ) ))))))
    AND (rms_session_profiles.failed_authentication_count ) >= 3
    AND (rms_session_profiles.department_id ) = [DEPT-DB-ID]
GROUP BY
rms_session_profiles.full_name ,
rms_session_profiles.ip ,
rms_session_profiles.operating_system ,
rms_session_profiles.browser ,
rms_session_profiles.browser_version ,
(CONVERT(VARCHAR(10),rms_session_profiles.updated_date_utc ,120)),
(CONVERT(VARCHAR(5), CAST(rms_session_profiles.updated_date_utc AS DATETIME), 108)),
rms_session_profiles.successful_authentication_count ,
rms_session_profiles.failed_authentication_count
ORDER BY
8 DESC

 @Kevin M. 

 

I’ve attached the SQL query from analytics. Note I replaced our Datalake ID with [DEPT-DB-ID], so you’ll need to change that to your agency’s Datalake ID. I appreciate your assistance, as I can do basic SQL queries but am no SQL master.

 

I am hoping Mark43 has an analytics solution for this so I don’t need to access the datalake every time for this audit log.

It appears rms.session_profiles is not a table in our data lake, but we are using SSO. So either M43 is not putting this table in the data lake, or it just doesn't show up if you use SSO. Either way, I couldn't write you a direct query against the data lake.
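
A quick way to confirm whether the table is replicated in a given data lake (a minimal sketch under the same SQL Server assumption as above):

-- Sketch: check whether rms.session_profiles exists in this data lake replica.
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = 'rms'
  AND TABLE_NAME = 'session_profiles';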

 

If I interpret this SQL right, a session is probably a unique connection. So if a user tried to log in and failed, refreshed the browser, tried again and failed, then refreshed and failed a third time, that would probably count as three separate sessions. However, the analytics module is probably summing all failed logins returned by the query for the date range specified. So I'm suspecting this analytics report is doing what you want it to do.
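
To make that concrete, here is a rough sketch of the per-user rollup I'm describing, reusing the CTE from the query posted above (this is a guess at the analytics module's intent, not its actual SQL):

-- Sketch: sum per-session failed counts by user within a rolling 24-hour window,
-- then keep only users whose total meets the threshold. Column names follow the posted query.
SELECT rsp.full_name,
       SUM(rsp.failed_authentication_count) AS failed_attempts_last_24h
FROM rms_session_profiles rsp   -- the CTE defined in the posted query
WHERE rsp.updated_date_utc >= DATEADD(hour, -24, GETUTCDATE())
GROUP BY rsp.full_name
HAVING SUM(rsp.failed_authentication_count) >= 3;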

 

You might be able to verify by intentionally creating some failed logins.

 

However, this is conjecture.

@Kevin M. I appreciate you looking into it.

 

Unfortunately, I have to disagree that analytics is doing what I want it to do, as it’s reporting 300+ failed attempts for multiple users “in the past 24 hours.” This looks more like it’s reporting the total number of failed attempts ever, not those within the past 24 hours.

 

I may need to reach out to support or David directly to see if we can come up with a solution here.

Did you try using created date UTC instead of updated date UTC?

Never mind, it wasn’t available.

@Kevin M. @Jaycee.Elliott it appears using “Created Date Utc Date” rather than “Updated Date Utc Date” has worked as expected so far. I appreciate both your inputs!

Userlevel 3

Sounds good, @daniel.bouck. I think my comment mentioned the updated date, though my snip showed I was using the created date. Apologies for any confusion.

@Kevin M. we too do not have the rms.session_profiles table exposed or replicated in the data lake. We previously attempted to leverage that table to audit inactive users, but to no avail. I also totally forgot that we could view the SQL script in analytics. Thanks for that reminder; it will help for troubleshooting things in the future.

@Jaycee.Elliott I never understood why all of an agency’s data isn’t exposed to that agency. I’ve noticed several portions of data that aren’t in analytics or the datalake.

Userlevel 3
Badge

I am impressed by the community support here by @Jaycee.Elliott and @Kevin M. Thank you for sharing your knowledge! I received an email notice earlier today when @daniel.bouck posted this, but I have limited personal knowledge of the User Sessions explore. During the day I logged in to work with the explore, look at the underlying views, and look for any documentation to learn more. To supplement, I also reached out to engineering for additional information, and I am just now able to catch up and reply on this thread. Here is what I have learned:

I pretty quickly noticed the same data mapping issue @Jaycee.Elliott mentioned (User Badge Number returning User Full Name).

The included dimensions and measures are not intuitive. Everyone would benefit from better documentation in the explore explaining the purpose and/or source of each of them.

The major factor here is that the underlying data/view is not returning the results you need for your purpose. I am told the [Failed Authentication Count] is an aggregate that is not at the grain you would need. Your use case is understandable and reasonable; however, this explore falls short of fulfilling that type of query.
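
To illustrate the grain issue in generic terms: a count that is pre-aggregated over a session profile's lifetime cannot be re-filtered to a 24-hour window, whereas an event-level source could be. The sketch below uses a purely hypothetical table and columns, not Mark43's actual model:

-- Hypothetical event-level shape that WOULD support this report:
-- one row per failed attempt, each with its own timestamp.
SELECT user_id,
       COUNT(*) AS failed_attempts_last_24h
FROM auth_events                 -- hypothetical table; does not exist in the current explore
WHERE event_type = 'FAILED_LOGIN'
  AND event_time_utc >= DATEADD(hour, -24, GETUTCDATE())
GROUP BY user_id
HAVING COUNT(*) >= 3;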

I am going to go to the engineering team to see what we can do to improve this explore. As I understand from the discussion, the following would be valuable:

1. Fix issues where dimensions are returning incorrect data
2. Provide a failed authentication count at a grain where the user is able to solve the issue described by @daniel.bouck 
3. Add documentation to the dimensions/measures to assist the user in understanding the purpose/source/limitations
4. Follow up in this thread when I better understand what we can provide to improve this

I welcome and encourage feedback on additional items that would be valuable here. 

Dave

Userlevel 3

@DavidSchwindt I am not sure what grain is being proposed, but I would certainly benefit from a datalake table/view, analytics explore, or otherwise that provides all agency users and their most recent login attempt (successful or not). This would enable us to disable access for any potentially inactive users who have not demonstrated a need to be in the RMS.

The crux is that all users (regardless of whether they have ever logged in) need to be included in the query. This would be feasible if we could build a merge query with an explore that provided only user profile information, but that is not currently available in the analytics module. Even if this is not feasible, I can always export the list from analytics and reconcile it against the datalake tables/views, but we don’t have sessions data anywhere else. 

This CTE (rms_session_profiles) is provided in the user sessions SQL definition:
 

WITH rms_session_profiles AS (SELECT sp.*,
        up.agency_name,
        up.full_name, up.first_name, up.middle_name, up.last_name,
        up.badge_number,
        up.department_id
        FROM rms.session_profiles sp
        inner join analytics_rms.vw_user_profiles up
          ON sp.user_id = up.user_id
      )

The inner join between rms.session_profiles and analytics_rms.vw_user_profiles excludes any user_profile_ids that have no session records. To serve my purpose, I need a right join here. This isn’t configurable for agencies with analytics explorer roles because it is set by the developers of the data model. We can (or could) accomplish the equivalent with a merge query IF the analytics_rms.vw_user_profiles view were mapped into the analytics module as an explore for us to leverage (there would be many benefits to this). Though we would also need to understand what/why the rms.session_profiles object is incomplete before utilizing it. The result set seems to be VERY incomplete… My own profile doesn’t show a session in the past 60 days, so we know the data is unreliable. Perhaps I am misunderstanding what the explore is. Documentation defining the explore is warranted and would clarify what it is and what it is not. 
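
For illustration, a minimal sketch of the inclusive join described above, starting from the user profiles view so that users with no sessions still appear (names are taken from the CTE; whether both objects are accessible in your data lake is an assumption):

-- Sketch: include every user profile, with their most recent session activity if any.
-- A LEFT JOIN from vw_user_profiles is equivalent to the right join described above.
SELECT up.full_name,
       up.badge_number,
       MAX(sp.updated_date_utc) AS last_session_activity_utc   -- NULL if the user has never had a session
FROM analytics_rms.vw_user_profiles up
LEFT JOIN rms.session_profiles sp
    ON sp.user_id = up.user_id
GROUP BY up.full_name, up.badge_number
ORDER BY last_session_activity_utc;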

 

Best,

 

Jaycee