Sep 15, 2020

Write Code That Protects Sensitive User Data

By Jason Lane

Sensitive data exposure is currently at number 3 in the OWASP Top 10 list of the most critical application security risks.

In this blog post, we will describe common scenarios of incorrect sensitive data handling and suggest ways to protect sensitive data. We will illustrate our suggestions with code samples in C# that can be used in ASP.NET Core applications.

What is sensitive data?

OWASP lists passwords, credit card numbers, health records, personal information and business secrets as sensitive data.

Social security numbers, passwords, biometric data, trade union memberships and criminal records can also be thought of as sensitive data.

What exactly sensitive data means for you will depend on:

  • Laws and industry regulations, such as the EU's General Data Protection Regulation (GDPR) or the UK's Data Protection Act (DPA), that govern the use of "personal data".
  • Business requirements. The law may not enforce strict measures around sensitive data that your application creates or stores for its users, but breaching that data would still hurt your users and, by extension, your business.

In software applications, we can think of sensitive data as:

  1. Most user data (for example, names listed in public user profiles may not be sensitive).
  2. Application data (such as session IDs and encryption keys) that helps protect user data from being exposed.

Various sources and authorities may have different definitions of sensitive data. However, if you're a business that develops an application that works with user data, it's in your best interest to use a broad interpretation of "sensitive data" and do your best to protect it.

What vulnerabilities can lead to sensitive data exposure?

Let's discuss some of the most common vulnerabilities that can expose sensitive user data.

Inadequate access control that enables forced browsing to restricted content

Due to inadequate access control, users who are not expected to see sensitive data may in fact be able to access it, even though the data is not referenced by the application in any way. An attack called forced browsing takes advantage of this situation.

Imagine you're a regular user of a web application, and when you look around the UI, you don't see any administrative functionality available. Still, if you manually enter a URL that you think may be available to admin users (such as https://www.myapp.com/admin), you do see the admin UI. This is forced browsing: the application didn't guide you to a restricted resource, but neither did it prevent you from accessing it.

Improperly managed sessions

When sessions are managed improperly, session IDs of authenticated users are at risk of being exposed, and attackers can take advantage of this to impersonate legitimate users. Two common attacks that are made possible by improper session management are session hijacking and session fixation. Attacks like these can have a severe impact if targeted at privileged accounts and can cause massive leakage of sensitive data.

One major reason why sessions get mismanaged is that developers sometimes write their own custom authentication and session management schemes instead of using battle-tested solutions, and doing this correctly is hard.

Insecure cryptographic storage

Insecure cryptographic storage refers to unsafe practices of storing sensitive data, most prominently user passwords. This is not about failing to protect data at all (which results in storing passwords as plain text); rather, it is about applying the wrong cryptographic process, or a substitute for one, such as:

  • Using an outdated and weak hashing algorithm (think SHA1 or MD5), which makes cracking hashed data quick and easy once the data has been exposed.
  • Using a custom hashing algorithm.
  • Using encryption instead of hashing for password protection.
  • Using protection that is not a cryptographic process at all, such as string transformations or Base64 encoding.

This vulnerability is especially important because secure cryptographic storage is the last line of defense: strong cryptography keeps data protected even after it has been exposed through other weaknesses in an application.

How do you protect sensitive data?

Let's see what kind of secure coding practices can help you avoid vulnerabilities such as the ones listed above, and minimize the risk of disclosing sensitive data.

To prevent forced browsing to restricted content

  • Implement a robust authorization mechanism with early and uniform authorization checks that are executed right after authentication.
  • Use proven frameworks for authentication and authorization. Modern frameworks often implement secure authentication and authorization behind the scenes, provide sensible defaults, and allow you to write extensions based on your application's requirements. For example, on the Microsoft stack, ASP.NET Core Identity is a proven framework that abstracts away authentication and user management; a minimal wiring sketch is shown below.
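Here's a minimal sketch of that wiring in a Startup class. ApplicationDbContext is a hypothetical EF Core context, and the registration assumes the default Identity packages; the key point is the middleware order, which runs authentication and authorization checks uniformly before any endpoint executes:

    public void ConfigureServices(IServiceCollection services)
    {
        services.AddDbContext<ApplicationDbContext>(/* ... */); // hypothetical EF Core context
        services.AddDefaultIdentity<IdentityUser>()
            .AddEntityFrameworkStores<ApplicationDbContext>();
        services.AddControllersWithViews();
    }

    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        // ...
        app.UseRouting();

        app.UseAuthentication(); // who is the caller?
        app.UseAuthorization();  // is the caller allowed to reach this endpoint?

        app.UseEndpoints(endpoints => endpoints.MapDefaultControllerRoute());
    }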
  • Do not rely on hiding privileged UI as the only authorization check. Hiding a UI element will not prevent access to the resource that it refers to. For example, in an ASP.NET Core MVC application, let's say there's a link to a view that only authenticated users should see:
    @if (User.Identity.IsAuthenticated)
    {
        <p> <a asp-area="" asp-controller="Home" asp-action="Hidden">This is a hidden page!</a></p>
    }

However, if the Home controller's Hidden action is not configured as available to logged-in users only, an anonymous user would still be able to enter the direct URL and access the hidden page. To prevent this, the controller action should be protected as well:

    [Authorize]
    public IActionResult Hidden() => View();
  • Cover authorization logic with tests. As your codebase evolves, inadvertent changes can create vulnerabilities, and it's vital to make sure they are detected as soon as possible. This is why it's important to write and maintain automated tests for authorization code that test all roles, as well as anonymous access.
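For instance, here's a minimal integration test sketch for the Hidden action shown above. It assumes xUnit and the Microsoft.AspNetCore.Mvc.Testing package; Startup is the application's startup class, and the login path asserted at the end is the ASP.NET Core Identity default:

    using System.Net;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Mvc.Testing;
    using Xunit;

    public class HiddenPageAuthorizationTests : IClassFixture<WebApplicationFactory<Startup>>
    {
        private readonly WebApplicationFactory<Startup> _factory;

        public HiddenPageAuthorizationTests(WebApplicationFactory<Startup> factory) => _factory = factory;

        [Fact]
        public async Task Hidden_RedirectsAnonymousUsers()
        {
            // Disable auto-redirect so we can inspect the challenge response itself.
            var client = _factory.CreateClient(new WebApplicationFactoryClientOptions
            {
                AllowAutoRedirect = false
            });

            var response = await client.GetAsync("/Home/Hidden");

            // An anonymous request must not receive the page; it should be challenged instead.
            Assert.Equal(HttpStatusCode.Redirect, response.StatusCode);
            Assert.Contains("/Account/Login", response.Headers.Location.ToString());
        }
    }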

To avoid improperly managed sessions and session ID leaks

  • Do not expose session IDs in URLs. Keeping a session ID as part of a URL is an easy way to enable session hijacking via URL sharing or logging.
  • Keep session IDs in cookies only. This way, unless an attacker can access request headers, sessions will not be hijacked maliciously, and they certainly won't be hijacked unintentionally as a side effect of URL sharing. A minimal cookie-based configuration is sketched below.
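For example, with ASP.NET Core's session middleware the session ID is carried in a cookie by default and never appears in the URL. A minimal sketch, assuming the in-memory distributed cache and the Startup pattern used elsewhere in this post:

    public void ConfigureServices(IServiceCollection services)
    {
        services.AddDistributedMemoryCache(); // backing store for session data
        services.AddSession(options =>
        {
            options.Cookie.HttpOnly = true;   // not readable by client-side scripts
            options.Cookie.SecurePolicy = CookieSecurePolicy.Always;
            options.IdleTimeout = TimeSpan.FromMinutes(20);
        });
    }

    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        // ...
        app.UseSession();
        // ...
    }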
  • Use HTTPS throughout your application. Don't refer to HTTP resources from pages that use HTTPS. Make sure to configure HTTP to HTTPS redirects. If for some reason you're forced to use a mix of HTTPS and HTTP, create a new session ID every time connection security changes from HTTPS to HTTP, or vice versa.
  • Use HSTS (but do it carefully). When you've gained confidence in your full-HTTPS infrastructure, start setting the HSTS (Strict-Transport-Security) header, which prohibits the web browser from attempting to communicate with the web application via plain HTTP for as long as the policy is cached. Since browsers actively cache HSTS settings, start with a small max-age value and gradually increase it if all goes well. This is how you can configure initial HSTS options in an ASP.NET Core application (the middleware registration that pairs with it is sketched after the snippet):
    public void ConfigureServices(IServiceCollection services)
    {
        // ...
        services.AddHsts(options => options.MaxAge = TimeSpan.FromHours(6));
        // ...
    }
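Note that AddHsts only configures the options; the header is actually emitted by the HSTS middleware, which you register alongside the HTTP-to-HTTPS redirect mentioned above. A minimal sketch of the corresponding Configure method:

    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        if (!env.IsDevelopment())
        {
            // Sends the Strict-Transport-Security header configured via AddHsts above.
            app.UseHsts();
        }

        // Redirects plain-HTTP requests to their HTTPS equivalents.
        app.UseHttpsRedirection();

        // ...
    }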
  • Maintain healthy cookie settings. Unless client-side scripts in your application need to read or set cookie values, set the HttpOnly attribute. When transmitting cookies over HTTPS, make sure to set the Secure attribute. Enforce a Strict same-site policy if your application doesn't use OAuth2, or Lax if it does. Finally, set cookie expiration to a reasonably short time span. Here's how you would configure cookies in an ASP.NET Core application that uses ASP.NET Core Identity:
    public void ConfigureServices(IServiceCollection services)
    {
        // ...
        services.ConfigureApplicationCookie(options =>
        {
            options.Cookie.HttpOnly = true;
            options.Cookie.SecurePolicy = CookieSecurePolicy.Always;
            options.Cookie.SameSite = SameSiteMode.Strict;
            options.ExpireTimeSpan = TimeSpan.FromHours(1);
        });
        // ...
    }

To keep cryptographic storage secure

  • Do not use encryption for password storage. Use hashing instead. Encryption is a two-way process, and hashing is a one-way process. When a database of symmetrically encrypted passwords is exposed along with the encryption key, the attacker can instantly restore the passwords to their original form, making the protection useless. To make it difficult or impractical for an attacker to obtain the original passwords, they should be hashed.
  • Apply a unique salt to each password. A salt is a randomly generated string added to a password before hashing. Salting protects against attacks based on pre-computed hashes and helps hide identical passwords in a database.
  • Use a modern hashing algorithm that is slow (a good thing!) and designed for secure password storage. The extent to which an algorithm is slow should be configurable using a work factor. OWASP currently recommends choosing between Argon2id, PBKDF2, and Bcrypt.
  • Never create your own hashing algorithms for production applications. Writing hashing algorithms is insanely hard. A half-baked custom algorithm will inevitably introduce multiple weaknesses, thus defeating the purpose of the endeavor.
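To make this concrete, here's a minimal sketch of salted password hashing with PBKDF2, using the Microsoft.AspNetCore.Cryptography.KeyDerivation package. The class name and the work-factor parameters are illustrative, not a recommendation; tune the iteration count to current guidance and your hardware:

    using System.Security.Cryptography;
    using Microsoft.AspNetCore.Cryptography.KeyDerivation;

    public static class PasswordHashing
    {
        private const int SaltSize = 16;        // 128-bit salt
        private const int HashSize = 32;        // 256-bit derived key
        private const int Iterations = 100_000; // work factor; illustrative value

        public static (byte[] Salt, byte[] Hash) HashPassword(string password)
        {
            // Generate a unique, cryptographically random salt for this password.
            var salt = new byte[SaltSize];
            using (var rng = RandomNumberGenerator.Create())
            {
                rng.GetBytes(salt);
            }

            // Derive a one-way hash with PBKDF2 (HMAC-SHA256).
            var hash = KeyDerivation.Pbkdf2(
                password: password,
                salt: salt,
                prf: KeyDerivationPrf.HMACSHA256,
                iterationCount: Iterations,
                numBytesRequested: HashSize);

            return (salt, hash);
        }
    }

If your application uses ASP.NET Core Identity, you largely get this for free: its default PasswordHasher already applies salted PBKDF2 behind the scenes.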

Summary

We've seen how applying secure coding practices in access control, session management, and cryptographic storage can help you avoid common vulnerabilities and minimize the risk of disclosing sensitive data.

There's one more fundamental piece of advice that OWASP gives: don't store sensitive data unless you absolutely need to. Data that is not stored cannot be compromised.

Whatever decisions you make on data storage policy, remember to detect vulnerable code early with continuous testing, code review, and static and dynamic analysis.

By Jason Lane

Jason is Veracode's Principal Product Marketing Manager for SAST, Security Labs, and eLearning. He's spent 20+ years working in product management, marketing, and IT roles along with a long stint working with developers and startup companies. Now, he’s helping Veracode to drive awareness and adoption of secure coding and DevSecOps best practices within the developer community.