A critical vulnerability in Apache Hadoop, the widely used framework for distributed processing of large data sets, requires urgent patching.
The vulnerability was identified by Uber software engineer Ekanth Sethuramalingam.
The Hadoop vulnerability, CVE-2018-11768, affects Hadoop versions 3.1.0 to 3.1.1, 3.0.0-alpha1 to 3.0.3, 2.9.0 to 2.9.1, and 2.0.0-alpha to 2.8.
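For readers who want to check a deployment against the ranges above, here is a minimal Python sketch. It is illustrative only, not part of the advisory; in particular, it treats the article's open-ended "2.8" as 2.8.0, which is an assumption.

```python
# Illustrative helper (not from the ASF advisory): check whether a Hadoop
# version string falls inside one of the ranges affected by CVE-2018-11768.

AFFECTED_RANGES = [
    ("3.1.0", "3.1.1"),
    ("3.0.0-alpha1", "3.0.3"),
    ("2.9.0", "2.9.1"),
    ("2.0.0-alpha", "2.8.0"),  # assumption: the article's "2.8" read as 2.8.0
]

def _key(version: str):
    """Turn a version like '3.0.0-alpha1' into a sortable tuple;
    pre-releases sort before the corresponding final release."""
    base, _, pre = version.partition("-")
    nums = [int(p) for p in base.split(".")]
    nums += [0] * (3 - len(nums))            # pad '2.8' out to (2, 8, 0)
    # (0, pre) sorts before (1, ""), so '3.0.0-alpha1' < '3.0.0'
    return tuple(nums) + ((0, pre) if pre else (1, ""))

def is_affected(version: str) -> bool:
    """True if the given version lies in any affected range (inclusive)."""
    k = _key(version)
    return any(_key(lo) <= k <= _key(hi) for lo, hi in AFFECTED_RANGES)

print(is_affected("2.7.3"))  # True  - 2.7.x falls in the 2.0.0-alpha..2.8 range
print(is_affected("2.8.5"))  # False - a fixed release
```

The tuple-based comparison avoids pulling in a dependency; production code would more likely use a proper version-parsing library.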
Describing it as “a mismatch in the size of the fields used to store user/group information between memory and disk representation”, the Apache Software Foundation said the vulnerability “causes the user/group information to be corrupted across storing in fsimage and reading back from fsimage.”
The fsimage file, stored on the OS filesystem, contains the complete directory structure (namespace) of the Hadoop Distributed File System, along with details about the location of the data.
There is currently little public information available about precisely how the Hadoop vulnerability can be exploited (Computer Business Review has contacted Sethuramalingam for further details and will update this piece when we receive them), but given its severity, users should patch urgently.
A security list email from the ASF advises users to upgrade to Apache Hadoop 2.8.5, 2.9.2, 3.1.2 or later.
“This vulnerability fix contains a fsimage layout change, so once the image is saved in the new layout format you cannot go back to a version that doesn’t support the newer layout. This means that once 2.7.x users upgraded to the fixed version, they cannot downgrade to 2.7.x because there is no fixed version in 2.7.x. We suggest downgrade to 2.8.5 or upper version that contains the vulnerability fix.”
With unpatched software among the single greatest ongoing causes of data breaches and other security issues, prompt fixes of flaws like this are crucial to good security hygiene.