Jul 18, 2017

Microsoft TechNet UK Blog: Parquet.Net


Edited: Jul 20, 2017

We're very pleased that Microsoft's TechNet UK blog recently published an article on Parquet .Net, outlining what it is, how it works and which situations it is useful for. Taking that blog post as a starting point, we will keep this article updated with handy links and resources for getting started with Parquet .Net.



Go here for Parquet .Net's GitHub page.

Go here to get started with Parquet .Net on GitHub.

Go here to see the project's issues, bugs and questions on GitHub.


Go here for installation instructions on NuGet.


Go to the Elastacloud Channels: Parquet .Net Channel to stay up to date on the latest developments.


Elastacloud Channels is a social forum in which the Elastacloud team share updates on projects and technologies that they are currently working with. Keep up to date with our progress by visiting regularly, or following us on Twitter.


Errata and amendments:

This post is currently up to date.

New Posts
  • We've been working hard on Parquet.NET to give developers high-level abstractions over Parquet, so that there is an easy entry point into developing with Parquet that isn't onerous for developers new to the format. To that end we've created ParquetConvert, which allows the trivial creation of Parquet files from an IEnumerable<T>. Check out the gist I created that shows how to do this: a simple class can be serialized so easily that, for the consuming developer, there is next to no code. With a single call to ParquetConvert.Serialize we can save a collection of these elements in Parquet, and inspecting the result with Parq confirms the output. We'll be continuing to add features that help developers maximise their productivity with Parquet, while retaining the low-level features that allow complete control of the Parquet format from dotnet.
  • https://github.com/elastacloud/parquet-dotnet is about to be released in the next few days. Since v3.0 was pushed to the public it has seen a lot of interest and praise for its incredible performance boost, but there were problems as well. To recap, v3.0 was a complete rewrite of v2.0 that let you get deeper into Parquet internals, in particular an API for creating *row groups*, writing columns directly, controlling row group sizes and so on. Although this was a big improvement to the library's core, it made the library harder to use for a general audience, because v2.0 had a handy *row-based interface* for accessing data. Working with rows slows the library down, but you will eventually run into a situation where you need rows anyway, for instance when writing utilities for viewing Parquet data, or converting between Parquet and row-based formats like CSV. Therefore v3.1 *resurrects row-based access* and makes it faster and better. The way you work with rows has changed slightly, but for the most part you shouldn't notice any differences; they come into play when working with complex data structures like maps, lists and structs. Preview documentation for this feature is located at https://github.com/elastacloud/parquet-dotnet/blob/features/rows/doc/rows.md, so feel free to browse and leave feedback either on this page or by raising an issue on GitHub. We'd also like to announce that this version introduces a .NET Core Global Tool called parq. The full description is located at https://github.com/elastacloud/parquet-dotnet/blob/features/rows/doc/parq.md. Essentially it's a hassle-free way to work with Parquet files locally, and the number of supported commands will continue to grow.
  • Did you know that it's possible to extract data from Parquet files in Azure Data Lake Analytics? Well, it is, and the library has just received a couple of updates; check it out over on its GitHub page. First, the library has been brought up to the latest version of Parquet .NET (2.1.2 at the time of writing), which carries a range of updates with it; I recommend checking out some of the other posts here to find out what's been going on. The other update is in how the library handles dates. By default Parquet .NET returns DateTimeOffset values when reading dates from Parquet files, which is always the right thing to do, as anyone who has ever had to work with time zones and offsets will tell you. However, U-SQL does not support the DateTimeOffset type for extracting data, so a small change has been made to convert each DateTimeOffset value to a DateTime, allowing the data to be extracted. So have a look, give it a go, and if you find any problems please raise an issue on GitHub, or even contribute a fix yourself.
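To illustrate the ParquetConvert abstraction described in the first item above, here is a minimal sketch of serializing a collection to a Parquet file. The `Record` class and file name are illustrative, and the exact `ParquetConvert.Serialize` overload may vary between library versions:

```csharp
using System.Collections.Generic;
using Parquet;

// A plain class whose public properties map to Parquet columns.
public class Record
{
    public int Id { get; set; }
    public string City { get; set; }
}

public class Program
{
    public static void Main()
    {
        IEnumerable<Record> records = new[]
        {
            new Record { Id = 1, City = "London" },
            new Record { Id = 2, City = "Derby" }
        };

        // One call serializes the whole collection to a Parquet file;
        // the schema is inferred from Record's public properties.
        ParquetConvert.Serialize(records, "records.parquet");
    }
}
```

The output file can then be inspected with the Parq tool mentioned above.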
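The resurrected row-based access in v3.1 could be sketched roughly as follows. This is an assumption-laden example: the `Table`/`Row` type names and the `ParquetWriter.Write(Table)` call follow the preview documentation linked above, and details may differ in the released version:

```csharp
using System.IO;
using Parquet;
using Parquet.Data;
using Parquet.Data.Rows;

public class Program
{
    public static void Main()
    {
        // Define a schema, then build a table row by row instead of
        // writing columns directly against the low-level API.
        var schema = new Schema(
            new DataField<int>("id"),
            new DataField<string>("city"));

        var table = new Table(schema)
        {
            new Row(1, "London"),
            new Row(2, "Derby")
        };

        using (Stream fs = File.Create("rows.parquet"))
        using (var writer = new ParquetWriter(schema, fs))
        {
            // Row-based convenience layered on top of the columnar core.
            writer.Write(table);
        }
    }
}
```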
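The DateTimeOffset-to-DateTime conversion in the last item could look something like the sketch below. The actual conversion inside the U-SQL extractor may differ (for example, it might keep local time rather than normalising to UTC), so treat this as one plausible approach:

```csharp
using System;

public class Program
{
    public static void Main()
    {
        // Parquet .NET hands back a DateTimeOffset when reading dates,
        // but U-SQL only understands DateTime for extracted columns.
        var fromParquet = new DateTimeOffset(2017, 7, 18, 9, 30, 0, TimeSpan.FromHours(1));

        // Normalise to UTC and drop the offset before passing to U-SQL.
        DateTime forUSql = fromParquet.UtcDateTime;

        Console.WriteLine(forUSql);
    }
}
```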