Hi,
Are there any plans on the horizon to roll compression into Monet? The X-100 project looked really interesting in this regard, but as I understand it, that work has been transferred into VectorWise.
If there are no plans, is this because compression is completely antithetical to the Monet architecture (from the papers it seems that X-100 was, at least to some degree, integrated into the engine), or is it more due to lack of resources?
My motivating example here is OLAP: I frequently have one relatively large fact table and many much smaller dimension tables. If optional compression were available, it would be nice to compress all or some of the BATs for the fact table columns and have the others work as usual.
(Well, at least this sounds good; maybe it makes no sense.) Another motivation is that there seems to be a lot of anecdotal evidence of companies moving from large big-iron servers to more numerous, smaller machines, so it would be really nice to have this capability in more memory-constrained settings.
I understand at a basic level how compression conflicts with the relatively simple approach Monet uses to load BATs (e.g. memory mapping), but, dwelling in ignorance, I blithely assume there could be a solution less complex than X-100 if one were willing to accept a significant performance cost. For example: decompressing BAT data on the fly as part of a BAT iterator. I probably don't have the skills to implement even a basic on-the-fly decompression approach like this, but just wondering aloud: how hard a problem is this?
Thanks,
Jason