Sharing data across virtual environments in Python


I am trying to consolidate data from several different sources into a Tableau .hyper file using Pantab. One of my sources is Snowflake. The problem is that Pantab and Snowflake require non-overlapping versions of PyArrow. By using virtual environments, I can write scripts for each of them separately, but that doesn't get me to what I need. I need the output Pandas DataFrame from Snowflake (Script #1 in Environment #1) to be used as input to the Tableau script (Script #2 in Environment #2).

I can always have Script #1 export an Excel or CSV file and then import it in Script #2, but I'd rather not have those extra files lying around unless there's no other way. There is a lot of transformation that happens prior to creating the .hyper file, so it's not something that lends itself to just doing in Tableau.
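To clarify the kind of handoff I mean, here is a minimal sketch of what I have in mind: Script #1 writes the DataFrame to a temporary file and then invokes Script #2 using Environment #2's own interpreter, so nothing permanent is left on disk. All paths and script names below are placeholders, and pickle is only used here because it avoids PyArrow at the handoff point.

```python
import subprocess
import tempfile
from pathlib import Path

import pandas as pd


def run_script_2(df: pd.DataFrame) -> None:
    """Serialize the Snowflake DataFrame and invoke Script #2 in the other venv."""
    with tempfile.TemporaryDirectory() as tmp_dir:
        handoff = Path(tmp_dir) / "snowflake_output.pkl"
        # Pickle avoids PyArrow entirely, sidestepping the version conflict.
        df.to_pickle(handoff)
        subprocess.run(
            [
                # Placeholder path to Environment #2's Python interpreter.
                r"C:\venvs\tableau_env\Scripts\python.exe",
                "script2_build_hyper.py",  # placeholder name for Script #2
                str(handoff),              # Script #2 reads this with pd.read_pickle()
            ],
            check=True,
        )
        # The temporary directory (and the pickle) is deleted when this block exits.
```

One caveat with this approach: pickled DataFrames are only reliably readable if both environments run compatible pandas versions, so a plain CSV in the same temporary directory may be the safer interchange format if the pandas versions differ.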

I’m currently doing the work in Tableau Data Prep, but it takes over an hour out of my day to babysit it.

