Conversation
```yaml
mainApplicationFile: local:///stackable/spark/jobs/spark-ingest-into-lakehouse.py
deps:
  packages:
    - org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.5.0
```
I guess we want to bump this to 1.6.1 once we do the actual version bump, just before 24.11 is out.
What did you mean by 1.6.1? Ah, I see it now: Iceberg.
Done. I checked and it exists:
```console
$ mvn org.apache.maven.plugins:maven-dependency-plugin:2.1:get -DrepoUrl=https://repo.maven.apache.org -Dartifact=org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.6.1
[INFO] Scanning for projects...
...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
```
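For reference, the bumped entry would then presumably look like this (a sketch based on the snippet above, assuming only the Iceberg runtime version changes):

```yaml
# Demo SparkApplication dependency block with the verified 1.6.1 artifact.
mainApplicationFile: local:///stackable/spark/jobs/spark-ingest-into-lakehouse.py
deps:
  packages:
    - org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.6.1
```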
So we now have some demos on Iceberg 1.5 and some on 1.6 ;). And I assume we did not test the change by running the big lakehouse demo? ;)
That's, by the way, why I'm not a fan of the next branch: the person doing the actual 24.11 bumps still needs to grep for every tool and every version and bump them (see the sketch below), as we cannot rely on PRs into next (think of e.g. Iceberg, Postgres, MinIO or Redis versions). And test the demos after the bumps. And test the actual SDP bump ;)
All intermediate PRs (like this one), in my personal opinion (sorry^^), just create noise and effort and move the next branch into a partially updated, untested state. Running the demos for every change into the next branch IMHO is not worth the effort; e.g. it takes over an hour to test the lakehouse demo.
Anyway, I will not veto any PRs going into next (as you can see, I approved this one) or tell anyone to stop; I just wanted to express my concerns :)
So no need to change or fix up anything; all of this will be done properly by the 24.11 bump 😅
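To illustrate the sweep described above, something like the following could be a starting point (a hypothetical sketch; the directory names and version patterns are assumptions, not the actual repo layout):

```console
# Hypothetical sweep for the 24.11 bump: find every pinned tool version
# (Iceberg, Postgres, MinIO, Redis, ...) that still needs updating.
$ grep -rnE 'iceberg-spark-runtime|postgresql?:[0-9]|minio|redis' demos/ stacks/
```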
part of stackabletech/docker-images#838