There is no "dependency" in terms of "running" Photon/Spark execution engines.
I take your question to be about the impact of the number of steps when a recipe runs on each engine. The number of steps does have a significant impact, depending on the size of the Spark cluster and other Spark properties. Photon is an in-memory execution engine that runs inside the Chrome browser, so it naturally depends on the memory and processor resources available to the user's browser.
The general rule of thumb is to keep the steps in a recipe to a minimum and make them efficient. A common best practice is to keep a recipe to about 20 steps.
That answers my question.
Thank you for your response.
No problem