Context
Here at the company we have a very complex query that encompasses several business rules, used for presentation in a workflow; for now I am evaluating it in SQL Server.
The performance of this query was poor, and as the tables grew, the DBA warned us to re-evaluate it because it was becoming very costly for the database.
When I started, the query had a cost of [Estimated SubTreeCost = 19.891]. From there I began evaluating it, removing some optional fields, unnecessary LEFT JOINs, and subselects in the projection, and the cost increased instead of decreasing [Estimated SubTreeCost = 20.232].
When I ran the reduced query, I got the result in 23s, while the original query (the one with the lower cost) took 1min 18s.
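For reference, this is roughly how I measured the timings, by enabling runtime statistics before each run (a sketch; `MyComplexQuery` is a placeholder for our actual query, which I can't share):

```sql
-- Show actual CPU/elapsed time and I/O per statement in the Messages tab
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

-- Placeholder for the real workflow query
SELECT * FROM MyComplexQuery;

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```

I also compared the estimated plans side by side in SSMS ("Display Estimated Execution Plan") to read the Estimated SubTreeCost values quoted above.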
Question
I'm not very experienced at evaluating query execution plans, so a doubt came up: what is the cost based on? The amount of data returned? The run time? The cost to the database server?
Can the higher-cost query actually be better than the lower-cost original one?
I cannot share the query because it contains sensitive data, and it is so complex that it would be difficult to create a similar fake query.