Connected Streams are one option, but they may be overkill in your scenario if your CSV never refreshes. If the CSV is small enough (in number of records), you can parse it in the driver, load it into a serializable object, and pass that object to the constructor of the operator that processes the stream.
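A minimal sketch of that idea in plain Java (class and field names here are illustrative, not from the Flink API): the CSV is parsed once up front into a `Serializable` lookup object, which the operator receives through its constructor — Flink serializes the whole function instance, fields included, and ships it to the workers.

```java
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

// Serializable holder for the parsed CSV, safe to pass into an operator's
// constructor (the function is serialized with its fields).
class CsvLookup implements Serializable {
    private final Map<String, String> byKey = new HashMap<>();

    // Parse "key,value" lines; a real parser would handle quoting etc.
    static CsvLookup parse(String csvText) {
        CsvLookup lookup = new CsvLookup();
        for (String line : csvText.split("\n")) {
            if (line.isEmpty()) continue;
            String[] cols = line.split(",", 2);
            lookup.byKey.put(cols[0], cols[1]);
        }
        return lookup;
    }

    String get(String key) {
        return byKey.get(key);
    }
}

// Stand-in for a streaming operator: the lookup arrives via the constructor.
class EnrichFunction implements Serializable {
    private final CsvLookup lookup;

    EnrichFunction(CsvLookup lookup) {
        this.lookup = lookup;
    }

    // Enrich an incoming event key with the matching CSV value.
    String map(String eventKey) {
        String extra = lookup.get(eventKey);
        return eventKey + ":" + (extra == null ? "?" : extra);
    }
}
```

In an actual Flink job, the equivalent would be a `MapFunction` constructed the same way and applied with `stream.map(new EnrichFunction(lookup))`.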
If the CSV can be made available via a shared network folder (or S3 in the case of AWS), you could instead read it in the open() method, provided you use the Rich variant of the operator (e.g. RichMapFunction).
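A simplified stand-in for that lifecycle (in Flink this would be a `RichMapFunction` whose `open()` hook runs once per parallel task before any records arrive; here it is a plain class with the same shape, and the path is whatever shared location you mount):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

// Stand-in for a Rich operator: open() runs once before any records are
// processed, so the file is read on the worker itself, not in the driver.
class CsvEnricher {
    private final Path csvPath;          // shared folder / mounted S3 path
    private Map<String, String> lookup;  // built in open(), never serialized

    CsvEnricher(Path csvPath) {
        this.csvPath = csvPath;
    }

    // Mirrors RichFunction.open(): load the CSV from the shared location.
    void open() throws IOException {
        lookup = new HashMap<>();
        for (String line : Files.readAllLines(csvPath)) {
            if (line.isEmpty()) continue;
            String[] cols = line.split(",", 2);
            lookup.put(cols[0], cols[1]);
        }
    }

    // Enrich an event using the table loaded in open().
    String map(String eventKey) {
        return eventKey + ":" + lookup.getOrDefault(eventKey, "?");
    }
}
```

The advantage over the constructor approach is that only the path travels with the serialized function; the data itself is fetched fresh when each task starts.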
The real question, I think, is how frequently the CSV updates. If you want updates to propagate in near real time (or on a schedule), option 1 (parse in the driver and pass via the constructor) does not work. With the second option, you are also responsible for refreshing the data read from the shared folder yourself.
In that case, use Connected Streams: one stream reads the events, while the other periodically re-reads the file and sends its contents downstream. The refresh interval is your tolerance for stale data in the CSV.
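As a rough sketch of the file-side behaviour (in Flink this would be a source that re-emits the file contents, connected to the event stream and merged in e.g. a CoFlatMapFunction; here a plain-Java stand-in using a scheduler and an atomic snapshot, with all names illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

// Periodically re-reads the CSV; the event side always joins against the
// latest snapshot, so staleness is bounded by the refresh interval.
class RefreshingCsvSource {
    private final Path csvPath;
    private final AtomicReference<Map<String, String>> snapshot =
            new AtomicReference<>(new HashMap<>());
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    RefreshingCsvSource(Path csvPath) {
        this.csvPath = csvPath;
    }

    // Re-read the file and atomically swap in the new snapshot.
    void refresh() throws IOException {
        Map<String, String> fresh = new HashMap<>();
        for (String line : Files.readAllLines(csvPath)) {
            if (line.isEmpty()) continue;
            String[] cols = line.split(",", 2);
            fresh.put(cols[0], cols[1]);
        }
        snapshot.set(fresh);
    }

    // Schedule refreshes at your chosen staleness tolerance.
    void start(long intervalSeconds) {
        scheduler.scheduleAtFixedRate(() -> {
            try { refresh(); } catch (IOException ignored) { }
        }, 0, intervalSeconds, TimeUnit.SECONDS);
    }

    // What the event stream would look up against.
    String lookup(String key) {
        return snapshot.get().getOrDefault(key, "?");
    }

    void stop() {
        scheduler.shutdownNow();
    }
}
```

The atomic swap matters: readers always see either the complete old table or the complete new one, never a half-loaded map — which is also the property the connected-stream version gives you per record.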