This is possible with a DS custom function, provided your data resides in a database. And once the data is in a database, there is a much more effective approach: a database-level function. That avoids thousands of round trips between DS and the database.
Build a data flow to copy your file into a database table. Make sure you add a column with a sequence number; use the built-in gen_row_num_by_group function to generate it.
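To make the sequence-number step concrete, here is a minimal Python sketch of what gen_row_num_by_group does: it numbers rows 1, 2, 3, ... and restarts from 1 whenever the group key changes (the column names `grp`, `userid`, and `seq_number` are illustrative, not from DS).

```python
from itertools import groupby

def gen_row_num_by_group(rows, key):
    """Mimic DS gen_row_num_by_group: number rows 1, 2, 3, ...
    restarting at 1 each time the group key changes.
    Rows must already be sorted by the group key."""
    out = []
    for _, members in groupby(rows, key=key):
        for seq, row in enumerate(members, start=1):
            out.append({**row, "seq_number": seq})
    return out

rows = ([{"grp": "A", "userid": f"a{i}"} for i in range(1, 4)]
        + [{"grp": "B", "userid": f"b{i}"} for i in range(1, 3)])
numbered = gen_row_num_by_group(rows, key=lambda r: r["grp"])
# group A gets 1, 2, 3; group B restarts at 1, 2
```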
Write a database function with two input parameters (group and starting sequence number) that concatenates (up to) 500 userids, and import it into DS.
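The actual function would be written in your database's procedural language (PL/SQL, T-SQL, etc.); as a language-neutral sketch of the logic, this hypothetical Python equivalent collects the userids of one group whose sequence number falls in a 500-row window and returns them as a single comma-separated string (a small batch size is used here only to keep the example short):

```python
def concat_userids(table, group, start_seq, batch=500):
    """Concatenate the userids of `group` whose seq_number lies in
    [start_seq, start_seq + batch - 1], in sequence order."""
    ids = [r["userid"] for r in table
           if r["grp"] == group
           and start_seq <= r["seq_number"] < start_seq + batch]
    return ",".join(ids)

table = [{"grp": "A", "userid": f"u{i}", "seq_number": i}
         for i in range(1, 8)]
first_batch = concat_userids(table, "A", 1, batch=5)   # u1..u5
rest = concat_userids(table, "A", 6, batch=5)          # u6, u7
```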
Build a second data flow that uses the table as a source. Add a Query transform that reads records #1, #501, #1001, ... from the table (use the where-clause mod(seq_number, 500) = 1) and calls the function. Write the output to a file or another database table.
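Putting both pieces together, a sketch of the second data flow's logic (again in Python, with a batch size of 5 instead of 500 for brevity): filter to the rows where mod(seq_number, batch) = 1, and emit one concatenated output row per batch.

```python
def build_output(table, batch=500):
    """For each row whose seq_number is 1, 501, 1001, ...
    (i.e. mod(seq_number, batch) == 1), concatenate that batch's
    userids and emit one output row per batch."""
    out = []
    starts = [r for r in table if r["seq_number"] % batch == 1]
    for s in starts:
        ids = ",".join(
            x["userid"] for x in table
            if x["grp"] == s["grp"]
            and s["seq_number"] <= x["seq_number"] < s["seq_number"] + batch)
        out.append({"grp": s["grp"], "userids": ids})
    return out

table = [{"grp": "A", "userid": f"u{i}", "seq_number": i}
         for i in range(1, 12)]
result = build_output(table, batch=5)
# three batches: u1..u5, u6..u10, u11
```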