Batch import of users and user attributes
For our current project, each user has 23 extra attributes. I'm working on a script that can do a batch import from an Excel file.
Problem is, I don't see a function for doing a batch save of attributes. For example, I'd like to have an array of key/value pairs of attribute names and values and do something like saveBatch($userInfo, $attrData), but I can't find a function in the code that does anything like this.
All I can see so far is saving each attribute manually, but that would lead to (1 user record + 23 attributes) * 1250 rows from Excel = 30,000 separate save queries for one import. Not the most efficient course of action for an import that might be done more than once.
Any hints, tips or lines of code in the C5 source I should take a look at?
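For reference, a minimal sketch of the saveBatch() helper described above, assuming the legacy concrete5 5.x API where UserInfo::setAttribute() takes an attribute handle and a value. saveBatch() itself is not in core, and under the hood this still runs one save per attribute:

// Hypothetical helper; concrete5 core has no saveBatch().
// Assumes $userInfo is a UserInfo object and setAttribute()
// accepts (handle, value), as in concrete5 5.x.
function saveBatch($userInfo, $attrData) {
    foreach ($attrData as $handle => $value) {
        $userInfo->setAttribute($handle, $value);
    }
}

// Usage:
// $ui = UserInfo::getByID($uID);
// saveBatch($ui, array('first_name' => 'Jane', 'department' => 'Sales'));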

I think I posted this in the wrong subforum, could an admin please move it to 'Building with Concrete5'?
I need to do the same.
The Kino plugin seems worthless(?) in terms of importing custom attributes.
Is there any documentation of the database schema so I can build my own import?
That seems reasonable for something that will probably only be done rarely. Databases are fast and shouldn't sweat ~30k updates. I'd be more concerned with wrapping each user's updates in a transaction than with the overhead of a few thousand extra queries.
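To make the transaction point concrete, a sketch of one way to wrap each imported row, assuming Loader::db() returns concrete5's ADOdb connection and that the Excel rows have already been parsed into $rows:

// One transaction per imported user, so a failed attribute save
// doesn't leave a half-imported record behind.
// $rows is assumed to be the parsed Excel data.
$db = Loader::db();
foreach ($rows as $row) {
    $db->StartTrans();
    $ui = UserInfo::add($row['user']);                // 1 user record
    foreach ($row['attributes'] as $handle => $value) {
        $ui->setAttribute($handle, $value);           // 23 attribute saves
    }
    $db->CompleteTrans();  // commits, or rolls back if FailTrans() was called
}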
If I were trying to do this more efficiently, I suppose I would create one user by hand, export the database (with mysqldump or the backup function in the Dashboard), use that SQL to reverse engineer the schema, generate a new .sql file, and import it back into the database. Maybe wrap everything up in a nice Mustache template.
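If you do generate the .sql yourself, the main efficiency win is collapsing the per-attribute saves into multi-row INSERTs. A sketch in PHP, with table and column names invented for illustration (take the real names from the schema you exported, not from here):

// Hypothetical table/column names; check your exported schema.
$values = array();
foreach ($attrData as $akID => $value) {
    $values[] = sprintf('(%d, %d, %s)', $uID, $akID, $db->qstr($value));
}
$sql = 'INSERT INTO UserAttributeValues (uID, akID, value) VALUES '
     . implode(', ', $values);
$db->Execute($sql);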
I agree with xaritas, ~30k updates really isn't that big a deal. You can raise max_execution_time in php.ini, or via ini_set() if your host allows it.
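Something like this at the top of the import script should do it, assuming the host allows overriding the limit at runtime:

// Lift the script timeout for the long-running import.
// No effect if the host disallows runtime overrides.
@ini_set('max_execution_time', 0);
@set_time_limit(0);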