My requirement is to get Polarion data and store it in our SQL Server database.
I went through the Polarion SDK documentation, and I feel the web service is the way to do that.
What is the best way to read specific data from Polarion and store it in SQL Server?
The web service is very slow, and depending on the size of your data it may not be practical to export through it.
However, Polarion stores its data in an SVN repository as many small .xml files, so you can read those XML files directly from the repository.
Since Polarion's data is not stored in a database-compatible format, you need to set up your own DB schema; the transformation from the XML files should be straightforward.
You can either check out a complete Polarion project or retrieve the files on demand via http(s); the second approach will be slightly slower.
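For the transformation step, something like the following Java sketch would work once you have a checkout of the repository. The element names ("id", "title", "status") and the WORK_ITEMS target table are assumptions; adapt them to the actual structure of your work-item XML files and to your own schema.

import java.nio.file.Path;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

// Sketch: parse one checked-out Polarion work-item XML file and insert
// selected fields into SQL Server. Element names and the target table
// are assumptions -- adapt to your actual files and schema.
public class PolarionExport {
    public static void main(String[] args) throws Exception {
        Path file = Paths.get(args[0]); // a work-item .xml from the SVN checkout
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(file.toFile());

        String url = "jdbc:sqlserver://localhost;databaseName=polarion_mirror;encrypt=false";
        try (Connection con = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO WORK_ITEMS (id, title, status) VALUES (?, ?, ?)")) {
            ps.setString(1, text(doc, "id"));
            ps.setString(2, text(doc, "title"));
            ps.setString(3, text(doc, "status"));
            ps.executeUpdate();
        }
    }

    private static String text(Document doc, String tag) {
        NodeList nodes = doc.getElementsByTagName(tag);
        return nodes.getLength() > 0 ? nodes.item(0).getTextContent() : null;
    }
}

Loop the same parse-and-insert over every work-item file in the checkout; for the on-demand http(s) variant, replace the file read with an HTTP GET against the repository URL.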
I need to back up all my Salesforce data to a local database on a regular basis, so I wrote a program that calls the REST API /services/data/v53.0/sobjects to list all the sObjects, and then, for each object's name, calls /services/data/v53.0/sobjects/XXX/describe to get its fields. But I found that the fields I got did not match the fields in the Object Manager.
I've also tried using SOQL directly:
SELECT EntityDefinition.QualifiedApiName, QualifiedApiName, DataType
FROM FieldDefinition
WHERE EntityDefinition.QualifiedApiName = 'xxx'
But it still doesn't work. If I need to back up the CRM data to my own local database, what do I need to do? How do I get all the tables and all the fields and export them?
Please help me!
There are a few ways to do this, but none of them are easy. In the past I have used add-ons that connect MSSQL directly to Salesforce. One such application is purpose-built for this use case: it's called DBAmp. Unfortunately it is rather pricey. You can also connect to your Salesforce instance with integration software like Jitterbit, MuleSoft, Dell Boomi, or Talend; that approach would require building an integration tailored to each object you want to back up.
On the free side, you could use Excel to connect to your Salesforce instance and pull down whatever object you want, though this is probably not an ideal solution: Data tab > Get Data > From Online Services > From Salesforce Objects.
I have seen other solutions, like creating full-copy sandboxes every week. The last option is connecting MSSQL to Salesforce via SSIS and an ODBC connector, but that has been a pretty bad experience for me in the past; it could just be me, though.
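If you end up scripting it yourself along the lines described in the question, the two REST calls are enough to enumerate the schema. A minimal Java sketch, assuming you already have an OAuth access token and your instance URL (both placeholders here); JSON parsing is left as a comment since the responses are plain JSON:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: list every sObject, then describe each one to get its fields.
public class SchemaDump {
    public static void main(String[] args) throws Exception {
        String base  = "https://yourInstance.my.salesforce.com"; // placeholder
        String token = "<access token>";                          // placeholder
        HttpClient http = HttpClient.newHttpClient();

        HttpRequest list = HttpRequest.newBuilder()
                .uri(URI.create(base + "/services/data/v53.0/sobjects"))
                .header("Authorization", "Bearer " + token)
                .build();
        String objects = http.send(list, HttpResponse.BodyHandlers.ofString()).body();

        // Parse the "sobjects" array out of `objects`; then, for each name:
        //   GET base + "/services/data/v53.0/sobjects/" + name + "/describe"
        // and read its "fields" array (name, type, length, ...) to build
        // the matching local tables.
        System.out.println(objects);
    }
}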
I am currently architecting some integration services for a web application. External Java applications produce a data feed; the data is massaged as necessary and then inserted into a SQL Server database. The data is managed there and used as the basis for WCF and HTTP REST services, which are accessed by web applications, mobile devices, etc.
That is the current setup. I am now changing it because we have some issues with the integration between the Java system and the SQL Server database. The main issue is the standard of the data we receive: it can be missing fields, etc. The current integration is a comma-separated file placed on an FTP server; the file is picked up and processed, the data is massaged, and the data is inserted into SQL Server. Where we are currently getting "burned" is that data whose quality is not up to the necessary standard still gets inserted into the SQL Server database.
So this process is being changed, and I am looking for suggestions and recommendations both to modernize it and to make the integration more robust.
Some options that spring to mind are:
Expose a WCF service that the Java system calls; the data is passed to it via the SOAP protocol and validated in the service before being inserted into SQL Server
Move the supplied data from a comma-separated file to an XML file, and validate the XML file against a schema before the data is massaged
Any other suggestions?
Neither of your solutions is going to solve your data quality problem at its source. I'd look more critically at the applications producing the data and put validation there, in addition to validating before the INSERT into the database. You want to validate prior to INSERT because you should never trust clients, but clients ought to honor a contract when they send you data.
One advantage the web service offers that the others don't is the possibility of real-time INSERTs into the database. Let the source applications send their requests to this broker service; it validates requests and inserts them in real time. No more batch.
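If the feed moves to XML as in the second option, the contract can be enforced on the producing (Java) side with a few lines of schema validation before anything is sent or inserted. A minimal sketch, where feed.xsd and feed.xml are placeholder names:

import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

// Sketch: reject a feed file that does not conform to the agreed schema
// before any massaging or INSERT happens.
public class FeedValidator {
    public static void main(String[] args) throws Exception {
        SchemaFactory factory =
                SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new File("feed.xsd"));
        Validator validator = schema.newValidator();
        // validate() throws a SAXException, with line and column details,
        // on the first violation, so a bad feed never reaches the database.
        validator.validate(new StreamSource(new File("feed.xml")));
        System.out.println("feed.xml conforms to feed.xsd");
    }
}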
I am working on a Java program that connects to databases and serves as a design and data manipulation tool. I eventually want to port my program to an XML database of some kind.
More specifically, I want to be able to use Java to upload an XSD (representing a database) that I generate to an actual XML database (like Sedna), and then connect to this database and query/update it.
I have been exploring Sedna and the XQJ API, and I don't see how to do this. Can it be done through Java?
Edit: I don't have any data I want to map to XML or a database. I have a database structure specified in XSD, and I just want to create the corresponding database on a server somewhere.
There are at least two frameworks worth trying for XML-to-database mapping:
Castor XML: http://www.castor.org
Hibernate: http://docs.jboss.org/hibernate/orm/3.3/reference/en/html/xml.html
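For the query/update half of the question, plain XQJ is enough once you have a vendor XQDataSource. A minimal sketch; the Sedna driver class name and connection properties are assumptions based on the third-party xqj.net driver, so check your driver's documentation:

import javax.xml.xquery.XQConnection;
import javax.xml.xquery.XQDataSource;
import javax.xml.xquery.XQExpression;
import javax.xml.xquery.XQResultSequence;

// Sketch: connect to an XML database through XQJ and run an XQuery.
public class XqjDemo {
    public static void main(String[] args) throws Exception {
        // Assumed driver class from xqj.net; substitute your vendor's.
        XQDataSource ds = (XQDataSource) Class
                .forName("net.xqj.sedna.SednaXQDataSource")
                .getDeclaredConstructor().newInstance();
        ds.setProperty("serverName", "localhost"); // assumption
        ds.setProperty("databaseName", "mydb");    // assumption

        XQConnection conn = ds.getConnection("SYSTEM", "MANAGER");
        XQExpression expr = conn.createExpression();
        XQResultSequence rs = expr.executeQuery("doc('mydoc')//item");
        while (rs.next()) {
            System.out.println(rs.getItemAsString(null));
        }
        conn.close();
    }
}

Creating the database itself from your XSD is usually a vendor-specific administrative step rather than part of XQJ, so that part stays outside this sketch.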
I have created an app that keeps a large set of data in the form of XML files inside its Documents folder. The data is large and growing day by day, so I am planning to move it to a SQLite DB; I also want it in a SQLite DB for security purposes. I have around 1000 XML files currently, and the number may grow. My main question is whether all the data inside the XML files can be moved into a SQLite DB by a backend system (.NET Framework or Java), with the complete database then pushed to the iPhone via a web service, so that no XML parsing happens on the iPhone. I have heard that XML parsing is more resource-intensive than reading from a SQLite DB on the iPhone. Is this a feasible solution, or is a better approach available?
Don't transport the entire data set each time. Have the iOS client request only the changes since it last synced, and have it update its local database. Processing multiple XML documents should be fine as long as the app can synchronize in the background while the user continues to use it.
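On the backend (either .NET or Java would do), the sync endpoint reduces to a timestamp filter. A minimal Java/JDBC sketch, where the items table and its modified_at column are hypothetical; any "last modified" marker works:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;

// Sketch: return only the rows changed since the client's last sync.
public class DeltaSync {
    public static void printChangesSince(Connection con, Timestamp lastSync)
            throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT id, payload, modified_at FROM items WHERE modified_at > ?")) {
            ps.setTimestamp(1, lastSync);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // Serialize each changed row for the client, which then
                    // applies it to its local SQLite database.
                    System.out.println(rs.getString("id") + " changed at "
                            + rs.getTimestamp("modified_at"));
                }
            }
        }
    }
}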
In the application I'm writing, the server holds information about the users in an XML database. The admin user is able to read/write information in those files too.
How can I deal with concurrent access to those files?
The only way users and the admin can read/write those files is by sending requests to the server (sockets, TCP connection), so the server has to handle this.
What can I do? I could synchronize the server methods, but I don't want to block USER A from accessing his files while the admin is writing to USER B's files.
My first suggestion is to use a database instead of files; databases handle locking already.
You should post an example of your file structure. If USER A has his data in fileA.xml and USER B has his in fileB.xml, it could be done by locking the given file and synchronizing based on that.
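A minimal Java sketch of that per-file locking, assuming one XML file per user: a map from file name to lock object serializes access within a file without blocking access to other files.

import java.util.concurrent.ConcurrentHashMap;

// Sketch: requests touching fileA.xml never block requests touching
// fileB.xml, because each file name maps to its own lock object.
public class FileLocks {
    private static final ConcurrentHashMap<String, Object> LOCKS =
            new ConcurrentHashMap<>();

    public static void withFileLock(String fileName, Runnable action) {
        Object lock = LOCKS.computeIfAbsent(fileName, k -> new Object());
        synchronized (lock) {
            action.run(); // read or write the XML file here
        }
    }
}

Usage: FileLocks.withFileLock("fileB.xml", () -> { /* admin write */ }); a concurrent request for fileA.xml proceeds without waiting.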
As Jes says, use a database.
MySQL supports XML: http://dev.mysql.com/tech-resources/articles/xml-in-mysql5.1-6.0.html
Most databases support XML, or you could simply use a VARCHAR column that is long enough and get and put the data there. If that is your plan, then maybe a NoSQL solution would work as well; it is essentially a persistent HashMap that supports record locking along with other features.
It sounds like there is no conflict between users. What you could also do is have an area where the admins modify the files, and copy it daily to the location the users' data is read from.