I have a client-server program split into three NetBeans projects: 1. client, 2. server, 3. common classes.
I would like to push them to one Bitbucket Git repository, but after pushing one of the projects I can't push the others:
Remote Repository Updates
Branch : master
Old Id : dfe16274e865383a78be5ddc294d828650cd73b0
New Id : 855f9f0355e9151330e82948cfda8bfc7a412d65
Result : REJECTED_NONFASTFORWARD
Local Repository Updates
Branch : origin/master
Old Id : null
New Id : 855f9f0355e9151330e82948cfda8bfc7a412d65
Result : NOT_ATTEMPTED
Note that there are no package or file name conflicts between these projects.
Since there seems to be nothing equivalent to a Visual Studio Solution for grouping related projects in NetBeans, is it a good idea and a correct use of Git to push all three projects into one repository?
How do I push the other two projects to my repository?
Git version: 1.8.3.1, NetBeans version: 7.3
You need to create a single repository from the parent folder. I would suggest putting the three project folders into one folder, say myproject, then initializing the repository inside myproject and pushing that to the Bitbucket repository. This way, changes in any of the projects will be pushed to your repository.
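A minimal sketch of that setup, assuming the three project folders are named client, server, and common, and using a placeholder Bitbucket URL:
mkdir myproject
mv client server common myproject/
cd myproject
git init
git add .
git commit -m "Import client, server and common projects"
# point the new repository at a fresh, empty Bitbucket repository and push everything
git remote add origin https://bitbucket.org/youruser/myproject.git
git push -u origin master
Pushing all three projects as one repository to an empty remote also avoids the non-fast-forward rejection, which occurred because the second push shared no history with what was already on Bitbucket.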
Current setup:
Maven
Spring Boot
Angular
12 Maven projects + 1 POM project
Git
Jenkins
Currently I have 13 separate projects.
Every time I have to set up on another machine, I have to clone each project individually.
I have also set up a Jenkins pipeline for automatic builds.
Looking for
Once I clone the single POM/main project, all other projects should get cloned.
If I pull the main project, the code of all other projects should get updated.
Child projects should get the head revision every time I start the Jenkins build.
I want to keep a separate repository for each sub-project so that it can be reused in other projects, as I have multiple requirements.
Solutions that I have thought of:
Maven child projects
Git submodules
Issues with the above solutions:
1. Maven child projects
--> If we use Maven child projects, we are literally copying the sub-projects into the POM/main project.
In this case I cannot reuse a project; every time I have to clone it to use it.
For a big project with many team members working on it, merge issues will arise.
2. Git submodules
--> This is the best solution for me, but a submodule always points to a specific commit, not the head revision.
Every time I have to manually pull the latest code and push it to the parent project.
We cannot use this approach in Jenkins.
Is there a best approach that can resolve all these issues?
You can add Git submodules by specifying which modules to add from other repositories. A .gitmodules file will be created at the root that contains the list of all submodules and their settings.
Creating git submodules for projects in different repositories
There is a set of commands available that you can use to sync up the main Git module and all its submodules.
The steps you could take are:
Add each submodule to the project using git submodule add [ssh or https path], e.g. git submodule add https://github.com/test
Add the other submodules and push the changes
Fetch submodule content by running
git submodule init
git submodule update
or alternatively run git clone --recurse-submodules https://github.com/chaconinc/MainProject to get the submodules as well
I would recommend that you read through this link and make sure it fulfills your requirements.
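A minimal sketch of that workflow, with placeholder repository names and URLs:
# in the parent (POM/main) repository: add each sub-project as a submodule
git submodule add https://bitbucket.org/yourteam/project-core.git project-core
git submodule add https://bitbucket.org/yourteam/project-web.git project-web
git commit -m "Add sub-projects as submodules"
git push

# on another machine: clone the parent and all submodules in one step
git clone --recurse-submodules https://bitbucket.org/yourteam/main-project.git

# or, in an existing clone, fetch the submodule content
git submodule init
git submodule update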
I am setting up a Maven repository for my workplace. The main reason for setting up a new private repository is that our original Nexus repo is hidden behind a VPN, so anyone who wants to pull the dependencies needs a VPN connection on their machine. The goal is to copy the VPN-secured artifacts into the newly created repo so they are available to the CI pipeline.
Right now I am doing it as follows, listing all the dependencies using the following command:
mvn -Dmdep.copyPom=true dependency:copy-dependencies
which copies all the jars into target/dependency.
Then I use the following Maven command:
mvn deploy:deploy-file
which pushes all the dependencies to my newly created Nexus. The issue is that when running deploy-file I pass a generic group ID, so all the jars are pushed under that group ID, which differs from their real group IDs. Different packages belong to different group IDs, and pushing all of them under the same group ID makes them unusable because it messes up the namespace.
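For reference, a rough sketch of the per-jar command I run (the coordinates, repository ID, and URL are placeholders):
mvn deploy:deploy-file \
  -Dfile=target/dependency/some-library-1.2.3.jar \
  -DgroupId=com.mycompany.thirdparty \
  -DartifactId=some-library \
  -Dversion=1.2.3 \
  -Dpackaging=jar \
  -DrepositoryId=my-nexus \
  -Durl=https://nexus.example.com/repository/maven-releases/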
I am new to Maven, and the approach I took feels like a hack. Can I use native Maven functionality to solve the problem? What would be the standard way to solve this issue?
Nexus has a concept of proxy repositories. For the open-source dependencies, you can create a proxy repository that mirrors Maven Central, put the proxy information into your Maven settings, and run mvn install; during dependency resolution Nexus will automatically pull artifacts from Central and cache them in your proxy repository for future use.
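A minimal settings.xml sketch that routes all dependency requests through such a Nexus instance (the host name and repository path are assumptions, not your actual setup):
<settings>
  <mirrors>
    <mirror>
      <id>nexus</id>
      <!-- send every repository request to the Nexus group that includes the Central proxy -->
      <mirrorOf>*</mirrorOf>
      <url>https://nexus.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>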
Hi, I am using Jenkins and Bitbucket, and I want to trigger a build in Jenkins whenever I commit anything to the Bitbucket repository.
In Jenkins:
I created a project called test_1
In the configure section, under Build Triggers, I ticked Trigger builds remotely
I added a token TEST_TOKEN
When I open this URL in my browser, the Jenkins build is triggered:
http://test.com:8080/job/test_1//build?token=TEST_TOKEN
In Bitbucket:
I added a Jenkins hook
Endpoint: http://test.com:8080/job/test_1//build?token=TEST_TOKEN
Module name: empty
Project name: test_1
Token: empty
Then I committed some code to Bitbucket via Git, but the Jenkins build is not running; it seems the trigger is not firing. How do I solve this problem? Please help me. Thanks in advance :)
I had the same problem. @fmitchell is correct with his suggestions for these fields.
But it didn't work for me.
I use a normal POST hook instead, where I provide the whole URL:
http://USER_NAME:USER_TOKEN@YOUR.JENKINS.URL.COM:YOUR_PORT/job/YOUR_PROJECT_NAME/build?token=some_token_from_jenkins
e.g. http://bob.miller@jenkins.example.com:8080/job/test_1/build?token=TEST_TOKEN
It seems that Bitbucket is missing the last path segment "build" in the URL it creates, but I can't tell for sure.
------Update------
I found a better solution, where you not only trigger your build but can also build different branches with different Jenkins projects:
Install the Bitbucket Plugin on your Jenkins
Add a normal POST hook to your Bitbucket repository (Settings -> Hooks) and use the following URL:
https://YOUR.JENKINS.SERVER:PORT/bitbucket-hook
Configure your Jenkins project as follows:
under Build Triggers, enable Build when a change is pushed to BitBucket
under Source Code Management, select Git; enter your credentials and define Branches to build (like **feature/*)
This way I have three build projects: one for all features, one for develop, and one for the release branch.
And best of all, you don't have to add new hooks for new Jenkins projects.
It should be:
Endpoint : http://test.com:8080/
Module name:
Project name: test_1
Token: TEST_TOKEN
I have a Project A which depends on Project B; both are internal projects in active development.
Say the latest Project A release is 1.1.2, which depends on Project B 1.1.1.
Now we are developing Project A 1.2.0, which depends on Project B 1.2.0, also in development.
<dependency org="my.org" name="projectB" rev="1.2.0" transitive="true" conf="..." changing="true"/>
New integration builds of Project B 1.2.0 are pushed by the CI server to the common repository, so thanks to "changing" everyone gets the latest integration builds as soon as they are published.
Say Bob is developing a new feature in Project A that requires some modifications to Project B; he publishes a new snapshot of Project B 1.2.0 in his private local repository, and his version is picked up in the build because it is more recent than the one in the common repository. So far, all is OK.
But if Alice commits something to Project B, the CI server pushes a new 1.2.0 to the common repo, which is more recent than the one Bob has locally; now Bob gets the common version, which overrides his local changes.
Of course, I could use different names (using property files cleverly so that the name does not have to end up in ivy.xml), something like 1.2.0_snapshot for Bob as long as he needs the local version, and then switch back to 1.2.0 when the common version is OK.
But isn't there a way to force using artifacts whose status is "snapshot" (which will always be the status of local builds) over those whose status is "integration" (which the CI server's builds will always have) or higher?
I tried "latest.snapshot", but it picks the integration version if it is more recent.
What is the best way to deal with this pattern?
I think you want your local resolver to be in "force mode": set force="true" on your local resolver in ivysettings.xml.
See the description of force mode at: http://ant.apache.org/ivy/history/latest-milestone/settings/resolvers.html
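A minimal ivysettings.xml sketch, assuming a filesystem resolver named "local" for private builds chained before the shared repository (the paths, host names, and resolver names are placeholders):
<ivysettings>
  <settings defaultResolver="default"/>
  <resolvers>
    <chain name="default" returnFirst="true">
      <!-- force="true": if the local resolver finds any revision, it wins,
           even if the common repository has a more recent integration build -->
      <filesystem name="local" force="true">
        <ivy pattern="${user.home}/.ivy2/local/[organisation]/[module]/[revision]/ivy.xml"/>
        <artifact pattern="${user.home}/.ivy2/local/[organisation]/[module]/[revision]/[artifact].[ext]"/>
      </filesystem>
      <url name="shared">
        <ivy pattern="http://repo.example.com/[organisation]/[module]/[revision]/ivy.xml"/>
        <artifact pattern="http://repo.example.com/[organisation]/[module]/[revision]/[artifact].[ext]"/>
      </url>
    </chain>
  </resolvers>
</ivysettings>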
I have an elderly CVS repository which we would like to convert to Git once and for all, keeping the full history.
All folders at the root of the repository contain Eclipse projects (either plain or dynamic web projects), including .classpath and .project files. We use Team Project Sets to check out the projects we need for a given task (the project set is located in the project containing the main program, and the rest are library projects).
When the Team Project Set is checked out, the workspace is fully populated.
This approach has worked pretty well for many years (except the project set part, which came with Eclipse 3.5), and we would like to work in a similar way with Git if possible, but we are uncertain how.
I've played somewhat with git cvsimport, but it failed, probably because we do not use modules.
How would you suggest we do this, and how should we work with Git to allow our current usage of shared library projects? Would we HAVE to introduce Maven and create Maven modules for our library projects? Or just Ant and Ivy?
EDIT: I've now managed to convert our CVS repository to Subversion with a suitable cvs2svn invocation and found that Eclipse recognizes the resulting Subversion repository nicely. Unfortunately, after cloning http://github.com/iteman/svn2git and trying to run bin/svn2git I get:
tra@Sandbox:~/cvsgit/svn2git/svn2git$ bin/svn2git
bin/svn2git:35:in `initialize': wrong number of arguments (2 for 1) (ArgumentError)
from bin/svn2git:35:in `new'
from bin/svn2git:35
This is with Ubuntu 10.04.1 LTS Server. I've tried various sudo things with Ruby and its gems without fully understanding what I did, as I am not a Ruby programmer, so I may have messed things up a bit. I'd appreciate advice; if the easiest option is to install another Linux variant to do the conversion, that is fine.
EDIT:
https://help.ubuntu.com/community/Git
http://css.dzone.com/articles/subversion-git-morning
EDIT: My first try with the default svn2git completed successfully (after a while), and I got a nice repository where git branch -a reports roughly:
tra@Sandbox:~/gitroot/svnroot$ git branch -a
* master
remotes/XX64_DEPLOYED_CODE
remotes/Beta1
remotes/Beta2
remotes/SV46
... lots more
We are interested in being able to check out the SV46 branch and work with it (we basically do not care about the tags, just the actual branches). I have set up gitosis and pushed this repository to it, then cloned it to another computer to find out how to do the "work with SV46" bit with Eclipse. THAT repository does not know about all the branches:
tra@TRA ~/git/git00 (master)
$ git branch -a
* master
remotes/origin/HEAD -> origin/master
remotes/origin/master
Do I need to massage the original result from svn2git to get the information into the gitosis repository? Do I need to clone with an argument? Should I redo the svn2git step with the suggested version instead of the one shipping with Ubuntu?
EDIT: It turned out that publishing the svn2git-generated repository with "git push --mirror" made things show up in the gitosis repository. I now see the following inside gitosis (trimmed):
tra@Sandbox:/srv/gitosis/repositories/git01.git$ git branch -a
* master
remotes/XX64_DEPLOYED_CODE
remotes/Basic_Beta1
remotes/Beta1
remotes/Beta2
remotes/SV46
... lots more
tra@Sandbox:/srv/gitosis/repositories/git01.git$ git branch
* master
tra@Sandbox:/srv/gitosis/repositories/git01.git$ git tag -l
tra@Sandbox:/srv/gitosis/repositories/git01.git$
Trying to clone this repository with git clone gitosis@sandbox:git01 -b remotes/SV46 or git clone gitosis@sandbox:git01 -b SV46 both tell me that the remote branch was not found in upstream origin, so HEAD is used instead.
Am I barking up the wrong tree?
First of all, using submodules for independent parts of your central VCS repository (i.e. your CVS repo) is always good (see "What are the Git limits?").
That means you will end up with many independent Git repos, that is, "sets of files evolving independently of one another", which is why submodules exist.
So multiple Git imports (into multiple repos) are required.
But since git cvsimport is not always up to the task, I would recommend:
cvs2svn in order to get an SVN repo first (just one repo)
svn2git in order to properly convert your SVN repo to a Git one (i.e. transforming SVN branches into Git branches and SVN tags into Git tags)
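A rough sketch of those two steps, assuming the CVS repository lives in /srv/cvsroot (all paths are placeholders):
# 1. CVS -> SVN (one repository)
cvs2svn --svnrepos /srv/svnroot /srv/cvsroot

# 2. SVN -> Git: svn2git turns SVN branches and tags into Git branches and tags
mkdir myproject-git && cd myproject-git
svn2git file:///srv/svnroot --verbose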