#!/usr/bin/env python3

"""
Rudder development tool.

You need a configuration file, if you don't have one, the tool will create one for you at first run.

Usage:
    rudder-dev -h|--help
    rudder-dev [-d|--debug] [-f|--force] clone <repository_name> [--fork]
    rudder-dev [-d|--debug] [-f|--force] pull [<branch>]
    rudder-dev [-d|--debug] [-f|--force] branch <ticket_id> [--base=<ticket_id>]
    rudder-dev [-d|--debug] [-f|--force] quickfix <pr_url/file> [<subject>|<trigraph>] [--merge]
    rudder-dev [-d|--debug] [-f|--force] technique <version> <comment>
    rudder-dev [-d|--debug] [-f|--force] subtask <new_title> [<next_branch>|--bug] [--base=<ticket_id>]
    rudder-dev [-d|--debug] [-f|--force] wip [--nopr|<trigraph> [<PR_comment>]]
    rudder-dev [-d|--debug] [-f|--force] commit [--nopr|<trigraph> [<PR_comment>]]
    rudder-dev [-d|--debug] [-f|--force] amend [<PR_comment>]
    rudder-dev [-d|--debug] [-f|--force] fixup [<PR_comment>]
    rudder-dev [-d|--debug] [-f|--force] rebase [--base=<ticket_id>] [<PR_comment>]
    rudder-dev [-d|--debug] [-f|--force] retarget [<target_version>]
    rudder-dev [-d|--debug] [-f|--force] takeover <ticket_id>
    rudder-dev [-d|--debug] [-f|--force] revert <ticket_id> [--retarget]
    rudder-dev [-d|--debug] [-f|--force] merge all [(-s <strategy>)] [-a|--automatic] [-t|--test]
    rudder-dev [-d|--debug] [-f|--force] merge <first_branch/ticket_id/pr_url> [(-s <strategy>)] [-a|--automatic] [-t|--test] [--no-autosquash]
    rudder-dev [-d|--debug] [-f|--force] merge <first_branch> <next_branch> [(-s <strategy>)] [-a|--automatic] [-t|--test]
    rudder-dev [-d|--debug] [-f|--force] merge [(-s <strategy>)] [-a|--automatic] [-t|--test] [--no-autosquash]
    rudder-dev [-d|--debug] [-f|--force] find <command>
    rudder-dev [-d|--debug] [-f|--force] cleanup [--more] [-n|--dry-run]
    rudder-dev [-d|--debug] [-f|--force] update
    rudder-dev [-d|--debug] [-f|--force] blame <file> [--before=<commit_id>] [--changed-after=<commit_id>] [--long]
    rudder-dev [-d|--debug] [-f|--force] <smart_arg> [<PR_message>] [--base=<ticket_id>]

SMART
    <smart_arg> is a shortcut for other commands
    - if it is a number > 100, it is guessed to be a ticket_id -> branch
    - if it is a x.y or master, it is guessed to be a branch -> pull
    - if it is a caps TLA, it is guessed to be a trigraph -> commit -u
    - if it is '-', checkout the last branch we where working on

CLONE
    Call it when you want to work on a new repository.
    - fork the repository from Normation if --fork is given
    - clone it
    - setup both remote repository in git
    ex: rudder-dev clone rudder-techniques

PULL
    Call it to make sure you are on a branch and up to date.
    - create the branch with remote tracking if it doesn't exist
    - checkout the branch if specified
    - update the current branch
    ex: rudder-dev pull 3.0
    ex: rudder-dev pull master

BRANCH
    Call it before working on a bug.
    - if the branch already exist, just checkout it
    - find and check the ticket from redmine
    - create a branch with the proper name in your repository
    - set the ticket status to "in progress"
    - if --base is given, use the PR from the base ticket as the origin branch
    - --base can also be a local branch name
    ex: rudder-dev branch 1234
    ex: rudder-dev branch '#1234' --base 1233

QUICKFIX
    Call it when you or someone made a quick fix.
    If a quick fix PR is already opened:
    - open an issue of type bug with subject
    - takeover existing PR
    - close existing PR and link the new one
    - merge if needed
    If the fix is in a local file:
    - check that the diff is small
    - find the first branch having the problem
    - open an issue of type bug with subject
    - call rudder-dev branch on the issue
    - patch the file
    - commit with git
    - call rudder dev commit
    ex: rudder-dev quickfix README "Typo in readme"
    ex: rudder-dev quickfix https://github.com/Normation/ncf/pull/486 --merge

TECHNIQUE
    Call it after branching and before working on a new technique version.
    You must be in the technique directory containing all technique versions.
    - create a new version of a technique based on the last known version
    - add the comment to the changelog
    - create a commit for this new version
    ex: cd techniques/applications/apacheServer
        rudder-dev technique 4.0 "Add a new option"

SUBTASK
    Call it after committing a patch and you want a different patch on the next version.
    Or if you want to extend a feature from one branch in a new branch.
    Or if just discovered a bug in a PR that was just merdged.
    For example, after a ncf v0.x patch you want to amend it in ncf v1.
    - create a new ticket that is the child copy of the current one (set its tracker to Bug if --bug)
    - set the new ticket version to next_version (but not with --bug)
    - replace the issue's title if needed (if the new title starts with '+', extend the existing title)
    - call rudder-dev branch on the new ticket with a base branch from the old ticket
    - suggest modifications for logs in the modified files (ncf v1 specific)
    ex: rudder-dev sub <version>

WIP
    Works like the commit command, but creates a draft pull-request instead.
    ex: rudder-dev wip

COMMIT
    Call it after working on a branch.
    - fork the repository if it has not yet been forked
    - if no file is indexed by git, call git add -u
    - commit current work
    - push the branch to your github repository
    - stop if --nopr is given
    - make a pull-request to Normation's repository
    - update the corresponding bug with those information
    ex: rudder-dev commit BPF "@reviewer please tell me it's good"
    ex: rudder-dev commit

AMEND
    Call it after you made a change to a branch with a PR (deprecated).
    - if no file is indexed by git, call git add -u
    - commit current work amending last commit
    - push the branch to your github repository
    - add a comment to the PR to signal the change
    ex: rudder-dev amend "@reviewer pr updated"

FIXUP
    Call it after you made a change to a branch with a PR.
    - if no file is indexed by git, call git add -u
    - commit current work creating a "fixup" commit
    - push the branch to your github repository
    - add a comment to the PR to signal the change
    ex: rudder-dev fixup "@reviewer pr updated"

REBASE
    Call it when a PR is not mergeable anymore.
    - interactive rebase on NRM branch
    - if rebase has worked, push to OWN
    - add a comment to the PR to signal the change
    - if --base is given, use the PR from the base ticket as the origin branch
    - --base can also be a local branch name
    ex: rudder-dev rebase "@reviewer pr updated"
    ex: rudder-dev rebase --base 1234

RETARGET
    Call it when you started on the wrong version, to change the base branch.
    - if a version is provided, update the ticket with it
    - if the ticket's version has changed, continue
    - rebase to ticket's NRM branch
    - close current PR
    - create new PR
    - update ticket
    ex: rudder-dev retarget

TAKEOVER
    Call it when someone else has started a work you want to continue.
    - if the ticket has no pull-request, abandon
    - checkout upstream branch into local repository
    - rename it if needed
    - update the ticket's status
    ex: rudder-dev takeover 1234

REVERT
    Call it when a PR has been merged but you don't want it.
    - If the ticket has no pull-request, abandon
    - If retarget, ensure we have everything merged
    - Find merge commit of pull request
    - Revert pull request merge commit
    - if retarget, merge it to next branch with ours strategy (keep changes in next branch)
    ex: rudder-dev revert 1234

MERGE
    Call it when you want to merge different branches (needs commit rights)
    or when you want to merge a Pull-Request.
    "merge all" and "merge <first_branch>" are only valid within Rudder versioned repositories.
    Use --automatic to automatically validate merge comment.
    There is a special strategy call upto_XXX where XXX is a version number.
    When we reach this version, use -s ours to avoid upmerging to later branches.
    - detect the ticket id from current branch if there is no parameter
    - checkout an pull last version or the PR branch for a given ticket
    - automatically fixup commits on the PR branch unless --no-autosquash is provided
    - checkout an pull new version
    - merge both
    - push result to Normation's repository
    - if we are merging a ticket or 'all', continue merging to next version
    If the merge fails, please make adjustments, commit and rerun this command.
    ex: rudder-dev merge 3.0 master
    ex: rudder-dev merge all
    ex: rudder-dev merge 1234
    ex: rudder-dev merge https://github.com/Normation/ncf/pull/388 -s upto_3.1

FIND
    Call it to search thing within active branches.
    - for each branch version
    - checkout branch
    - run command
    - find when it returns 0
    ex: rudder-dev find 'grep bugfix path/file.cf'

CLEANUP
    Call it when your local repository becomes messy.
    - for each branch in local repository
    - if ticket is closed
    - if branch's commit are pushed upstream
    - remove local and remote branch
    With --more, include more branch that are not strictly clean, such as
      closed tickets that have un unmerged commits are asked to the user.
    ex: git cleanup

UPDATE
    Call it when you want the last version of rudder-dev.
    - download last version from https://repository.rudder.io/tools/rudder-dev
    - replace current rudder-dev with it
    - use sudo if needed

BLAME
    Call it to find when a line has been modified
    - run git blame on the file
    - reformat output to be shorter and add redmine issue id
    - stop before a specific commit with --before
    - add issue description if --long is provided
    - find lines changed after a commit id with --changed-after (useful to find deleted lines)
    - changed after and before are incompatible

Options:
    -b, --base=<ticket_id>   Use ticket PR as the ne work base
"""

from operator import ge
import sys
import os
import re
import string
import locale
import time
import json
from pprint import pprint
from tempfile import NamedTemporaryFile

import requests # apt-get install python-requests || pip install requests
import docopt # apt-get install python-docopt || pip install docopt

# Fake imports are a solution to make the source be a valid python file
# whereas the script need to stay a single file.
# We just replace the import at build time with the corresponding file content.
# Please note that current code won't work properly with real import
# but it could be interesting (or not).
import os
import re
from subprocess import Popen,PIPE
from pprint import pprint

try:
  import configparser
  config = configparser.ConfigParser()
except ImportError:
  import ConfigParser
  config = ConfigParser.ConfigParser()



class Config:
  """Pseudo class used to share global configuration"""
  # Other values are read from this file by read_configuration()
  CONFIG_FILE = "~/.rudder-dev"
  # Default error template (needed by config file parsing)
  ERROR_TPL = "\033[1;31m{}\033[0m"
  # Default force mode (needed because not necessary set)
  force = False
  # Common config
  QA_TEST = "qa-test"
  # Cache file where we put temporary data
  CACHE_FILE = "~/.rudder-dev.cache"


###
###  Internal functions
### 

# Run a command in a shell like a script would do
# And inform the user of its execution
def shell(command, comment=None, keep_output=False, fail_exit=True, keep_error=False):
  if comment is not None and (Config.LOGLEVEL == "debug" or Config.LOGLEVEL == "info"):
    print(comment)
    print(" $ " + command)
  if keep_output or keep_error:
    if keep_output:
      keep_out = PIPE
    else:
      keep_out = None
    if keep_error:
      keep_err = PIPE
    else:
      keep_err = None
    # contrary to python doc, environment is not inherited by default
    env = os.environ.copy()
    process = Popen(command, stdout=keep_out, stderr=keep_err, shell=True, universal_newlines=True, env=env)
    output, error = process.communicate()
    retcode = process.poll()
  else: # keep tty management and thus colors
    process = Popen(command, shell=True)
    retcode = process.wait()
    output = None
    error = None
  if fail_exit and retcode != 0:
    if (comment is None and Config.LOGLEVEL == "info") or (Config.LOGLEVEL == "debug"):
      print(command)
    if Config.LOGLEVEL == "debug" and output is not None:
      print(">" + output)
    if (Config.LOGLEVEL == "debug" or Config.LOGLEVEL == "info") and error is not None:
      print(error)
    logfail("*** COMMAND ERROR " + str(retcode))
    if not Config.force:
      exit(1)
  if not fail_exit:
    return (retcode, output, error)
  else:
    return output


def logfail(message):
  print(Config.ERROR_TPL.format(message))

# Read rudder configuration from ~/.rudder-dev or create a template if none found
def read_configuration(section=None):
  # Detect missing configuration
  if not os.path.isfile(os.path.expanduser(Config.CONFIG_FILE)):
    with open(os.path.expanduser(Config.CONFIG_FILE), 'a') as cfile:
      cfile.write("""[default]
## Uncomment and set your own values
## Your preferred name for Normation upstream repository on your local git (NRM, origin, ...)
#nrm_upstream = NRM
## Your preferred name for personal upstream repository on your local git (ABC, origin, ...)
#own_upstream =
## If github cli is installed, rudder-dev will read hub's configuration file to get the user's github token
## Otherwise, yo manage your tokens, click here https://github.com/settings/tokens (click "Generate new token")
# github_token = 
## Redmine personal access token, get yours here http://www.rudder-project.org/redmine/my/account (under "API access key")
#redmine_token = 
## For Normation people only
#redmine_alt_token = 
## Set to 'https' if you don't have ssh access to github, 'ssh' is the default
#remote_protocol = ssh
## For colorblind or dumb terminal change ansicode here (ref: http://pueblo.sourceforge.net/doc/manual/ansi_color_codes.html)
## Do no forget to add keeep {} to have the text content
#error_tpl = \\033[1;31m{}\\033[0m
""")
    print(Config.CONFIG_FILE + " doesn't exist !")
    logfail("I made a sample one, please fill it")
    exit(5)
  
  # Read ERROR_TPL first since it can be used just after
  Config.ERROR_TPL = get_config("error_tpl", None, section)
  if Config.ERROR_TPL is None:
    Config.ERROR_TPL = "\\033[1;31m{}\\033[0m"
  # replace \\Oxx characters by their octal equivalent
  Config.ERROR_TPL = re.sub(r'(\\0\d+)', lambda x: chr(int(x.group(0)[1:],8)), Config.ERROR_TPL)

  # Read configuration
  config.read(os.path.expanduser(Config.CONFIG_FILE))
  Config.UPSTREAM_REPOSITORY = get_config("nrm_upstream", "No 'nrm_upstream' entry in " + Config.CONFIG_FILE, section)
  Config.OWN_REPOSITORY = get_config("own_upstream", "No 'own_upstream' entry in " + Config.CONFIG_FILE, section)
  Config.GITHUB_TOKEN = get_config("github_token", None, section)
  Config.REDMINE_TOKEN = get_config("redmine_token", None, section)
  Config.REDMINE_ALT_TOKEN = get_config("redmine_alt_token", None, section)
  Config.REMOTE_PROTOCOL = get_config("remote_protocol", None, section)
  if Config.REMOTE_PROTOCOL is None:
    Config.REMOTE_PROTOCOL = "ssh"
  Config.LOGLEVEL = get_config("loglevel", None, section) # verbose, info, error
  if Config.LOGLEVEL is None:
    Config.LOGLEVEL = "info"


def get_config(item, error, section):
  """ Get a configuration item from current configuration file """
  try:
    # try [section] first
    if section is not None:
      try:
        return config.get(section, item)
      except:
        pass
    # use [default] otherwise
    return config.get("default", item)
  except:
    if error is None:
      return None
    else:
      logfail(error)
      exit(5)

def get_cache_info(key, subkey = None):
  """ Get a value from the cache file """
  data = {}
  filename = os.path.expanduser(Config.CACHE_FILE)
  if not os.path.isfile(filename):
    return None
  with open(filename) as fd:
    data = json.load(fd)
  if not key in data:
    return None
  if subkey is None:
    return data[key]
  if subkey not in data[key]:
    return None
  return data[key][subkey]

def set_cache_info(key, value, subkey=None):
  """ Set a value into the cache file """
  filename = os.path.expanduser(Config.CACHE_FILE)
  if os.path.isfile(filename):
    with open(filename) as fd:
      data = json.load(fd)
  else:
    data = { }
  if subkey is None:
    data[key] = value
  else:
    if key not in data:
      data[key] = { }
    data[key][subkey] = value
  with open(filename, "w") as fd:
    json.dump(data, fd)

import json
import re
import requests # apt-get install python-requests || pip install requests
from dateutil.parser import parse

# trick to make fake import compatible with regular import
if 'Config' not in vars():
  from common import *
      
Config.HUB_CONFIG_FILE = "~/.config/hub"
Config.PR_VALIDATED_LABEL = "Ready for merge"
Config.PR_VALIDATED_COLOR = "0e8a16"
Config.BOT_CANNOT_MERGE_LABEL = "qa: Can't merge"
Config.BOT_CANNOT_MERGE_COLOR = "ededed"
Config.PR_TOO_OLD_LABEL = "Very old PR"
Config.PR_TOO_OLD_COLOR = "d93f0b"

class PR:
  """A Pull Request"""
  def __init__(self, url):
    self.url = url
    self.info = None
    match = re.search(r'.*?://.*?/(.*?)/(.*?)/pull/(\d+)', url)
    if match:
      self.id = match.group(3)
      self.repo_name = match.group(2)
      self.upstream = match.group(1)
    else:
      raise ValueError("BUG: not a valid PR URL")

  def is_labeled(self, label):
    """Tell if the pull request is labeled with label"""
    url = "https://api.github.com/repos/Normation/{repo}/issues/{pr_id}/labels"
    label_list = github_request(url, None, self.url, repo=self.repo_name)
    labels = [x['name'] for x in label_list]
    return label in labels

  def _request_pr(self):
    if self.info is not None:
      return
    url = "https://api.github.com/repos/Normation/{repo}/pulls/{pr_id}"
    self.info = github_request(url, None, self.url, repo=self.repo_name)

  def repo(self):
    self._request_pr()
    return self.info['head']['repo']['ssh_url']

  def remote_branch(self):
    self._request_pr()
    return self.info['head']['ref']

  def base_branch(self):
    self._request_pr()
    return self.info['base']['ref']

  def author(self):
    self._request_pr()
    return self.info['user']['login']

  def title(self):
    self._request_pr()
    return self.info['title']

  def sha(self):
    self._request_pr()
    return self.info['head']['sha']

  def mergeable(self):
    self._request_pr()
    return self.info['mergeable']

  def draft(self):
    self._request_pr()
    return self.info['draft']

  def _commits(self):
    self._request_pr()
    self.commits = github_call(self.info['commits_url'])

  def commits_titles(self):
    self._commits()
    return [ x['commit']['message'] for x in self.commits ]

  def comment(self, comment):
    url = "https://api.github.com/repos/Normation/{repo}/issues/{pr_id}/comments"
    data = { "body": comment }
    github_request(url, "Posting comment", self.url, json.dumps(data), repo=self.repo_name)

  def label(self, label):
    url = "https://api.github.com/repos/Normation/{repo}/issues/{pr_id}/labels"
    data = [ label ]
    github_request(url, "Changing label", self.url, json.dumps(data), repo=self.repo_name)

  def unlabel(self, label):
    # We need to check if the label exists before removing it
    get_labels_url = "https://api.github.com/repos/Normation/{repo}/issues/{pr_id}/labels"
    existing_labels = github_request(get_labels_url, "Getting labels", self.url, None, self.repo_name, "GET")
    if label in [ lab["name"] for lab in existing_labels]:
      remove_label_url = get_labels_url+"/"+label
      github_request(remove_label_url, "Removing label", self.url, None, self.repo_name, "DELETE")

  def close(self, message=None):
    if message is not None:
      self.comment(message)
    url = "https://api.github.com/repos/Normation/{repo}/pulls/{pr_id}"
    data = { "state": "closed" }
    github_request(url, "Closing PR", self.url, json.dumps(data), self.repo_name, "PATCH")

  # comments can be issue comments or review comments
  def get_comments(self):
    comments = []
    # Issue comments
    url = "https://api.github.com/repos/Normation/{repo}/issues/{pr_id}/comments"
    icomments = github_request(url, None, self.url, repo=self.repo_name)
    for c in icomments:
      comments.append({
            "date": parse(c['updated_at'], ignoretz=True),
            "author": c['user']['login'],
            "body": c['body'],
          })
    # Review comments
    url = "https://api.github.com/repos/Normation/{repo}/pulls/{pr_id}/reviews"
    pcomments = github_request(url, None, self.url, repo=self.repo_name)
    for c in pcomments:
      comments.append({
          "date": parse(c['submitted_at'], ignoretz=True),
          "author": c['user']['login'],
          "body": c['body'],
          })
    return comments

  def set_reviewer(self, reviewer):
    url = "https://api.github.com/repos/Normation/{repo}/pulls/{pr_id}/requested_reviewers"
    data = { "reviewers": [ reviewer ] }
    github_request(url, "Setting reviewer on github to "+reviewer, self.url, repo=self.repo_name, post_data=json.dumps(data))

  def review_approval(self):
    """ Returns True (approved), False (not approved) or None (no review) """
    # list reviews of this PR (always in chronological order)
    url = "https://api.github.com/repos/Normation/{repo}/pulls/{pr_id}/reviews"
    data = github_request(url, "Getting review status", self.url, repo=self.repo_name)
    status = {}
    for review in data:
      # for each reviewer, get the last status given and filter out comments
      if review['state'] == "APPROVED":
        status[review['user']['login']] = True
      if review['state'] == "REQUEST_CHANGES" or review['state'] == "CHANGES_REQUESTED":
        status[review['user']['login']] = False
    # Skip PR without review
    if len(status) == 0:
      return None
    return not (False in status.values())

  def tests_passed(self):
    """ Tests if a pull request tests are finished and successful """
    # status are stored by reference, use latest commit in PR
    url = "https://api.github.com/repos/Normation/{repo}/commits/{sha}/status"
    data = github_request(url, "Getting latest commit status", self.url, sha=self.sha(), repo=self.repo_name)
    state = data['state']
    # when 0, means no tests are actually pending
    nb_tests = len(data['statuses'])
    if nb_tests == 0:
      return True
    else:
      # other cases are pending or failed
      return state == "success"


# Get github user as used by the hub command
def get_github_user():
  user_data=github_call("https://api.github.com/user")
  if 'login' in user_data:
    return user_data['login']
  else:
    logfail("Github user not found")
    exit(6)


# Get github token as used by the hub command
def get_github_token(can_fail=False):
  if Config.GITHUB_TOKEN is not None:
    return Config.GITHUB_TOKEN
  if os.path.isfile(os.path.expanduser(Config.HUB_CONFIG_FILE)):
    with open(os.path.expanduser(Config.HUB_CONFIG_FILE)) as f:
      for line in f:
        match = re.search(r'oauth_token: (\w+)', line)
        if match:
          return match.group(1)
  if can_fail:
    return None
  else:
    logfail("Github user not found")
    exit(6)


# query github
def github_request(api_url, comment, pr_url=None, post_data=None, repo=None, method=None, sha=None):
  pr_id = None
  if pr_url is not None:
    # Validate PR url
    match = re.match(r'^http.*/(\d+)(/files|/commits)?$', pr_url)
    if match:
      pr_id = match.group(1)
    else:
      print("Can't find pull-request ID, you should warn the reviewer that you updated the pull-request")
      return False

  # get connection info
  if repo is None:
    repo = remote_repo()
  url = api_url.format(repo=repo, pr_id=pr_id, sha=sha)

  # Say what we are doing
  if comment is not None and (Config.LOGLEVEL == "debug" or Config.LOGLEVEL == "info"):
    print(comment)
    call_on = ""
    if pr_url is not None:
      call_on = "for " + pr_url
    print(" $ api-call " + url + " " + call_on)

  return github_call(url, post_data, method=method)

def github_call(url, post_data=None, fail_ok=False, method=None):
  token = get_github_token()
  # make query
  if post_data is not None:
    if method is None or method == "POST":
      ret = requests.post(url, headers = {'Authorization': 'token ' + token, 'Content-Type': 'application/json' }, data=post_data)
    elif method == "PATCH":
      ret = requests.patch(url, headers = {'Authorization': 'token ' + token, 'Content-Type': 'application/json' }, data=post_data)
    else:
      print("Unknown method call with data in github_call " + method)
      exit(1)
  elif method is None or method == "GET":
    ret = requests.get(url, headers = {'Authorization': 'token ' + token, 'Content-Type': 'application/json' })
  elif method == "DELETE":
    ret = requests.delete(url, headers = {'Authorization': 'token ' + token, 'Content-Type': 'application/json' })
  else:
    print("Unknown method call in github_call " + method)
    exit(1)

  # process output
  if ret.status_code < 200 or ret.status_code >= 300:
    if fail_ok:
      return None
    else:
      logfail("Github query error " + ret.reason)
      print(ret.text)
      if not Config.force:
        exit(12)

  # return result
  return ret.json()

# Retrieve current user organizations
def get_github_orgs():
    orgs = []
    data = github_call("https://api.github.com/user/orgs", fail_ok=True, method="GET")
    if data is not None:
        for org in data:
            orgs.append(org["login"])
    return orgs
import sys
import requests
import json

# trick to make fake import compatible with regular import
if 'Config' not in vars():
  from common import *

Config.REDMINE_ALT_API_URL = "https://redmine.normation.com"
Config.REDMINE_API_URL = "https://issues.rudder.io"
Config.REDMINE_API_LIMIT = 100
Config.REDMINE_CLOSED_STATUSES = [5, 6, 16, 11] # 5=Released, 6=rejected, 16=resolved, 11=Pending release
Config.REDMINE_META_TRACKERS = [3]
Config.ACCESS_ROLE_LIST = [ 3, 4, 5, 6, 7, 8, 9, 11 ] # 7=Product owner, 3=Scrum master, 8=Lead developer, 4=Developer, 5=Reporter, 11=Release manager, 6=Consultant, 9=Integrator

Config.REDMINE_NRM_GROUP = 314
Config.REDMINE_ALT_NRM_GROUP = 36

Config.TRACKER_NAME_MAPPING = { 'Bug': 'bug', 'User story': 'ust', 'Architecture': 'arch', 'Change': 'chg', 'Problem': 'pbm', 'Incident': 'inc', 'Enhancement': 'enh' }
Config.PENDING_TR_CODE = 3
Config.IN_PROGRESS_CODE = 9
Config.CUSTOM_FIELD_PR = 3
Config.ALT_CUSTOM_FIELD_PR = 1
Config.BUG_TACKER_ID = 1
Config.PENDING_MERGE_CODE = 12
Config.DISCUSSION_CODE = 4

Config.REDMINE_VERSION_DETECTOR = [ (r'master', r'master', False), (r'(\d+\.\d+)-.*', r'\1', True), (r'(\d+\.\d+).*', r'\1', False) ]

class Issue:
  """Class to hold information about a single issue"""
  def __init__(self, name, must_be_open=True):
    """name is a string like: 1234 or i1234"""
    self.must_be_open = must_be_open
    self.info = None
    if name.startswith('#'):
      name = name [1:]
    self.name = name
    is_internal = re.match(r'i(\d+)', name)
    if is_internal:
      self.id = int(is_internal.group(1))
      self.token = Config.REDMINE_ALT_TOKEN
      self.api_url = Config.REDMINE_ALT_API_URL
      self.custom_field_pr = Config.ALT_CUSTOM_FIELD_PR
      self.internal = True
      # Some deprecated usage of these global vars still exist
      Config.REDMINE_API_URL = Config.REDMINE_ALT_API_URL
      Config.REDMINE_TOKEN = Config.REDMINE_ALT_TOKEN
    else:
      self.id = int(name)
      self.token = Config.REDMINE_TOKEN
      self.api_url = Config.REDMINE_API_URL
      self.custom_field_pr = Config.CUSTOM_FIELD_PR
      self.internal = False
    self.server = Redmine(self.internal)

  def __getitem__(self, key):
    """Make Issue behave like a dict"""
    self._get_info()
    if key not in self.info:
      return None
    return self.info[key]

  def __contains__(self, key):
    """Make Issue behave like a dict"""
    self._get_info()
    return key in self.info

  def _get_version(self, issue_info, error_fail=True):
    """Extract a friendly version from an issue information"""
    if 'fixed_version' not in issue_info:
      if error_fail:
        logfail("***** BUG: Can't extract version from #" + self.name)
        exit(2)
      else:
        return None
    version = self.server.major_or_master(issue_info['fixed_version']['name'])
    if version is not None:
      return (issue_info['fixed_version']['id'], version)
    elif error_fail:
      logfail("***** BUG: Can't extract version from " + issue_info['fixed_version']['name'] + " in #" + self.name)
      exit(2)
    else:
      return None

  def _get_info(self):
    """Get issue information from redmine"""
    if self.info is not None:
      return self.info
    # Find issue in redmine
    print("Looking for Redmine ticket #" + self.name + "... ", end=' ')
    sys.stdout.flush() # to display previous unfinished line
    issues_req = self.server._query("/issues/" + str(self.id) + ".json?include=journals")
    issue = issues_req.json()['issue'] if issues_req.status_code == requests.codes.ok else None
    if not issue:
      print("Not found!")
      logfail("***** ERROR: ticket not found. Exiting.")
      if not Config.force:
        exit(2)
    else:
      print("Done")

    # Check ticket type
    if issue['tracker'] in Config.REDMINE_META_TRACKERS:
      print("This is a question ticket! You cannot make a pull request on this ticket.")
      logfail("***** ERROR: This is a question ticket. Exiting.")
      if not Config.force:
        exit(2)

    # Check ticket status
    if self.must_be_open and issue['status']['id'] in Config.REDMINE_CLOSED_STATUSES:
      print("This ticket is closed! You cannot make a pull request on this ticket.")
      logfail("***** ERROR: Closed ticket. Exiting.")
      if not Config.force:
        exit(2)

    if 'fixed_version' not in issue:
      print("This ticket has no target version! I can't make a branch against its repository.")
      logfail("***** ERROR: Missing target version. Exiting.")
      if not Config.force:
        exit(2)

    self.data = issue

    # Get ticket elements
    info = {}
    info['type'] = issue['tracker']['name']
    info['name'] = issue['subject']
    (info['version_id'],info['version']) = self._get_version(issue)
    info['project_id'] = issue['project']['id']
    info['tracker_id'] = issue['tracker']['id']
    info['priority_id'] = issue['priority']['id']
    info['subject'] = issue['subject']
    info['description'] = issue['description']
    info['private'] = 'is_private' in issue and issue['is_private']
    if 'category' in issue:
      info['category_id'] = issue['category']['id']
    if 'is_private' in issue:
      info['is_private'] = issue['is_private']
    if 'assigned_to' in issue:
      info['assigned_to_id'] = issue['assigned_to']['id']
    if 'custom_fields' in issue:
      for field in issue['custom_fields']:
        if field['id'] == self.custom_field_pr and 'value' in field and field['value'] is not None and field['value'] != '':
          info['pr'] = field['value']

    # Get ticket's last assignment besides me
    my_id = self.server.get_redmine_uid()
    if my_id is not None:
      if 'journals' in issue:
        for journal in issue['journals']:
          if 'details' in journal:
            for detail in journal['details']:
              if detail['name'] == 'assigned_to_id' and 'old_value' in detail and detail['old_value'] is not None and detail['old_value'] != '':
                if int(detail['old_value']) != my_id:
                  info['last_assignee'] = int(detail['old_value'])

    self.info = info
    return info

  def branch_name(self):
    """Create a branch name base on this issue"""
    branchified_name = re.sub("__+", "_", re.sub("[^" + string.ascii_letters + string.digits + "]", "_", self['name'].strip().lower())).strip("_")
    if self.internal:
      id = 'i' + str(self.id)
    else:
      id = str(self.id)
    if self['private']:
      branch_name = Config.TRACKER_NAME_MAPPING[self['type']] + "_" + id + "/_"
    else:
      branch_name = Config.TRACKER_NAME_MAPPING[self['type']] + "_" + id + "/" + branchified_name
    return branch_name

  def existing_branch(self):
    """Check if a branch already exists fir this issue"""
    for line in os.popen("git branch --no-color --list"):
      if self.internal:
        ticket = 'i'+str(self.id)
      else:
        ticket = str(self.id)
      match = re.search(r'^\*?\s*(\w+_' + ticket + r'/.*?)\n', line)
      if match:
        return match.group(1)
    return None


  # Beware, we don't update in memory status, throw away 'self' after calling this
  def _update_issue(self, change, message=None, alt_message=None):
    """Change the ticket content if possible"""
    if Config.REDMINE_TOKEN is None:
      return False
    # prepare info
    if self.can_modify():
      info = { 'issue': change }
      if message is not None:
        info['issue']['notes'] = message
    else:
      if alt_message is None:
        print("Cannot update the issue, you should update it manually status here " + self.api_url + "/issues/" + str(self.id))
        return False
      info = { 'issue': { 'notes': alt_message } }

    # send info
    ret = self.server._query("/issues/" + str(self.id) + ".json", put_data=json.dumps(info))
    if ret.status_code < 200 or ret.status_code >= 300:
      logfail("Issue Update error: " + ret.reason)
      print(ret.text)
      if not Config.force:
        exit(3)
    return True

  def update(self, user_id=None, pr_url=None, message=None, status=None):
    """Change ticket state and comment it"""
    # Create note content
    note = None
    if pr_url is not None:
      note = "PR " + pr_url
    if message is not None:
      if note is None:
        note = message
      else:
        note += "\n" + message

    change = {}
    # fill ticket data with developer available content
    if status is not None:
      change['status_id'] = status
    if user_id is not None:
      change['assigned_to_id'] = user_id
    if note is not None:
      change['notes'] = note
    if pr_url is not None:
      change['custom_fields'] = [ { 'id': self.custom_field_pr, 'value': pr_url } ]

    self._update_issue(change, message, message)

  def to_in_progress(self, message=None):
    """Change the ticket state to In progress"""
    change = {
            'status_id': Config.IN_PROGRESS_CODE,
            }
    if Config.REDMINE_TOKEN is not None:
      change['assigned_to_id'] = self.server.get_redmine_uid()
    self._update_issue(change, message)

  def to_status(self, status, assign_to, message=None):
    """Change the ticket state to pending_merge"""
    change = {
            'status_id': status,
            'assigned_to_id': assign_to,
            }
    self._update_issue(change, message)

  def clone(self, version_id, new_title, bug=False):
    """Clone this issue making the new one a child of this one"""
    self._get_info()
    new_info = {}
    for i in ('project_id', 'tracker_id', 'priority_id', 'subject', 'description', 'category_id', 'fixed_version_id', 'is_private'):
      if i in self.info:
        new_info[i] = self.info[i]
    new_info['parent_issue_id'] = self.id
    if bug:
      new_info['tracker_id'] = Config.BUG_TACKER_ID
      new_info['description'] = "There was a bug in the resolution of #" + str(self.id)
    if new_title is not None:
      if new_title.startswith('+'):
        subject = new_info['subject']
        m = re.match("(.*) - .*", subject)
        if m:
          subject = m.group(1)
        new_info['subject'] = subject + " - " + new_title.replace('+', ' ', 1)
      else:
        new_info['subject'] = new_title
    new_info['fixed_version_id'] = version_id
    return self.server._create_issue(new_info)

  def url(self):
    return self.api_url+"/issues/"+str(self.id)

  def can_modify(self):
    return self.server.can_modify_issues(self['project_id'])

  def update_version(self, version):
    if not self.can_modify():
      logfail("Cannot change ticket version since you are not a developer, you should change it manually before calling retarget")
      exit(13)

    # list all versions
    versions = self.server.version_list(self.info['project_id'])
    # keep versions that match and that are still open
    valid_versions = [ v for v in versions if v['status'] == 'open' and v['name'].startswith(version) ]
    # there should only only, but in doubt keep the last one
    final_version = valid_versions[-1]

    # set the version
    self._update_issue({ 'issue': { 'fixed_version_id': final_version['id'] } })
    self.info['fixed_version'] = final_version
    (self.info['version_id'],self.info['version']) = self._get_version(self.info)


class Redmine:
  """Class to query a redmine server"""
  def __init__(self, internal):
    self.internal = internal
    self.can_modify = None
    if internal:
      self.token = Config.REDMINE_ALT_TOKEN
      self.api_url = Config.REDMINE_ALT_API_URL
      self.nrm_group = Config.REDMINE_ALT_NRM_GROUP
    else:
      self.token = Config.REDMINE_TOKEN
      self.api_url = Config.REDMINE_API_URL
      self.nrm_group = Config.REDMINE_NRM_GROUP

  def _query(self, query, post_data=None, put_data=None):
    """ Function to directly request the right redmine server """
    if post_data is not None:
      ret = requests.post(self.api_url + query, headers = {'X-Redmine-API-Key': self.token, 'Content-Type': 'application/json' }, data = post_data)
    elif put_data is not None:
      ret = requests.put(self.api_url + query, headers = {'X-Redmine-API-Key': self.token, 'Content-Type': 'application/json' }, data = put_data)
    else:
      ret = requests.get(self.api_url + query, headers = {'X-Redmine-API-Key': self.token, 'Content-Type': 'application/json' })
    return ret

  def create_issue(self, project_id, subject, description, tracker_id, version_id):
    new_info = { 'project_id': project_id, 'description': description, 'subject': subject,
                 'fixed_version_id': version_id, 'tracker_id': tracker_id }
    return self._create_issue(new_info)

  def _create_issue(self, new_info):
    """ Private method for use just above, do not call """
    ticket_json = json.dumps({ 'issue': new_info })
    ret = self._query("/issues.json", ticket_json)
    if ret.status_code != 201:
      logfail("Issue creation error: " + ret.reason + "\n" + ret.text)
      if not Config.force:
        exit(3)
    if self.internal:
      new_id = 'i' + str(ret.json()['issue']['id'])
    else:
      new_id = str(ret.json()['issue']['id'])
    return Issue(new_id)

  def can_modify_issues(self, project_id):
    """ Return true if the current user can modify an issue in the given project """
    if self.can_modify is not None:
      return self.can_modify
    if self.token is None or self.token == "":
      return False
    user = self._query("/users/current.json?include=memberships").json()
    for membership in user['user']['memberships']:
      if membership['project']['id'] == project_id:
        for role in membership['roles']:
          if role['id'] in Config.ACCESS_ROLE_LIST:
            self.can_modify = True
            return True
    self.can_modify = False
    return False

  def list_nrm_users(self):
    return self._query("/users.json?group_id=" + str(self.nrm_group)).json()['users']

  def version_list(self, project):
    """ Return a list of version as given by the redmine API """
    return self._query("/projects/" + str(project) + "/versions.json").json()['versions']

  def has_locked_version(self, project):
    """ True if there is at least one locked version for this project """
    for v in self.version_list(project):
      if v['status'] == "locked":
        return True

  def get_redmine_uid(self):
    """ Get current redmine user """
    if self.token is None or self.token == "":
      return None
    user = self._query("/users/current.json")
    return user.json()['user']['id']

  def major_or_master(self, version):
    """ Return the major version or "master" if it is in alpha status """
    for k,v,recheck in Config.REDMINE_VERSION_DETECTOR:
      if re.match(k, version):
        major = re.sub(k, v, version)
        if recheck:
          # we d n't know the alpha status, let's check it against the rudder project
          alpha = False
          for v in self.version_list("rudder"):
            if v["status"] == "closed":
              continue
            # stop if we found a version matching major that is not alpha
            if v["name"].startswith(major):
              if re.search(r"alpha", v["name"]):
                alpha = True
              else:
                return major
          # If there is only alpha then it is currently alpha
          if alpha:
            return "master"
          # If it is not declared -> ??
          else:
            logfail("***** ERROR: Cannot find version status " + version)
            exit(4)
        return major
    return None


def issue_from_branch(branch, must_be_open=True):
  """ Create issue object from given branch """
  match = re.match(r'[A-Za-z]+_(i?\d+)/.*', branch)
  if match:
    return Issue(match.group(1), must_be_open)
  else:
    logfail("***** ERROR: This is not a ticket branch: " + branch)
    exit(4)

#
# distutils/version.py
#
# Implements multiple version numbering conventions for the
# Python Module Distribution Utilities.
#
# $Id$
#

"""Provides classes to represent module version numbers (one class for
each style of version numbering).  There are currently two such classes
implemented: StrictVersion and LooseVersion.

Every version number class implements the following interface:
  * the 'parse' method takes a string and parses it to some internal
    representation; if the string is an invalid version number,
    'parse' raises a ValueError exception
  * the class constructor takes an optional string argument which,
    if supplied, is passed to 'parse'
  * __str__ reconstructs the string that was passed to 'parse' (or
    an equivalent string -- ie. one that will generate an equivalent
    version number instance)
  * __repr__ generates Python code to recreate the version number instance
  * _cmp compares the current instance with either another instance
    of the same class or a string (which will be parsed to an instance
    of the same class, thus must follow the same rules)
"""

import re
from itertools import zip_longest

class Version:
    """Abstract base class for version numbering classes.  Just provides
    constructor (__init__) and reproducer (__repr__), because those
    seem to be the same for all version numbering classes; and route
    rich comparisons to _cmp.
    """

    def __init__ (self, vstring=None):
        if vstring:
            self.parse(vstring)

    def __repr__ (self):
        return "%s ('%s')" % (self.__class__.__name__, str(self))

    def __eq__(self, other):
        c = self._cmp(other)
        if c is NotImplemented:
            return c
        return c == 0

    def __lt__(self, other):
        c = self._cmp(other)
        if c is NotImplemented:
            return c
        return c < 0

    def __le__(self, other):
        c = self._cmp(other)
        if c is NotImplemented:
            return c
        return c <= 0

    def __gt__(self, other):
        c = self._cmp(other)
        if c is NotImplemented:
            return c
        return c > 0

    def __ge__(self, other):
        c = self._cmp(other)
        if c is NotImplemented:
            return c
        return c >= 0


# The rules according to Greg Stein:
# 1) a version number has 1 or more numbers separated by a period or by
#    sequences of letters. If only periods, then these are compared
#    left-to-right to determine an ordering.
# 2) sequences of letters are part of the tuple for comparison and are
#    compared lexicographically
# 3) recognize the numeric components may have leading zeroes
#
# The LooseVersion class below implements these rules: a version number
# string is split up into a tuple of integer and string components, and
# comparison is a simple tuple comparison.  This means that version
# numbers behave in a predictable and obvious way, but a way that might
# not necessarily be how people *want* version numbers to behave.  There
# wouldn't be a problem if people could stick to purely numeric version
# numbers: just split on period and compare the numbers as tuples.
# However, people insist on putting letters into their version numbers;
# the most common purpose seems to be:
#   - indicating a "pre-release" version
#     ('alpha', 'beta', 'a', 'b', 'pre', 'p')
#   - indicating a post-release patch ('p', 'pl', 'patch')
# but of course this can't cover all version number schemes, and there's
# no way to know what a programmer means without asking him.
#
# The problem is what to do with letters (and other non-numeric
# characters) in a version number.  The current implementation does the
# obvious and predictable thing: keep them as strings and compare
# lexically within a tuple comparison.  This has the desired effect if
# an appended letter sequence implies something "post-release":
# eg. "0.99" < "0.99pl14" < "1.0", and "5.001" < "5.001m" < "5.002".
#
# However, if letters in a version number imply a pre-release version,
# the "obvious" thing isn't correct.  Eg. you would expect that
# "1.5.1" < "1.5.2a2" < "1.5.2", but under the tuple/lexical comparison
# implemented here, this just isn't so.
#
# Two possible solutions come to mind.  The first is to tie the
# comparison algorithm to a particular set of semantic rules, as has
# been done in the StrictVersion class above.  This works great as long
# as everyone can go along with bondage and discipline.  Hopefully a
# (large) subset of Python module programmers will agree that the
# particular flavour of bondage and discipline provided by StrictVersion
# provides enough benefit to be worth using, and will submit their
# version numbering scheme to its domination.  The free-thinking
# anarchists in the lot will never give in, though, and something needs
# to be done to accommodate them.
#
# Perhaps a "moderately strict" version class could be implemented that
# lets almost anything slide (syntactically), and makes some heuristic
# assumptions about non-digits in version number strings.  This could
# sink into special-case-hell, though; if I was as talented and
# idiosyncratic as Larry Wall, I'd go ahead and implement a class that
# somehow knows that "1.2.1" < "1.2.2a2" < "1.2.2" < "1.2.2pl3", and is
# just as happy dealing with things like "2g6" and "1.13++".  I don't
# think I'm smart enough to do it right though.
#
# In any case, I've coded the test suite for this module (see
# ../test/test_version.py) specifically to fail on things like comparing
# "1.2a2" and "1.2".  That's not because the *code* is doing anything
# wrong, it's because the simple, obvious design doesn't match my
# complicated, hairy expectations for real-world version numbers.  It
# would be a snap to fix the test suite to say, "Yep, LooseVersion does
# the Right Thing" (ie. the code matches the conception).  But I'd rather
# have a conception that matches common notions about version numbers.

class LooseVersion (Version):

    """Version numbering for anarchists and software realists.
    Implements the standard interface for version number classes as
    described above.  A version number consists of a series of numbers,
    separated by either periods or strings of letters.  When comparing
    version numbers, the numeric components will be compared
    numerically, and the alphabetic components lexically.  The following
    are all valid version numbers, in no particular order:

        1.5.1
        1.5.2b2
        161
        3.10a
        8.02
        3.4j
        1996.07.12
        3.2.pl0
        3.1.1.6
        2g6
        11g
        0.960923
        2.2beta29
        1.13++
        5.5.kw
        2.0b1pl0

    In fact, there is no such thing as an invalid version number under
    this scheme; the rules for comparison are simple and predictable,
    but may not always give the results you want (for some definition
    of "want").
    """

    component_re = re.compile(r'(\d+ | [a-z]+ | \.)', re.VERBOSE)

    def __init__ (self, vstring=None):
        if vstring:
            self.parse(vstring)


    def parse (self, vstring):
        # I've given up on thinking I can reconstruct the version string
        # from the parsed tuple -- so I just store the string here for
        # use by __str__
        self.vstring = vstring
        components = [x for x in self.component_re.split(vstring)
                              if x and x != '.']
        for i, obj in enumerate(components):
            try:
                components[i] = int(obj)
            except ValueError:
                pass

        self.version = components


    def __str__ (self):
        return self.vstring


    def __repr__ (self):
        return "LooseVersion ('%s')" % str(self)


    def _cmp (self, other):
        if isinstance(other, str):
            other = LooseVersion(other)

        for i, j in zip_longest(self.version, other.version, fillvalue=''):
            if type(i) != type(j):
                i = str(i)
                j = str(j)
            if i == j:
                continue
            elif i < j:
                return -1
            else:  # i > j
                return 1
        return 0

# end class LooseVersion


## GLOBAL VARIABLES
Config.RUDDER_DEV_ORIGIN = "https://repository.rudder.io/tools/rudder-dev"
Config.WARN_FOR_UPDATE_AFTER = 30 # days

Config.LIFECYCLES = [ { "name": "rudder",
                 "project_id": "21",
                 "detection": r'^\*?\s+remotes/{}/branches/rudder/(.*)',
                 "format": [ "branches/rudder/{}" ],
               },

               { "name" : "master_only",
                 "detection": r'(.*)',
                 "format": "master",
               },
             ]

###
###  Internal functions
###

# update normation user list cache and return updated user list
def update_nrm_users(project_id):
  users = get_cache_info(Config.REDMINE_API_URL)
  if users is None:
    users = {}
  # just update it otherwise we would have only project specific data in it
  user_list = requests.get(Config.REDMINE_API_URL + "/groups/314.json?include=users", headers = {'X-Redmine-API-Key': Config.REDMINE_TOKEN } )
  for user in user_list.json()['group']['users']:
    id = user['id']
    user_data = requests.get(Config.REDMINE_API_URL + "/users/" + str(id) + ".json", headers = {'X-Redmine-API-Key': Config.REDMINE_TOKEN } )
    try:
      user_detail = user_data.json()['user']
    # some user cannot get info on some other users, but not all
    except ValueError:
      continue
    trigraph = (user_detail['firstname'][0] + user_detail['lastname'][:2]).upper()
    github_account = ''
    for field in user_detail['custom_fields']:
      if field['name'] == "GitHub":
        github_account = field['value']
    users[trigraph] = { 'id': id, 'trigraph': trigraph, 'firstname':user_detail['firstname'] , 'lastname':user_detail['lastname'], 'github':github_account }
  set_cache_info(Config.REDMINE_API_URL, users)
  return users

# get users ad trigraph for normation users
def get_nrm_users(project_id):
  users = get_cache_info(Config.REDMINE_API_URL)
  if users is None:
    return update_nrm_users(project_id)
  return users

# Ask for a user using its trigraph
def ask_username(project_id, trigraph):
  # List Normation users and index by trigraph
  user_ids = get_nrm_users(project_id)

  # Ask for trigraph of user until found
  if trigraph is not None:
    trigraph = trigraph.upper()
  while trigraph not in user_ids:
    print("Assign to ?")
    for user in user_ids.values():
      print("  " + user['trigraph'] + ". " + user['firstname'] + " " + user['lastname'])
    print("  NON. No one")
    print("Enter trigraph (or 'r' to reload user list, 'n' for ): ", end='')
    sys.stdout.flush() # to display previous unfinished line
    trigraph = sys.stdin.readline().strip().upper()
    if trigraph == 'R':
      user_ids = update_nrm_users(project_id)
    if trigraph == 'NON':
      return None
  # update cache if it was created by an old rudder_dev
  if 'github' not in user_ids[trigraph]:
    user_ids = update_nrm_users(project_id)
  return (user_ids[trigraph]['id'], user_ids[trigraph]['github'])


current_lifecycle = None
# Get the current lifecycle
def get_lifecycle():
  global current_lifecycle
  if current_lifecycle is not None:
    return current_lifecycle
  # Update branches so get new branches from UPSTREAM
  shell("git fetch --force " + Config.UPSTREAM_REPOSITORY, "Fetching upstream " + Config.UPSTREAM_REPOSITORY + " to detect lifecycle")
  lines = os.popen("git branch --no-color --list -a").readlines()
  for lifecycle in Config.LIFECYCLES:
    for line in lines:
      if re.match(lifecycle["detection"].format(Config.UPSTREAM_REPOSITORY), line):
        current_lifecycle = lifecycle
        return current_lifecycle
  return current_lifecycle


version_list = None
version_dict = None
# get the version list for a given project
def get_versions(internal=False):
  global version_list, version_dict
  if version_list is not None:
    return (version_list, version_dict)

  # list all versions in git
  lifecycle = get_lifecycle()
  if lifecycle['name'] == 'master_only':
    return (['master'], {}) # unknown version id wince master can match many versions
  git_version_list = []
  for line in os.popen("git branch --no-color --list -a"):
    m = re.match(lifecycle["detection"].format(Config.UPSTREAM_REPOSITORY), line)
    if m:
      git_version_list.append(m.group(1))

  # list all versions using redmine
  server = Redmine(False)
  redmine_version_list = []
  version_dict = {}
  for version in server.version_list(lifecycle['project_id']):
    if version['status'] != 'open':
      continue
    v = None
    for k,val,recheck in Config.REDMINE_VERSION_DETECTOR:
      if re.match(k, version['name']):
        if recheck:
          logfail("***** ERROR: There should not be any version in the rudder project that looks like a plugin version: " + k )
        v = re.sub(k, val, version['name'])
        break
    if v in git_version_list or v == 'master':
      redmine_version_list.append(v)
      version_dict[v] = version['id']

  if not "master" in redmine_version_list:
    redmine_version_list.append("master")
  # keep unique name and sort
  version_list = list(set(redmine_version_list))
  # order is the right one because x.y < 'master'
  version_list.sort(key=LooseVersion)
  return (version_list, version_dict)


# get the version after old
def get_next_version(old, internal=False):
  all_versions = get_versions(internal)[0]
  new = None
  for idx, version in enumerate(all_versions):
    if old == version:
      if idx < len(all_versions)-1:
        new = all_versions[idx+1]
  if new is None:
    logfail("Don't know how to find the version after " + old)
    exit(9)
  return new


# Get branch name from version
def branch_from_version(version):
  if version == "master":
   return version
  else:
    # detect lifecycle and base the name on it
    # assume first position is not next
    lifecycle = get_lifecycle()
    branch = lifecycle['format'][0].format(version)
    print("Branch name for version '"+version+"' is: " + branch)
    return branch

# Get a version from a branch name
def version_from_branch(branch):
  if branch == "master":
    return branch
  else:
    # detect lifecycle and extract the name
    lifecycle = get_lifecycle()
    for f in lifecycle["format"]:
      m = re.match(f.format('(.*)'), branch)
      if m:
        print("Version for branch '" + branch + "' is: " + m.group(1))
        return m.group(1)
  return None


# Find remote repository name from current directory
def remote_repo():
  value = shell("git remote -v", keep_output=True).strip()
  match = re.search(r'git@github.com:Normation/(.*?).git', value)
  if match:
    return match.group(1)
  match = re.search(r'https://github.com/Normation/(.*?)(/|.git)', value)
  if match:
    return match.group(1)
  logfail("Can't find remote repository")
  exit(10)


# create a new PR
def create_pr(master_branch, issue, message, draft=False):
  url = "https://api.github.com/repos/Normation/{repo}/pulls"
  user = get_github_user()
  body = Config.REDMINE_API_URL + "/issues/" + str(issue.id)
  if message is not None:
    body += "\n\n" + message + "\n"

  if issue['private']:
    title = ''
  else:
    title = issue['name']
  pr = '{ "title": ' + json.dumps('Fixes #' + str(issue.id) + ": " + title) + ','
  pr += ' "body": ' + json.dumps(body) + ','
  pr += ' "draft": ' + str(draft).lower() + ','
  pr += ' "head": "' + user + ':' + current_branch + '",'
  pr += ' "base": "' + master_branch + '" }'
  result = github_request(url, "Creating PR ...", pr_url=None, post_data=pr)

  if 'html_url' in result:
    return result['html_url']
  elif 'errors' in result and 'message' in result['errors'][0]:
    error_message = result['errors'][0]['message']
    logfail("Error occurred in create PR: " + error_message)
  else:
    logfail("Unknown error occurred in create PR")

  if not Config.force:
    exit(16)
  else:
    return None


# add a message to a given pull-request on github
def update_pr(pr_url, info, comment):
  message = info
  if comment is not None:
    message += "\\n" + comment
  issue = '{ "body": "' + message + '" }'
  url = "https://api.github.com/repos/Normation/{repo}/issues/{pr_id}/comments"
  github_request(url, "Updating PR ...", pr_url, issue)


# close a given pull-request on github
def close_pr(pr_url, comment):
  # Add comment on the closing reason
  issue = '{ "body": "' + comment + '" }'
  url = "https://api.github.com/repos/Normation/{repo}/issues/{pr_id}/comments"
  github_request(url, "Commenting PR ...", pr_url, issue)

  # close the PR
  issue = '{ "state": "closed" }'
  url = "https://api.github.com/repos/Normation/{repo}/pulls/{pr_id}"
  github_request(url, "Closing PR ...", pr_url, issue)


# get PR upstream branch
def get_pr_upstream(pr_url):
  url = "https://api.github.com/repos/Normation/{repo}/pulls/{pr_id}"
  pr = github_request(url, None, pr_url)
  return pr['base']['ref']

# get PR merge commit
def get_pr_merge_commit(pr_url):
  url = "https://api.github.com/repos/Normation/{repo}/issues/{pr_id}/events"
  pr_events = github_request(url, None, pr_url)
  pr_merged = [ pr for pr in pr_events if pr["event"] == "merged" ]
  if len(pr_merged) != 1:
    logfail("Can't find merge commit for pull request")
    exit(12)
  return pr_merged[0]['commit_id']


# Commit and push, if needed squash WIP and force push
def commit_push(branch, message, force_amend=False, fixup=False):
  # Fork if needed
  if github_fork():
    # github fork is asynchronous but pretty fast, this should be sufficient
    time.sleep(2)

  # in case of amend, check that we have a base commit
  if force_amend:
    # get commit message
    commit = shell("git show --oneline -s ", keep_output=True).strip()
    # extract issue id from commit
    match = re.match(r'.* Fixes #(i?\d+):.*', commit, flags=re.IGNORECASE)
    if match:
      commit_issue_id = int(match.group(1))
    else:
      commit_issue_id = 0
    # extract issue id from branch
    branch_issue_id = issue_from_branch(current_branch).id
    # refuse wrong amend unless forced
    if commit_issue_id != branch_issue_id:
      print("You are trying to amend on a commit that is not on the same branch as your issue, this is likely an error !")
      print("  Commit: " + commit)
      print("  Branch: " + current_branch)
      print("You may want to do a 'rudder-dev commit' first")
      if not Config.force:
        exit(1)

  # git add if needed
  files = shell("git status --untracked-files=no --porcelain", keep_output=True)
  # git status returns a space at beginning of a line if the file is not indexed
  # if all lines start with a space then nothing is indexed
  if re.search(r'^[^ ]', files, flags=re.M) is None: # beware, this is a double negation
    # if nothing is indexed, call git add -u
    shell("git add -u", "No file added, adding all tracked files that were modified")

  (code, wip, err) = shell("git log --grep \"^Work in progress\" HEAD^..", keep_output=True, fail_exit=False)
  if code == 0:
    wip = wip.strip()
  else:
    wip = ''
  commit_cmd = "git commit"
  if message != '':
    # replace ' with '"'"', which will be replaced by the shell
    commit_cmd += " -m '" + message.replace("'", "'\"'\"'") + "'"
  if force_amend or wip != "":
    commit_cmd += " --amend --no-edit"
  if fixup:
    commit_cmd += " --fixup HEAD"
  shell(commit_cmd, "Committing")

  # Should we check the need for rebasing before pushing ?

  push_cmd = "git push " + Config.OWN_REPOSITORY + " " + branch
  if force_amend or wip != "":
    push_cmd += " --force"
  shell(push_cmd, "Pushing to " + Config.OWN_REPOSITORY + " ... ")


# insert a line in a file after a regex has matched
def insert_line(filename, regex, text):
  content=[]
  with open(filename, "r+") as fd:
    for line in fd:
      content.append(line)
      if re.match(regex, line):
        content.append(text)
    fd.seek(0)
    fd.truncate()
    fd.writelines(content)


# fetch a branch from a PR
def fetch_branch_from_pr(pr):
  branch_name = pr.remote_branch()
  fetch_branch_from_pr_as(pr, branch_name + "_pr")
  return branch_name + "_pr"

def fetch_branch_from_pr_as(pr, new_name):
  branch_name = pr.remote_branch()
  # compare remote_branch/repo_name with upstream/repo_name
  remote_url = os.popen("git ls-remote --get-url " + Config.UPSTREAM_REPOSITORY).readline()
  match = re.match(r'.*/([^/]+)\.git$', remote_url)
  if pr.repo_name != match.group(1):
    logfail("The repository of this PR doesn't match your current repository name")
    if not Config.force:
      exit(17)
  # PR can be rebased, force fetch the branch
  shell("git fetch --update-head-ok " + pr.repo() + " +" + branch_name + ":" + new_name, "Fetching branch from remote " + pr.repo())


# fetch a branch from a PR given in an Issue
def fetch_branch_from_issue(issue):
  if 'pr' not in issue or issue['pr'] == '':
    logfail("There is no PR in this issue " + issue.name)
    exit(15)
  pr = PR(issue['pr'])
  new_name = issue.branch_name()
  fetch_branch_from_pr_as(pr, new_name)
  return new_name


def github_fork():
  # we don't need to fork if the remote is known
  remotes = shell("git remote show", keep_output=True).strip()
  if re.search('^'+Config.OWN_REPOSITORY+"$", remotes, flags=re.MULTILINE):
    return False

  # fork the report if not already forked
  if Config.REMOTE_PROTOCOL == 'ssh':
    base_url = "git@github.com:"
  else:
    base_url = "https://github.com/"
  user = get_github_user()
  remote = remote_repo()
  fork_needed = shell("git ls-remote " + base_url + user + "/" + remote + ".git HEAD > /dev/null 2>/dev/null; echo $?", keep_output=True).strip()
  if fork_needed != "0":
    github_request("https://api.github.com/repos/Normation/{repo}/forks", "Forking repository ...", pr_url=None, post_data="")

  # Add the remote repository to the local list
  shell("git remote add " + Config.OWN_REPOSITORY + " " + base_url + user + "/" + remote + ".git", "Adding own remote repository")

  # Tell if we forked or not
  return fork_needed != "0"


# Find if there is work in progress and stash it
stashed=False
def stash():
  global stashed
  # count number of change
  change_count = shell("git status --porcelain --untracked-files=no | wc -l", keep_output=True).strip()
  if change_count != "0":
    # stash them
    shell("git stash save --keep-index", "Stashing current work")
    stashed = True
  # If we stash, it means we are going to checkout
  # Store global current_branch for use by rudder-dev -
  set_cache_info("last_branch", current_branch, remote_repo())


# Find if there is stashed work and unstash it
unstashed=False
not_unstashed=False
def unstash():
  global unstashed,not_unstashed

  # get current branch (the global variable may not be accurate in flight)
  this_branch = shell("git rev-parse --abbrev-ref HEAD", keep_output=True).strip()

  # do not unstash if we are on an origin branch since you should not work directly on them
  remote_count = shell("git ls-remote -h " + Config.UPSTREAM_REPOSITORY + " " + this_branch + " | wc -l", keep_output=True).strip()
  if remote_count != "0":
    return

  # list available stashes
  stash_output = shell("git stash list", keep_output=True)
  for line in stash_output.splitlines():
    # parse
    match = re.match(r'^(stash@{.*?}): WIP on (.*?):.*', line)
    if match:
      # match with current branch
      if match.group(2) == this_branch:
        # only unstash things we have stashed
        if not stashed:
          not_unstashed=True
          return
        # unstash it
        shell("git stash pop --index '" + match.group(1) + "'" , "Unstashing previous work")
        unstashed=True
        # unstash only once
        return


# Tell the user if some stash command happened
def stash_info():
  if stashed:
    print("Some work in progress has been found, " + Config.ERROR_TPL.format("I stashed it") + " before running the commands, to retrieve it use git stash pop")
  if unstashed:
    print("Previous work in progress has been found in the stash, " + Config.ERROR_TPL.format("I unstashed it") + ", to hide it again, use git stash save")

  if not_unstashed:
    print("Previous work in progress has been found in the stash, " + Config.ERROR_TPL.format("I left it stashed") + ", to unstash it, use git stash pop")


###
### MAIN methods
###   branch, commit, rebase, clone, pull
###

# Create a branch from a ticket id
def create_branch(ticket, base=None):
  global current_branch
  issue = Issue(ticket)

  existing_branch = issue.existing_branch()
  if existing_branch is not None:
    shell("git checkout " + existing_branch, "Found existing branch, checkouting " + existing_branch)
    return

  # get ticket info
  print("* Found " + issue['type'] + " #" + str(issue.id) + ": " + issue['name'])
  print("* Target: " + issue['version'])
  print("* URL: " + Config.REDMINE_API_URL + "/issues/" + str(issue.id))

  # Detect an existing PR
  if 'pr' in issue:
    print("Warning, a pull request already exists for this issue!")
    print("Do you want to create a new branch from scratch or take over existing code?")
    print("Type 'c' to create or 't' to takeover [c]:", end='')
    sys.stdout.flush() # to display previous unfinished line
    choice = sys.stdin.readline().strip().upper()
    if choice == "T":
      takeover(ticket)
      return
    # else continue

  # manage the original branch
  if base is not None:
    if re.match(r'^.\d*$', base):
      # fetch base branch
      branch_name = fetch_branch_from_issue(Issue(base))
    else:
      branch_name = base
    # checkout the new branch
    shell("git checkout " + branch_name, "Checkouting the base branch " + branch_name)
  else:
    # Look for release branch and ckeckout its last version
    pull(issue['version'])

  # Create the branch
  current_branch = issue.branch_name()
  shell("git checkout -b " + current_branch, "Creating branch " + current_branch)

  # Set ticket's status
  issue.to_in_progress()

  print("")
  print("# Now you can edit files")
  print("# When you're ready, add them with git add")
  print("# Then type:")
  print(os.path.basename(sys.argv[0]) + " commit")
  print("")


# new version, changelog, commit
def technique(version, message):
  # check current directory
  cwd = os.getcwd()
  match = re.match(r'(.*)/techniques/(.*)', cwd)
  if not re.match(r'.*/techniques/.*', cwd):
    logfail("***** ERROR: You must be in a technique directory")
    exit(7)
  basedir = match.group(1)
  techniquedir = match.group(2)
  script = basedir + '/scripts/technique-files'
  if not os.path.isfile(script):
    logfail("***** ERROR: You must be in rudder-technique repository")
    exit(7)
  last_version = shell(script + " -ld .", keep_output=True).strip()
  match = re.match(r'\./(\d+\.\d+)', last_version)
  if not match:
    logfail("***** ERROR: There must be at least one version of the technique in the current directory")
    exit(7)
  last_version = match.group(1)

  # check next version
  match = re.match(r'\d+\.\d+', version)
  if not match:
    logfail("***** ERROR: Version must be of the form x.y")
    exit(7)

  # check branch info
  issue = issue_from_branch(current_branch)

  # new version
  shell("cp -r " + last_version + " " + version, "Creating the version " + version)

  # Commentend out since we should deprecated only in next major realse
  # Kept here just in case we need the code later
  ## Deprecate old version
  #insert_line(last_version + "/metadata.xml",
  #            r'\s*<TECHNIQUE name=',
  #            "  <DEPRECATED>This technique version has been superseded by a new version. It will no longer be available in the next stable version of Rudder. Please upgrade to the latest version.</DEPRECATED>\n")
  #shell("git add " + last_version + "/metadata.xml", "Adding deprecated info to old version")

  # changelog
  locale.setlocale(locale.LC_TIME, 'C')
  date = time.strftime("%c")
  user = shell("git config --get user.name", keep_output=True).strip()
  mail = shell("git config --get user.email", keep_output=True).strip()
  changelog = " -- " + user + " <" + mail + "> " + date + "\n"
  changelog += "  * Version " + version + "\n"
  changelog += "  ** " + message + "\n"
  with open(version + "/changelog", "a") as fd:
    fd.write(changelog)

  # maintained technique list
  insert_line(basedir + "/maintained-techniques",
              techniquedir + "/" + last_version,
              techniquedir + "/" + version + "\n")
  shell("git add " + basedir + "/maintained-techniques", "Adding maintained-techniques")

  # commit
  shell("git add " + version, "Adding next version")
  technique = os.path.basename(cwd)
  commit_cmd = "git commit -m \"Refs #" + str(issue.id) + ": Creation of " + technique + " version " + version + " from " + last_version + "\""
  shell(commit_cmd, "Committing")


# clone ticket, change version, branch, helper
def subtask(next_version, new_title, base=None, bug=False):
  # make a clone child ticket with next version
  if base is None:
    issue = issue_from_branch(current_branch, False)
  else:
    issue = Issue(base, False)
  if next_version is None:
    if bug:
      next_version = issue['version']
      next_version_id = issue['version_id']
    else:
      next_version = get_next_version(issue['version'])
      all_versions = get_versions()[0]
      answer = ""
      while answer not in all_versions:
        print("You didn't specify --bug nor a specific version")
        print("I assume that you are writing a patch for merging into next version")
        print("Current version: " + issue['version'] + " Next version? ", end="")
        print(all_versions, end="")
        print(" (default " + next_version + "): ", end="")
        sys.stdout.flush() # to display previous unfinished line
        answer = sys.stdin.readline().strip()
        if answer == "":
          answer = next_version
      next_version = answer
      next_version_id = get_versions()[1][next_version]
  else:
    next_version_id = get_versions()[1][next_version]
  print("Next version will be " + str(next_version))
  print("Cloning ticket #" + issue.name)
  new_issue = issue.clone(next_version_id, new_title, bug)
  print(" > ticket #" + str(new_issue.id) + " created " + Config.REDMINE_API_URL + "/issues/" + str(new_issue.id))

  # test merge status
  pull(next_version)
  logs = shell("git log --no-merges '--grep=Fixes #" + str(issue.id) + "' --oneline", "Looking for ticket commit in " + next_version, keep_output=True)

  # rudder-dev branch
  # if branch has been merged, just do a regular branch
  if len(logs) != 0:
    create_branch(new_issue.name)
  # if branch has not been merged, base the ticket on the original
  else:
    create_branch(new_issue.name, current_branch)

    # rebase to next_version
    next_branch = branch_from_version(next_version)
    shell("git rebase -i " + next_branch, "Rebasing to " + next_version)


# open a quickfix issue then and link it
def quick_fix_pr(pr_url, trigraph=None, merge=False):
  # find PR
  pr = PR(pr_url)
  # get infos
  title = pr.title()
  version = version_from_branch(pr.base_branch())
  version_id = get_versions()[1][version]
  lifecycle = get_lifecycle()
  if 'project_id' not in lifecycle:
    logfail("Cannot guess redmine project, cannot open an issue")
    exit(3)
  project_id = lifecycle['project_id']
  # open issue
  server = Redmine(False)
  issue = server.create_issue(project_id, title, "Automatically opened by rudder-dev", Config.BUG_TACKER_ID, version_id)
  print("Opened issue #" + str(issue.id))
  # set pr url
  issue.update(pr_url=pr_url, status=Config.PENDING_TR_CODE)
  # add issue to pr
  pr.comment(issue.url());
  # Take over the issue
  takeover(issue.name)
  # commit
  if merge and trigraph is None:
    trigraph = "BPE" # or anywone willing to needlessly receive a mail
  commit(trigraph)
  # close original pr
  new_issue = Issue(issue.name) # same as the other one but updated
  pr.close("Superseded by " + new_issue['pr'])
  # merge
  if merge:
    new_pr = PR(new_issue['pr'])
    new_pr.label(Config.PR_VALIDATED_LABEL)


# open a type issue then and resolve it
def quick_fix_local(filename, subject, trigraph=None):
  print("Local fix not yet implemented")
  exit(1)


# Commit, push, create pull-request, update ticket
def commit(trigraph=None, message=None, nopr=False, draft=False):
  issue = issue_from_branch(current_branch)
  master_branch = branch_from_version(issue['version'])

  # commit and push
  if issue['private']:
    title = ''
  else:
    title = issue['name'].replace("\"", "\\\"")
  if nopr is True:
    prefix = "Refs"
  else:
    prefix = "Fixes"
  commit_push(current_branch, prefix + " #" + str(issue.id) + ": " + title)

  if nopr is True:
    return

  # create PR
  pr_url = create_pr(master_branch, issue, message, draft)
  pr = PR(pr_url)

  # update ticket
  if Config.REDMINE_TOKEN is not None:
    if issue.can_modify() and (trigraph or not draft):
      (user, github) = ask_username(issue['project_id'], trigraph)
    else:
      (user, github) = (None, '')
    print("Updating ticket #" + str(issue.id))
    issue.update(user, pr_url, message, status=Config.PENDING_TR_CODE)
    print("Ticket updated: " + Config.REDMINE_API_URL + "/issues/" + str(issue.id))
    if github != '':
      print("Updating issuer")
      pr.set_reviewer(github)
  else:
    print("You can now update the ticket here " + Config.REDMINE_API_URL + "/issues/" + str(issue.id))

  print("PR URL: " + str(pr_url))


def wip(trigraph=None, message=None, nopr=False, draft=False):
  commit(trigraph, message, nopr, draft=True)


# amend commit, push -f, comment PR
def amend(comment=None):
  issue = issue_from_branch(current_branch)

  # commit over last commit and push
  if issue['private']:
    title = ''
  else:
    title = issue['name'].replace("\"", "\\\"")
  commit_push(current_branch, "Fixes #" + str(issue.id) + ": " + title, True)

  # Remove labels
  if 'pr' in issue and issue['pr'] != '':
    pr = PR(issue['pr'])
    # Remove validated labels to avoid a direct merge by quality-assistant
    pr.unlabel(Config.PR_VALIDATED_LABEL)
    # Remove cannot merge labels to permit merge again
    pr.unlabel(Config.BOT_CANNOT_MERGE_LABEL)

  # reassign ticket
  if 'last_assignee' in issue:
    issue.update(user_id=issue['last_assignee'], status=Config.PENDING_TR_CODE)

  # Message on PR if there is one
  if 'pr' in issue and issue['pr'] != '':
    update_pr(issue['pr'], "Commit modified", comment)
    print("Done, you can see the PR here: " + issue['pr'])
  else:
    print("No PR found.")


# fixup commit, push, comment PR
def fixup(comment=None):
  issue = issue_from_branch(current_branch)

  # commit over last commit and push
  if issue['private']:
    title = ''
  else:
    title = issue['name'].replace("\"", "\\\"")
  commit_push(current_branch, "Fixes #" + str(issue.id) + ": " + title, False, True)

  # Remove labels
  if 'pr' in issue and issue['pr'] != '':
    pr = PR(issue['pr'])
    # Remove validated labels to avoid a direct merge by quality-assistant
    pr.unlabel(Config.PR_VALIDATED_LABEL)
    # Remove cannot merge labels to permit merge again
    pr.unlabel(Config.BOT_CANNOT_MERGE_LABEL)

  # reassign ticket
  if 'last_assignee' in issue:
    issue.update(user_id=issue['last_assignee'], status=Config.PENDING_TR_CODE)

  # Message on PR if there is one
  if 'pr' in issue and issue['pr'] != '':
    update_pr(issue['pr'], "PR updated with a new commit", comment)
    print("Done, you can see the PR here: " + issue['pr'])
  else:
    print("No PR found.")


# rebase, push -f, comment PR
def rebase(comment=None, base=None):
  issue = issue_from_branch(current_branch)

  # Remove labels to avoid a direct merge by quality-assistant
  if 'pr' in issue and issue['pr'] != '':
    pr = PR(issue['pr'])
    pr.unlabel(Config.PR_VALIDATED_LABEL)
    pr.unlabel(Config.BOT_CANNOT_MERGE_LABEL)

  # fetch before rebasing
  if base is not None:
    if re.match(r'^.\d*$', base):
      # base branch from ticket id
      branch_name = fetch_branch_from_issue(Issue(base))
    else:
      # base branch from branch name
      branch_name = base
  else:
    # base branch from upstream repository
    shell("git fetch --force " + Config.UPSTREAM_REPOSITORY, "Fetching upstream " + Config.UPSTREAM_REPOSITORY)
    origin_branch = branch_from_version(issue['version'])
    branch_name = Config.UPSTREAM_REPOSITORY + "/" + origin_branch

#  # First rebase without commits from source branch
#  # This is necessary because we may have modified the original commit and force pushed it
#  with NamedTemporaryFile(delete=False) as tmpscript:
#    script = """#!/bin/sh
#perl -i -ne "print unless s/^pick \w+ Fixes #(?!%(ticket)s)//" "$1"
#""" % { "ticket": str(issue.id)}
#    tmpscript.write(script.encode("utf-8"))
#    tmpscript.close()
#    os.system("chmod +x " + tmpscript.name)
#    # If it fails, let the interactive rebase try to fix it
#    shell("EDITOR=" + tmpscript.name + " git rebase -i " + branch_name, "First rebase to remove parent commits (EDITOR keeps only  \"pick Fixes #" + str(issue.id) + "\" lines)", fail_exit=False)
#    os.unlink(tmpscript.name)


  # Then interactive rebase
  shell("git rebase -i " + branch_name, "Rebasing")

  # if OK: continue
  shell("git push --force " + Config.OWN_REPOSITORY + " " + current_branch, "Pushing rebased branch")

  # reassign ticket
  if 'last_assignee' in issue:
    issue.update(user_id=issue['last_assignee'])

  # Message on PR if there is one
  if 'pr' in issue and issue['pr'] != '':
    update_pr(issue['pr'], "PR rebased", comment)
    print("Done, you can see the PR here: " + issue['pr'])
  else:
    print("No PR found.")


# close PR, rebase, push -f, create PR, update ticket
def retarget(version=None):
  issue = issue_from_branch(current_branch)

  # update ticket if required
  if version is not None:
    print("Changing target version in the ticket to " + version)
    issue.update_version(version)
    print("New version " + issue['version'])

  ticket_branch = branch_from_version(issue['version'])

  if 'pr' in issue and issue['pr'] != '':
    upstream_branch = get_pr_upstream(issue['pr'])
    if ticket_branch == upstream_branch:
      print("Ticket branch and PR branch match (" + upstream_branch + "), no need to retarget!")
      return

  # fetch before rebasing
  shell("git fetch " + Config.UPSTREAM_REPOSITORY, "Fetching upstream " + Config.UPSTREAM_REPOSITORY)

  # interactive rebase
  shell("git rebase -i " + Config.UPSTREAM_REPOSITORY + "/" + ticket_branch, "Rebasing")

  if 'pr' in issue and issue['pr'] != '':
    # if OK: continue
    shell("git push --force " + Config.OWN_REPOSITORY + " " + current_branch, "Pushing rebased branch")

    # create new PR
    pr_url = create_pr(ticket_branch, issue, "Replacing previous PR: " +  issue['pr'])

    if pr_url:
      # close old PR
      close_pr(issue['pr'], "PR replaced by " + pr_url)

      # update ticket
      user = None
      if 'last_assignee' in issue:
        user = issue['last_assignee']
      issue.update(user, pr_url)
      print("New PR URL: " + pr_url)


# ckeckout version, pull
def pull(version=None):
  global current_branch
  if version is not None:
    branch = branch_from_version(version)

    # branch if needed
    branch_status = shell("git branch --list " + branch, keep_output=True).strip()
    if branch_status == "":
      shell("git fetch " + Config.UPSTREAM_REPOSITORY, "Local branch doesn't exist, fetching from " + Config.UPSTREAM_REPOSITORY)
      remote_branch_status = os.popen("git branch --no-color --list --remote " + Config.UPSTREAM_REPOSITORY + "/" + branch)
      if remote_branch_status == "":
        logfail("No such branch on " + Config.UPSTREAM_REPOSITORY + ": " + branch + ", Aborting ...")
        exit(8)
      shell("git branch --track " + branch + " " + Config.UPSTREAM_REPOSITORY + "/" + branch, "Creating local branch")

    # checkout if needed
    if branch != current_branch:
      shell("git checkout " + branch, "Checkouting " + branch)
    current_branch = branch

  # Pull
  branch_detail = shell("git rev-parse --abbrev-ref --symbolic-full-name @{u}", keep_output=True).strip()
  if re.match(Config.UPSTREAM_REPOSITORY + '/.*', branch_detail):
    shell("git pull --ff-only", "Pulling on " + branch_detail)
  else:
    logfail("Your branch is not based on the same " + Config.UPSTREAM_REPOSITORY + " branch")
    exit(8)


# clone from NRM, fork, add OWN, set-upstream
def clone(name, fork=False):
  token = get_github_token(can_fail=True)
  if token is not None and Config.REMOTE_PROTOCOL == 'ssh':
    url = "git@github.com:Normation/"
  else:
    url = "https://github.com/Normation/"
  shell("git clone --origin " + Config.UPSTREAM_REPOSITORY + " " + url + name + ".git", "Cloning Normation repository")
  if fork:
    os.chdir(name)
    github_fork()


# takevover a ticket with an existing PR
def takeover(ticket):
  global current_branch
  issue = Issue(ticket)

  existing_branch = issue.existing_branch()
  if existing_branch is not None:
    logfail("***** ERROR: Can't take over a ticket with a matching branch already existing in your repository")
    exit(12)

  # fetch base branch
  current_branch = fetch_branch_from_issue(issue)

  # checkout the new branch
  shell("git checkout " + current_branch, "Checkouting the base branch " + current_branch)

  # same workflow as work in progress (but with an existing commit)
  commit_push(current_branch, "Work in progress", True)

  # Set ticket's status
  issue.to_in_progress("I'm taking over this issue!")

  print("")
  print("# Now you can edit files")
  print("# When you're ready, add them with git add")
  print("# Then type:")
  print(os.path.basename(sys.argv[0]) + " commit")
  print("")


# Merge remote version
first_merge=True
def merge_version(old, new, strategy=None, automatic=False, test=False):
  global current_branch
  pull(old)
  branch = branch_from_version(old)
  lifecycle = get_lifecycle()
  if test and not first_merge:
    test_branch = old + "_test"
    shell("git checkout " + test_branch)
    branch = test_branch
    current_branch = test_branch
  merge_branch(branch, new, strategy, automatic, new, test)

# Squash commits in PR if necessary
def pre_merge_squash(version, pr_branch, pr, test=False):
  global current_branch
  pull(version)
  log = shell("git log --pretty='format:%s' " + pr.base_branch() + ".." + pr_branch, keep_output=True)
  if re.search(r'^fixup! ', log, flags=re.MULTILINE):
    if not test:
      pr.comment("""OK, squash merging this PR""")
    shell("git checkout " + pr_branch, "Checkouting local PR branch")
    current_branch = pr_branch
    shell("EDITOR=true git rebase -i --autosquash " + pr.base_branch(), "Squashing multicommits")
    if not test:
      shell("git push --force " + pr.repo() + " " + pr_branch + ":" + pr.remote_branch(), "Pushing merged branch")
    return True
  return False

# Merge remote branch
def merge_branch(old, new, strategy=None, automatic=False, dsc_version="", test=False):
  global first_merge, current_branch


  currentDir = os.getcwd()
  isRudderPlugins = os.path.exists(currentDir + "/main-build.conf")
  isRudderPluginsPrivate = os.path.exists(currentDir + "/rudder-plugins")
  isPluginRepo = isRudderPlugins or isRudderPluginsPrivate

  # Check rudder-plugins-private old-branch modules are up to date
  if isRudderPluginsPrivate:
    shell("cd dsc/rudder-agent-windows && git fetch && git checkout "+old+" && git pull &&  cd ../.. && git add dsc/rudder-agent-windows " , "checkout rudder agent plugin to "+old, fail_exit=False)
    # rudder-plugins-private, checkout and update rudder-plugins to the same branch as we are currently on (new), as we can merge into branches, we need to use branch_from_version
    shell("cd rudder-plugins && git fetch && git checkout "+old+" && git pull && cd .. && git add rudder-plugins" , "checkout rudder plugin to " + old, fail_exit=False)
    opts = ""
    if automatic:
      opts += " --no-edit "
    shell(" git diff --quiet && git diff --quiet --staged || git commit -m 'Update submodules' "+opts, "Committing")
    if not test:
      shell("git push " + Config.UPSTREAM_REPOSITORY + " " + old, "Pushing merged branch")


  pull(new)
  opts = ""
  if strategy is not None:
    if strategy.startswith("upto_"):
      match = re.match(r'upto_(.*)', strategy)
      # Use "ours" strategy if are jut after the limit
      if old == branch_from_version(match.group(1)):
        opts = " -s ours"
      # no need for option otherwise: either we are before -> default option, or we are after and there should be nothing to merge
    else:
      opts = " -s " + strategy
  if automatic:
    opts += " --no-edit "
  if test:
    test_branch = new + "_test"
    shell("git branch -D " + test_branch + " 2>/dev/null || true")
    shell("git checkout -b " + test_branch)
    current_branch = test_branch

  # don't fail if merge fail within plugins, this is expected, especially on plugins-private
  shell("git merge " + old + opts, "Merging " + old + " into " + new, fail_exit=not isPluginRepo)

  # handle merge of specific rudder-plugins files
  if isPluginRepo:
    # rudder-plugins, only main-build.conf, we should ignore conflicts here and keep file as it is
    if isRudderPlugins:
      shell("git restore main-build.conf --ours && git add main-build.conf", "restore main build.conf", fail_exit=False)
    # rudder-plugins-private, checkout and update dsc to current top of branch (a branches/rudder/x.y), we use a new parameter (name parameter, that was unused and is now renamed to dsc_branch)
    new_branch = branch_from_version(new)
    if isRudderPluginsPrivate:
      shell("cd dsc/rudder-agent-windows && git fetch && git checkout "+new_branch+" && git pull &&  cd ../.. && git add dsc/rudder-agent-windows " , "checkout rudder agent plugin to "+new_branch, fail_exit=False)
      # rudder-plugins-private, checkout and update rudder-plugins to the same branch as we are currently on (new), as we can merge into branches, we need to use branch_from_version
      shell("cd rudder-plugins && git fetch && git checkout "+new_branch+" && git pull && cd .. && git add rudder-plugins" , "checkout rudder plugin to " + new_branch, fail_exit=False)
    opts = ""
    if automatic:
      opts += " --no-edit "
    shell(" git diff --quiet && git diff --quiet --staged || git commit -m 'Update submodules' "+opts, "Committing")
  first_merge = False
  if not test:
    shell("git push " + Config.UPSTREAM_REPOSITORY + " " + branch_from_version(new), "Pushing merged branch")


# Merge remote version automatically guessing the next one
def merge_to_next(old, strategy=None, automatic=False, test=False):
  new = get_next_version(old)
  merge_version(old, new, strategy, automatic, test)


# Merge all versions to next one
def merge_all(strategy=None, automatic=False, test=False):
  for version in get_versions()[0][:-1]:
    merge_to_next(version, strategy, automatic, test)

# Merge given issue into its target branch and upmerge
def merge_issue(issue_name=None, strategy=None, automatic=False, test=False, no_autosquash=False):
  # get issue info
  if issue_name is None:
    issue = issue_from_branch(current_branch)
  else:
    issue = Issue(issue_name)

  if 'pr' not in issue or issue['pr'] == '':
    logfail("There is no PR in this issue " + issue.name)
    exit(15)
  merge_pr(issue['pr'], strategy, automatic, test, no_autosquash)


# Merge given PR into its target branch and upmerge
def merge_pr(pr_url, strategy=None, automatic=False, test=False, no_autosquash=False):
  pr = PR(pr_url)

  # PR must be tested
  if pr.draft():
    print("This PR is a draft.")
    logfail("***** ERROR: PR is a draft. Exiting.")
    exit(15)

  # PR must be tested
  if not test and not pr.tests_passed():
    print("This PR tests haven't passed. You should not merge it.")
    logfail("***** ERROR: PR is not tested. Exiting.")
    if not Config.force:
      exit(15)

  # PR must be validated
  if not test and not pr.is_labeled(Config.PR_VALIDATED_LABEL) and pr.review_approval() is not True:
    print("This PR is not labeled '" + Config.PR_VALIDATED_LABEL + "' nor approved. You should not merge it.")
    logfail("***** ERROR: PR is not validated. Exiting.")
    if not Config.force:
      exit(15)

  # get merge info
  pr_branch = fetch_branch_from_pr(pr)
  version = version_from_branch(pr.base_branch())
  if version is None:
    logfail("**** ERROR: cannot guess version of branch " + pr.base_branch() + " Exiting.")
    exit(15)
  print("Target origin merge branch version found for PR: " + version)
  # squash commits
  squashed = False
  if not no_autosquash:
    squashed = pre_merge_squash(version, pr_branch, pr, test)
  if not test and not squashed:
    # comment for the merge if not already done
    pr.comment("""OK, merging this PR""")
  # regular merge
  merge_branch(pr_branch, version, strategy, automatic, pr.repo()+":"+pr.remote_branch(), test)
  # upmerge
  last_version = get_versions()[0][-1]
  while version != last_version:
    merge_to_next(version, strategy, automatic, test)
    version = get_next_version(version)


# Revert commit from ticket passed as parameter, use retarget to keep changes on next branch
def revert(ticket, retarget = False):
  issue = Issue(ticket, False)

  # Find merge commit id
  if 'pr' in issue and issue['pr'] != '':
    commit = get_pr_merge_commit(issue['pr'])
  else:
    logfail("There is no pull request linked to that issue, abort reverting")
    exit(1)

  pull(issue['version'])
  # If we retarget a change, we want only this change to be reverted, so ensure we have already merged the branch correctly
  if retarget:
    merge_to_next(issue['version'])

  # Reverting
  shell("git revert -m1 " + commit, "Reverting issue #" + str(issue.id) + ":" + issue["name"]+ ", commit: " + commit  )
  shell("git push " + Config.UPSTREAM_REPOSITORY, "Updating " + Config.UPSTREAM_REPOSITORY)

  # If we retarget that issue, merge it with ours strategy so the change is still present in next version
  if retarget:
    merge_to_next(issue['version'], "ours")


# Run a command on all branches
def find(command):
  status = {}
  for branch in get_versions()[0]:
    pull(branch)
    (status[branch], x, y) = shell(command, "Running your command", fail_exit=False)
  print("---")
  for branch in  get_versions()[0]:
    ok = "OK" if status[branch] == 0 else "ERR"
    print("%6s: %3s (%d)" % (branch, ok, status[branch]))
  print("---")


# cleanup branches
def cleanup(more=False, dry=False):
  shell("git fetch " + Config.UPSTREAM_REPOSITORY, "Fetching " + Config.UPSTREAM_REPOSITORY)
  pull('master') # necessary to avoid removal errors
  branch_list = shell("git branch --no-color --no-column", keep_output=True)
  for branch in branch_list.splitlines():
    m = re.match(r'^\s*((?:bug|dev|int|impl|ust|UST)_(\d+)/.*)\s*$', branch)
    if m:
      branch=m.group(1)
      ticket_id = m.group(2)
      print("#" + ticket_id + ": ", end='')
      issue = Issue(ticket_id)
      tickets_req = issue.server._query("/issues/" + str(ticket_id) + ".json")
      remove = False

      # guess if we should remove the branch
      if tickets_req.status_code == requests.codes.ok:
        ticket = tickets_req.json()['issue']
        # The ticket is closed -> probably
        if ticket['status']['id'] in Config.REDMINE_CLOSED_STATUSES:
          print("ticket closed, ", end='')
          version = None
          if 'fixed_version' in ticket:
            version = issue.server.major_or_master(ticket['fixed_version']['name'])
          should_ask = False
          if version is not None:
            upstream = Config.UPSTREAM_REPOSITORY + '/' + branch_from_version(version)
            (code, cherry_list, err) = shell("git cherry " + upstream + " " + branch + " 2>/dev/null", keep_output=True, fail_exit=False)
            if code == 0 and cherry_list == "":
              # Everything is merged -> YES
              print("commits merged upstream, ", end='')
              remove = True
            else:
              # Some commits mays not have been merged -> Ask the user
              print("some commits not merged upstream, ", end='')
              should_ask = True
          else:
            # Can't find upstream branch -> ask the user
            print("can't check upstream branch from ticket, ", end='')
            should_ask = True
          if more and should_ask:
            print(Config.REDMINE_API_URL + "/issues/" + ticket_id)
            print("Do you want to remove it ? [y/N]", end='')
            sys.stdout.flush() # to display previous unfinished line
            answer = sys.stdin.readline().strip().upper()
            if answer.upper() == "Y":
              remove = True
        # The ticket is open -> NO

      if remove:
        print("removing: " + branch)
        if not dry:
          shell("git branch -d " + branch, "Deleting local " + branch)
          shell("git push " + Config.OWN_REPOSITORY + " --delete " + branch, "Deleting remote " + branch, fail_exit=False)
      else:
        print("keeping: " + branch)


def update():
  my_path = os.path.abspath(__file__)
  with open(my_path, 'r') as fd:
    my_text = fd.read()

  data = requests.get(Config.RUDDER_DEV_ORIGIN)
  if data.status_code != requests.codes.ok:
    logfail("Cannot get last version of rudder-dev sorry!")
    exit(14)
  new_text = data.text

  # No update needed
  if my_text == new_text:
    print("No update needed!")
    (code, x, y) = shell("touch '" + my_path + "'", "Trying touch to avoid warnings", fail_exit=False)
    if code != 0:
      shell("sudo touch '" + my_path + "'", "Trying sudo touch to avoid warnings")
    exit(0)

  # Try to update rudder-dev with our access rights
  try:
    with open(my_path, 'w') as fd:
      fd.write(new_text)
  except Exception as e:
    # Try with sudo instead
    shell("cat <<'EOF' | sudo tee '" + my_path + "' > /dev/null \n" + new_text + "\nEOF\n")

  print("rudder-dev has been updated, well done!")


def check_update():
  my_path = os.path.abspath(__file__)
  # mtime = last modification = content changed
  ctime = os.path.getmtime(my_path)
  days_ago = (time.time() - ctime) / 60 / 60 / 24
  if days_ago > Config.WARN_FOR_UPDATE_AFTER:
    print("Your version of rudder-dev is old and probably needs an update, please run 'rudder-dev update'")
  # else everything is up to date


def blame(filename, long_format, before_commit, changed_from):
  commit_opt = ""
  if before_commit is not None:
    commit_opt = before_commit + "^1"
  reverse_opt = ""
  if changed_from is not None:
    reverse_opt = "--reverse " + changed_from + "..HEAD"
    # get ordered list of commit id
    commits = shell("git log --oneline --abbrev=7", keep_output=True) # abbrev used is +1 in git blame !!
    commit_list = commits.split("\n")
    commit_list = [ x.split(' ',1)[0] for x in commit_list ]
  blame = shell("git blame -s -w --abbrev=6 " + reverse_opt + " " + filename + " " + commit_opt, "Blaming file " + filename, keep_output=True)
  output = ""
  for line in blame.split('\n'):
    # extract commit id
    match = re.search(r'([0-9a-f]+?) (.*)', line)
    if match:
      commit_id = match.group(1)
      if commit_id == "0000000":
        log = "new"
      else:
        if changed_from is not None:
          # in reverse mode, git returns the previous commit, so replace it with the next one
          commit_index = commit_list.index(commit_id)
          if commit_index == 0:
            # we are on the last commit, so the line has never been changed
            log = "never"
            commit_id = "never"
          else:
            commit_id = commit_list[commit_list.index(commit_id)-1]
            log = shell("git show --oneline -s " + commit_id, keep_output=True).strip()
        else:
          log = shell("git show --oneline -s " + commit_id, keep_output=True).strip()
      # format the log line and add it to the blame
      if long_format:
        output += "%-65.65s %s\n" % (log, match.group(2))
      else:
        m = re.search(r'^([0-9a-f]+?) (Fixes|Merge|Ref).*?(\d{4,})', log)
        if m:
          logtype = ""
          if m.group(2) == "Fixes":
            logtype = "F"
          if m.group(2) == "Ref":
            logtype = "R"
          if m.group(2) == "Merge":
            logtype = "M"
          logline = m.group(1) + " " + logtype + " #" + m.group(3)
        else:
          logline = commit_id
        output += "%-20.20s %s\n" % (logline, match.group(2))
  print(output)


# Main loop
if __name__ == "__main__":
  arguments = docopt.docopt(__doc__)
  read_configuration("rudder-dev")

  # smart argument
  smart = arguments['<smart_arg>']
  arguments['back'] = False
  if smart:
    if smart == '-':
      arguments['back'] = True
    elif re.match(r'^i?\d{2,5}$', smart):
      arguments['branch'] = True
      arguments['<ticket_id>'] = smart
    elif re.match(r'^\d\.(\d+|x)$|^master$', smart):
      arguments['pull'] = True
      arguments['<branch>'] = smart
    elif re.match(r'^[A-Z]{3}$', smart):
      arguments['commit'] = True
      arguments['<trigraph>'] = smart
    else:
      logfail("Error, unknown <smartarg> " + smart)
      exit(10)

  # Force argument is a global variables
  Config.force = arguments['-f'] or arguments['--force']

  if not arguments['clone']: # this is the only exception
    # check repository and get common info
    (code, current_branch, error) = shell("git rev-parse --abbrev-ref HEAD", keep_output=True, fail_exit=False)
    current_branch = current_branch.strip()
    if code != 0:
      logfail("***** ERROR: Unable to get the current git branch name, this directory is probably not a git repository")
      exit(11)

  # check if update is needed
  check_update()

  # standard arguments
  if arguments['-d'] or arguments['--debug']:
    Config.LOGLEVEL = "debug"
  if arguments['back']:
    last_branch = get_cache_info("last_branch", remote_repo())
    if last_branch is None:
      logfail("***** ERROR: No last branch recorded.")
      exit(16)
    stash()
    shell("git checkout " + last_branch)
    stashed=True # automatically unstash previous work
    unstash()
  elif arguments['clone']:
    clone(arguments['<repository_name>'], arguments['--fork'])
  elif arguments['pull']:
    stash()
    pull(arguments['<branch>'])
    unstash()
    stash_info()
  elif arguments['branch']:
    stash()
    create_branch(arguments['<ticket_id>'], arguments['--base'])
    unstash()
    stash_info()
  elif arguments['quickfix']:
    arg = arguments['<pr_url/file>']
    if arg.startswith("http"):
      # subject / trigraph is for documentation, docopt makes no difference
      if arguments['<trigraph>'] is not None:
        logfail("You must not pass subject for an existing PR fix")
        exit(3)
      quick_fix_pr(arg, arguments['<subject>'], arguments['--merge'])
    else:
      if arguments['--merge']:
        logfail("You cannot merge your own quickfix")
        exit(3)
      quick_fix_local(arg, arguments['<subject>'], arguments['<trigraph>'])
  elif arguments['technique']:
    technique(arguments['<version>'], arguments['<comment>'])
  elif arguments['subtask']:
    stash()
    subtask(arguments['<next_branch>'], arguments['<new_title>'], arguments['--base'], arguments['--bug'])
    unstash()
    stash_info()
  elif arguments['wip']:
    wip()
  elif arguments['commit']:
    commit(arguments['<trigraph>'], arguments['<PR_comment>'], arguments['--nopr'])
  elif arguments['amend']:
    amend(arguments['<PR_comment>'])
  elif arguments['fixup']:
    fixup(arguments['<PR_comment>'])
  elif arguments['rebase']:
    rebase(arguments['<PR_comment>'], arguments['--base'])
  elif arguments['retarget']:
    retarget(arguments['<target_version>'])
  elif arguments['takeover']:
    stash()
    takeover(arguments['<ticket_id>'])
    unstash()
    stash_info()
  elif arguments['revert']:
    revert(arguments['<ticket_id>'], arguments['--retarget'])
  elif arguments['merge']:
    stash()
    before_merge_branch = current_branch
    automatic = arguments['-a'] or arguments['--automatic']
    test = arguments['-t'] or arguments['--test']
    if arguments['all']: # rudder-dev merge all
      merge_all(arguments['<strategy>'], automatic, test)
    elif arguments['<first_branch>'] is not None: # rudder-dev merge first next
      merge_version(arguments['<first_branch>'], arguments['<next_branch>'], arguments['<strategy>'], automatic, test)
    elif arguments['<first_branch/ticket_id/pr_url>'] is None: # rudder-dev merge
      merge_issue(strategy=arguments['<strategy>'], automatic=automatic, test=test, no_autosquash=arguments['--no-autosquash'])
    elif re.match(r'^i?\d{3,}$', arguments['<first_branch/ticket_id/pr_url>']): # 3+ digits -> ticket_id: rudder-dev merge ticket_id
      merge_issue(arguments['<first_branch/ticket_id/pr_url>'], arguments['<strategy>'], automatic, test)
    elif re.match(r'^https?://', arguments['<first_branch/ticket_id/pr_url>']): # starts with http:// -> pr_url: rudder-dev merge pr
      merge_pr(arguments['<first_branch/ticket_id/pr_url>'], arguments['<strategy>'], automatic, test, arguments['--no-autosquash'])
    else: # rudder-dev merge first
      merge_to_next(arguments['<first_branch/ticket_id/pr_url>'], arguments['<strategy>'], automatic, test)
    shell("git checkout " + before_merge_branch, "Going back to the branch we were before merge")
    unstash()
  elif arguments['find']:
    find(arguments['<command>'])
  elif arguments['cleanup']:
    cleanup(arguments['--more'], arguments['-n'] or arguments['--dry-run'])
  elif arguments['update']:
    update()
  elif arguments['blame']:
    blame(arguments['<file>'], arguments['--long'], arguments['--before'], arguments['--changed-after'])