Project

User-facing class for interacting with a REDCap Project

Project (Arms, DataAccessGroups, Events, FieldNames, Files, Instruments, Logging, Metadata, ProjectInfo, Records, Repeating, Reports, Surveys, Users, UserRoles, Version)

Main class for interacting with REDCap projects

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| verify_ssl | | Verify SSL, default True. Can pass path to CA_BUNDLE |

!!! note Your REDCap token should be kept secret! Treat it like a password and NEVER save it directly in your script/application. Rather, it should be obscured and retrieved 'behind the scenes', for example by saving the token as an environment variable and retrieving it with os.getenv. The creation of the TOKEN string in the example below is not shown, for the above reasons.
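A minimal sketch of that pattern (the environment variable name REDCAP_API_TOKEN is an arbitrary choice, not something this library defines; passing verify_ssl a CA bundle path at construction is an assumption based on the attribute listing above):

>>> import os
>>> from redcap import Project
>>> URL = "https://redcapdemo.vanderbilt.edu/api/"
>>> TOKEN = os.getenv("REDCAP_API_TOKEN")  # hypothetical variable name, set outside the script
>>> proj = Project(URL, TOKEN)
>>> # If your server uses an institutional certificate, point verify_ssl at a CA bundle (hypothetical path)
>>> proj_custom_ca = Project(URL, TOKEN, verify_ssl="/etc/ssl/certs/my_ca_bundle.pem")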

Examples:

>>> from redcap import Project
>>> URL = "https://redcapdemo.vanderbilt.edu/api/"
>>> proj = Project(URL, TOKEN)
>>> proj.field_names
['record_id', 'field_1', 'checkbox_field', 'upload_field']
>>> proj.is_longitudinal
True
>>> proj.def_field
'record_id'

The url and token attributes are read-only, to prevent users from accidentally overwriting them

>>> proj.url = "whoops"
Traceback (most recent call last):
...
AttributeError: ...
Source code in redcap/project.py
class Project(
    methods.arms.Arms,
    methods.data_access_groups.DataAccessGroups,
    methods.events.Events,
    methods.field_names.FieldNames,
    methods.files.Files,
    methods.instruments.Instruments,
    methods.logging.Logging,
    methods.metadata.Metadata,
    methods.project_info.ProjectInfo,
    methods.records.Records,
    methods.repeating.Repeating,
    methods.reports.Reports,
    methods.surveys.Surveys,
    methods.users.Users,
    methods.user_roles.UserRoles,
    methods.version.Version,
):
    """Main class for interacting with REDCap projects

    Attributes:
        verify_ssl: Verify SSL, default True. Can pass path to CA_BUNDLE

    Note:
        Your REDCap token should be kept **secret**! Treat it like a password
        and NEVER save it directly in your script/application. Rather it should be obscured
        and retrieved 'behind the scenes'. For example, saving the token as an environment
        variable and retrieving it with `os.getenv`. The creation of the `TOKEN` string in
        the example is not shown, for the above reasons

    Examples:
        >>> from redcap import Project
        >>> URL = "https://redcapdemo.vanderbilt.edu/api/"
        >>> proj = Project(URL, TOKEN)
        >>> proj.field_names
        ['record_id', 'field_1', 'checkbox_field', 'upload_field']
        >>> proj.is_longitudinal
        True
        >>> proj.def_field
        'record_id'

        The url and token attributes are read-only, to prevent users from accidentally
        overwriting them
        >>> proj.url = "whoops"
        Traceback (most recent call last):
        ...
        AttributeError: ...
    """

    @property
    def redcap_version(self) -> Optional[semantic_version.Version]:
        """REDCap version of the Project"""
        self._redcap_version: Optional[semantic_version.Version]
        try:
            return self._redcap_version
        except AttributeError:
            # weird pylint bug on windows where it can't find Version.export_version()
            # possibly too many parents it's inheriting from? We also need to disable
            # useless-suppression since this is a windows-only issue
            # pylint: disable=no-member,useless-suppression
            self._redcap_version = self.export_version()
            # pylint: enable=no-member,useless-suppression
            return self._redcap_version

def_field: str inherited property readonly

The 'record_id' field equivalent for a project

field_names: List[str] inherited property readonly

Project field names

!!! note These are survey field names, not export field names

forms: List[str] inherited property readonly

Project form names

is_longitudinal: bool inherited property readonly

Whether or not this project is longitudinal

metadata: Json inherited property readonly

Project metadata in JSON format

redcap_version: Optional[semantic_version.base.Version] property readonly

REDCap version of the Project
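Because the value is a semantic_version.Version (fetched lazily via export_version and cached, per the source above), it supports ordinary comparisons. A small sketch, with an arbitrary version threshold:

>>> import semantic_version
>>> if proj.redcap_version >= semantic_version.Version("12.0.0"):
...     pass  # e.g. enable a feature that needs a newer REDCap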

token: str inherited property readonly

API token to a project

url: str inherited property readonly

API URL to a REDCap server

delete_arms(self, arms, return_format_type='json') inherited

Delete Arms from the Project

!!! note Because of this method's destructive nature, it is only available for use for projects in Development status. Additionally, please be aware that deleting an arm also automatically deletes all events that belong to that arm, and will also automatically delete any records/data that have been collected under that arm (this is non-reversible data loss). This only works for longitudinal projects.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| arms | List[str] | List of arm numbers to delete from the project | required |
| return_format_type | Literal['json', 'csv', 'xml'] | Response format. By default, response will be json-decoded. | 'json' |

Returns:

| Type | Description |
| --- | --- |
| Union[int, str] | Number of arms deleted |

Examples:

Create a new arm

>>> new_arm = [{"arm_num": 2, "name": "Arm 2"}]
>>> proj.import_arms(new_arm)
1

Delete the new arm

>>> proj.delete_arms([2])
1
Source code in redcap/project.py
def delete_arms(
    self,
    arms: List[str],
    return_format_type: Literal["json", "csv", "xml"] = "json",
):
    """
    Delete Arms from the Project

    Note:
        Because of this method's destructive nature, it is only available
        for use for projects in Development status.
        Additionally, please be aware that deleting an arm also automatically
        deletes all events that belong to that arm, and will also automatically
        delete any records/data that have been collected under that arm
        (this is non-reversible data loss).
        This only works for longitudinal projects.

    Args:
        arms: List of arm numbers to delete from the project
        return_format_type:
            Response format. By default, response will be json-decoded.

    Returns:
        Union[int, str]: Number of arms deleted

    Examples:
        Create a new arm
        >>> new_arm = [{"arm_num": 2, "name": "Arm 2"}]
        >>> proj.import_arms(new_arm)
        1

        Delete the new arm
        >>> proj.delete_arms([2])
        1
    """
    payload = self._initialize_payload(
        content="arm", return_format_type=return_format_type
    )
    payload["action"] = "delete"
    # Turn list of arms into dict, and append to payload
    arms_dict = {f"arms[{ idx }]": arm for idx, arm in enumerate(arms)}
    payload.update(arms_dict)

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="delete"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response

delete_dags(self, dags, return_format_type='json') inherited

Delete dags from the project.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| dags | List[str] | List of dags to delete from the project | required |
| return_format_type | Literal['json', 'csv', 'xml'] | Response format. By default, response will be json-decoded. | 'json' |

Returns:

| Type | Description |
| --- | --- |
| Union[int, str] | Number of dags deleted |

Examples:

Create a new data access group

>>> new_dag = [{"data_access_group_name": "New DAG", "unique_group_name": ""}]
>>> proj.import_dags(new_dag)
1

We know that 'New DAG' will automatically be assigned 'new_dag' as its unique group name

>>> proj.delete_dags(["new_dag"])
1
Source code in redcap/project.py
def delete_dags(
    self,
    dags: List[str],
    return_format_type: Literal["json", "csv", "xml"] = "json",
):
    """
    Delete dags from the project.

    Args:
        dags: List of dags to delete from the project
        return_format_type:
            Response format. By default, response will be json-decoded.

    Returns:
        Union[int, str]: Number of dags deleted

    Examples:
        Create a new data access group
        >>> new_dag = [{"data_access_group_name": "New DAG", "unique_group_name": ""}]
        >>> proj.import_dags(new_dag)
        1

        We know that 'New DAG' will automatically be assigned 'new_dag' as its
        unique group name
        >>> proj.delete_dags(["new_dag"])
        1
    """
    payload = self._initialize_payload(
        content="dag", return_format_type=return_format_type
    )
    payload["action"] = "delete"
    # Turn list of dags into dict, and append to payload
    dags_dict = {f"dags[{ idx }]": dag for idx, dag in enumerate(dags)}
    payload.update(dags_dict)

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="delete"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))
    return response

delete_events(self, events, return_format_type='json') inherited

Delete Events from the Project

!!! note Because of this method's destructive nature, it is only available for use for projects in Development status. Additionally, please be aware that deleting an event will automatically delete any records/data that have been collected under that event (this is non-reversible data loss). This only works for longitudinal projects.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| events | List[str] | List of unique event names to delete from the project | required |
| return_format_type | Literal['json', 'csv', 'xml'] | Response format. By default, response will be json-decoded. | 'json' |

Returns:

| Type | Description |
| --- | --- |
| Union[int, str] | Number of events deleted |

Examples:

Create a new event

>>> new_event = [{"event_name": "Event 2", "arm_num": "1"}]
>>> proj.import_events(new_event)
1

Delete the new event

>>> proj.delete_events(["event_2_arm_1"])
1
Source code in redcap/project.py
def delete_events(
    self,
    events: List[str],
    return_format_type: Literal["json", "csv", "xml"] = "json",
):
    """
    Delete Events from the Project

    Note:
        Because of this method's destructive nature, it is only available
        for use for projects in Development status.
        Additionally, please be aware that deleting an event will automatically
        delete any records/data that have been collected under that event
        (this is non-reversible data loss).
        This only works for longitudinal projects.

    Args:
        events: List of unique event names to delete from the project
        return_format_type:
            Response format. By default, response will be json-decoded.

    Returns:
        Union[int, str]: Number of events deleted

    Examples:
        Create a new event
        >>> new_event = [{"event_name": "Event 2", "arm_num": "1"}]
        >>> proj.import_events(new_event)
        1

        Delete the new event
        >>> proj.delete_events(["event_2_arm_1"])
        1
    """
    payload = self._initialize_payload(
        content="event", return_format_type=return_format_type
    )
    payload["action"] = "delete"
    # Turn list of events into dict, and append to payload
    events_dict = {f"events[{ idx }]": event for idx, event in enumerate(events)}
    payload.update(events_dict)

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="delete"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response

delete_file(self, record, field, event=None) inherited

Delete a file from REDCap

!!! note There is no undo button for this.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| record | str | Record ID | required |
| field | str | Field name | required |
| event | Optional[str] | For longitudinal projects, the unique event name | None |

Returns:

| Type | Description |
| --- | --- |
| List[dict] | Empty JSON object |

Exceptions:

| Type | Description |
| --- | --- |
| ValueError | Incorrect file field |
| RedcapError | Bad Request e.g. invalid record_id |

Examples:

Import a tempfile and then delete it

>>> import tempfile
>>> tmp_file = tempfile.TemporaryFile()
>>> proj.import_file(
...     record="2",
...     field="upload_field",
...     file_name="myupload.txt",
...     file_object=tmp_file,
...     event="event_1_arm_1",
... )
[{}]
>>> proj.delete_file(record="2", field="upload_field", event="event_1_arm_1")
[{}]
Source code in redcap/project.py
def delete_file(
    self,
    record: str,
    field: str,
    event: Optional[str] = None,
) -> EmptyJson:
    """
    Delete a file from REDCap

    Note:
        There is no undo button for this.

    Args:
        record: Record ID
        field: Field name
        event: For longitudinal projects, the unique event name

    Returns:
        Empty JSON object

    Raises:
        ValueError: Incorrect file field
        RedcapError: Bad Request e.g. invalid record_id

    Examples:
        Import a tempfile and then delete it

        >>> import tempfile
        >>> tmp_file = tempfile.TemporaryFile()
        >>> proj.import_file(
        ...     record="2",
        ...     field="upload_field",
        ...     file_name="myupload.txt",
        ...     file_object=tmp_file,
        ...     event="event_1_arm_1",
        ... )
        [{}]
        >>> proj.delete_file(record="2", field="upload_field", event="event_1_arm_1")
        [{}]
    """
    self._check_file_field(field)
    # Load up payload
    payload = self._initialize_payload(content="file")
    payload["action"] = "delete"
    payload["record"] = record
    payload["field"] = field
    if event:
        payload["event"] = event

    return cast(
        EmptyJson, self._call_api(payload=payload, return_type="empty_json")
    )

delete_records(self, records, return_format_type='json') inherited

Delete records from the project.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| records | List[str] | List of record IDs to delete from the project | required |
| return_format_type | Literal['json', 'csv', 'xml'] | Response format. By default, response will be json-decoded. | 'json' |

Returns:

| Type | Description |
| --- | --- |
| Union[int, str] | Number of records deleted |

Examples:

>>> new_records = [
...     {"record_id": 3, "redcap_repeat_instance": 1, "field_1": 1},
...     {"record_id": 4, "redcap_repeat_instance": 1}
... ]
>>> proj.import_records(new_records)
{'count': 2}
>>> proj.delete_records(["3", "4"])
2
Source code in redcap/project.py
def delete_records(
    self,
    records: List[str],
    return_format_type: Literal["json", "csv", "xml"] = "json",
):
    """
    Delete records from the project.

    Args:
        records: List of record IDs to delete from the project
        return_format_type:
            Response format. By default, response will be json-decoded.

    Returns:
        Union[int, str]: Number of records deleted

    Examples:
        >>> new_records = [
        ...     {"record_id": 3, "redcap_repeat_instance": 1, "field_1": 1},
        ...     {"record_id": 4, "redcap_repeat_instance": 1}
        ... ]
        >>> proj.import_records(new_records)
        {'count': 2}
        >>> proj.delete_records(["3", "4"])
        2
    """
    payload = self._initialize_payload(
        content="record", return_format_type=return_format_type
    )
    payload["action"] = "delete"
    # Turn list of records into dict, and append to payload
    records_dict = {
        f"records[{ idx }]": record for idx, record in enumerate(records)
    }
    payload.update(records_dict)

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="delete"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))
    return response

delete_user_roles(self, roles, return_format_type='json') inherited

Delete user roles from the project.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| roles | List[str] | List of user roles to delete from the project | required |
| return_format_type | Literal['json', 'csv', 'xml'] | Response format. By default, response will be json-decoded. | 'json' |

Returns:

| Type | Description |
| --- | --- |
| Union[int, str] | Number of user roles deleted |

Examples:

Create a new user role

>>> new_role = [{"role_label": "New Role"}]
>>> proj.import_user_roles(new_role)
1

We don't know what the 'unique_role_name' is for the newly created role, so we have to look it up by 'role_label'

>>> roles = proj.export_user_roles()
>>> new_role_id = [
...     role for role in roles
...     if role["role_label"] == "New Role"
... ][0]["unique_role_name"]

Delete the role

>>> proj.delete_user_roles([new_role_id])
1
Source code in redcap/project.py
def delete_user_roles(
    self,
    roles: List[str],
    return_format_type: Literal["json", "csv", "xml"] = "json",
):
    """
    Delete user roles from the project.

    Args:
        roles: List of user roles to delete from the project
        return_format_type:
            Response format. By default, response will be json-decoded.

    Returns:
        Union[int, str]: Number of user roles deleted

    Examples:
        Create a new user role
        >>> new_role = [{"role_label": "New Role"}]
        >>> proj.import_user_roles(new_role)
        1

        We don't know what the 'unique_role_name' is for the newly created role,
        so we have to look it up by 'role_label'
        >>> roles = proj.export_user_roles()
        >>> new_role_id = [
        ...     role for role in roles
        ...     if role["role_label"] == "New Role"
        ... ][0]["unique_role_name"]

        Delete the role
        >>> proj.delete_user_roles([new_role_id])
        1
    """
    payload = self._initialize_payload(
        content="userRole", return_format_type=return_format_type
    )
    payload["action"] = "delete"
    # Turn list of user roles into dict, and append to payload
    roles_dict = {f"roles[{ idx }]": role for idx, role in enumerate(roles)}
    payload.update(roles_dict)

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="delete"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))
    return response

delete_users(self, users, return_format_type='json') inherited

Delete users from the project.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| users | List[str] | List of usernames to delete from the project | required |
| return_format_type | Literal['json', 'csv', 'xml'] | Response format. By default, response will be json-decoded. | 'json' |

Returns:

| Type | Description |
| --- | --- |
| Union[int, str] | Number of users deleted |

Examples:

>>> new_user = [{"username": "pandeharris@gmail.com"}]
>>> proj.import_users(new_user)
1
>>> proj.delete_users(["pandeharris@gmail.com"], return_format_type="xml")
'1'
Source code in redcap/project.py
def delete_users(
    self,
    users: List[str],
    return_format_type: Literal["json", "csv", "xml"] = "json",
):
    """
    Delete users from the project.

    Args:
        users: List of usernames to delete from the project
        return_format_type:
            Response format. By default, response will be json-decoded.

    Returns:
        Union[int, str]: Number of users deleted

    Examples:
        >>> new_user = [{"username": "pandeharris@gmail.com"}]
        >>> proj.import_users(new_user)
        1
        >>> proj.delete_users(["pandeharris@gmail.com"], return_format_type="xml")
        '1'
    """
    payload = self._initialize_payload(
        content="user", return_format_type=return_format_type
    )
    payload["action"] = "delete"
    # Turn list of users into dict, and append to payload
    users_dict = {f"users[{ idx }]": user for idx, user in enumerate(users)}
    payload.update(users_dict)

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="delete"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))
    return response

export_arms(self, format_type='json', arms=None) inherited

Export the Arms of the Project

!!! note This only works for longitudinal projects.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| format_type | Literal['json', 'csv', 'xml', 'df'] | Response return format | 'json' |
| arms | Optional[List[str]] | An array of arm numbers that you wish to pull arms for (by default, all arms are pulled) | None |

Returns:

| Type | Description |
| --- | --- |
| Union[List[Dict[str, Any]], str, pandas.DataFrame] | List of Arms |

Examples:

>>> proj.export_arms()
[{'arm_num': 1, 'name': 'Arm 1'}]
Source code in redcap/project.py
def export_arms(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    arms: Optional[List[str]] = None,
):
    # pylint: disable=line-too-long
    """
    Export the Arms of the Project

    Note:
        This only works for longitudinal projects.

    Args:
        format_type:
            Response return format
        arms:
            An array of arm numbers that you wish to pull arms for
            (by default, all arms are pulled)

    Returns:
        Union[List[Dict[str, Any]], str, pandas.DataFrame]: List of Arms

    Examples:
        >>> proj.export_arms()
        [{'arm_num': 1, 'name': 'Arm 1'}]
    """
    # pylint:enable=line-too-long
    payload = self._initialize_payload(content="arm", format_type=format_type)
    if arms:
        # Turn list of arms into dict, and append to payload
        arms_dict = {f"arms[{ idx }]": arm for idx, arm in enumerate(arms)}
        payload.update(arms_dict)
    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="arm",
        format_type=format_type,
    )

export_dags(self, format_type='json', df_kwargs=None) inherited

Export the DAGs of the Project

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| format_type | Literal['json', 'csv', 'xml', 'df'] | Response return format | 'json' |
| df_kwargs | Optional[Dict[str, Any]] | Passed to pandas.read_csv to control construction of returned DataFrame. By default, nothing | None |

Returns:

| Type | Description |
| --- | --- |
| Union[List[Dict[str, Any]], str, pandas.DataFrame] | List of DAGs |

Examples:

>>> proj.export_dags()
[{'data_access_group_name': 'Test DAG', 'unique_group_name': 'test_dag', 'data_access_group_id': ...}]
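The df_kwargs argument is handed to pandas.read_csv, so any of its keyword arguments can shape the resulting DataFrame. A sketch (assuming pandas is installed and the project has DAGs, as in the example above):

>>> dag_df = proj.export_dags(
...     format_type="df",
...     df_kwargs={"index_col": "unique_group_name"},
... )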
Source code in redcap/project.py
def export_dags(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    # pylint: disable=line-too-long
    """
    Export the DAGs of the Project

    Args:
        format_type:
            Response return format
        df_kwargs:
            Passed to `pandas.read_csv` to control construction of
            returned DataFrame. By default, nothing

    Returns:
        Union[List[Dict[str, Any]], str, pandas.DataFrame]: List of DAGs

    Examples:
        >>> proj.export_dags()
        [{'data_access_group_name': 'Test DAG', 'unique_group_name': 'test_dag', 'data_access_group_id': ...}]
    """
    # pylint:enable=line-too-long
    payload = self._initialize_payload(content="dag", format_type=format_type)
    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="dag",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )

export_events(self, format_type='json', arms=None) inherited

Export the Events of the Project

!!! note This only works for longitudinal projects.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| format_type | Literal['json', 'csv', 'xml', 'df'] | Response return format | 'json' |
| arms | Optional[List[str]] | An array of arm numbers that you wish to pull events for (by default, all events are pulled) | None |

Returns:

| Type | Description |
| --- | --- |
| Union[List[Dict[str, Any]], str, pandas.DataFrame] | List of Events |

Examples:

>>> proj.export_events()
[{'event_name': 'Event 1', 'arm_num': 1, 'unique_event_name': 'event_1_arm_1',
'custom_event_label': '', 'event_id': ...}, {'event_name': 'Event 2', ...}]
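To restrict the export to particular arms, pass their arm numbers. A sketch assuming arm 1 exists, as in the export_arms example:

>>> arm_1_events = proj.export_events(arms=["1"])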
Source code in redcap/project.py
def export_events(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    arms: Optional[List[str]] = None,
):
    # pylint: disable=line-too-long
    """
    Export the Events of the Project

    Note:
        This only works for longitudinal projects.

    Args:
        format_type:
            Response return format
        arms:
            An array of arm numbers that you wish to pull events for
            (by default, all events are pulled)

    Returns:
        Union[List[Dict[str, Any]], str, pandas.DataFrame]: List of Events

    Examples:
        >>> proj.export_events()
        [{'event_name': 'Event 1', 'arm_num': 1, 'unique_event_name': 'event_1_arm_1',
        'custom_event_label': '', 'event_id': ...}, {'event_name': 'Event 2', ...}]
    """
    # pylint:enable=line-too-long
    payload = self._initialize_payload(content="event", format_type=format_type)
    if arms:
        # Turn list of arms into dict, and append to payload
        arms_dict = {f"arms[{ idx }]": arm for idx, arm in enumerate(arms)}
        payload.update(arms_dict)
    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="event",
        format_type=format_type,
    )

export_field_names(self, format_type='json', field=None, df_kwargs=None) inherited

Export the project's export field names

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| format_type | Literal['json', 'csv', 'xml', 'df'] | Return the metadata in native objects, csv or xml. 'df' will return a pandas.DataFrame | 'json' |
| field | Optional[str] | Limit exported field name to this field (only single field supported). When not provided, all fields returned | None |
| df_kwargs | Optional[Dict[str, Any]] | Passed to pandas.read_csv to control construction of returned DataFrame. By default {'index_col': 'original_field_name'} | None |

Returns:

| Type | Description |
| --- | --- |
| Union[str, List[Dict[str, Any]], "pd.DataFrame"] | Metadata structure for the project. |

Examples:

>>> proj.export_field_names()
[{'original_field_name': 'record_id', 'choice_value': '', 'export_field_name': 'record_id'},
{'original_field_name': 'field_1', 'choice_value': '', 'export_field_name': 'field_1'},
{'original_field_name': 'checkbox_field', 'choice_value': '1', 'export_field_name': 'checkbox_field___1'},
{'original_field_name': 'checkbox_field', 'choice_value': '2', 'export_field_name': 'checkbox_field___2'},
{'original_field_name': 'form_1_complete', 'choice_value': '', 'export_field_name': 'form_1_complete'}]
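The field argument narrows the export to the names generated for a single field. A sketch using the checkbox field shown above:

>>> checkbox_names = proj.export_field_names(field="checkbox_field")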
Source code in redcap/project.py
def export_field_names(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    field: Optional[str] = None,
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    # pylint: disable=line-too-long
    """
    Export the project's export field names

    Args:
        format_type:
            Return the metadata in native objects, csv or xml.
            `'df'` will return a `pandas.DataFrame`
        field:
            Limit exported field name to this field (only single field supported).
            When not provided, all fields returned
        df_kwargs:
            Passed to `pandas.read_csv` to control construction of
            returned DataFrame.
            by default `{'index_col': 'original_field_name'}`

    Returns:
        Union[str, List[Dict[str, Any]], "pd.DataFrame"]: Metadata structure for the project.

    Examples:
        >>> proj.export_field_names()
        [{'original_field_name': 'record_id', 'choice_value': '', 'export_field_name': 'record_id'},
        {'original_field_name': 'field_1', 'choice_value': '', 'export_field_name': 'field_1'},
        {'original_field_name': 'checkbox_field', 'choice_value': '1', 'export_field_name': 'checkbox_field___1'},
        {'original_field_name': 'checkbox_field', 'choice_value': '2', 'export_field_name': 'checkbox_field___2'},
        {'original_field_name': 'form_1_complete', 'choice_value': '', 'export_field_name': 'form_1_complete'}]
    """
    # pylint: enable=line-too-long
    payload = self._initialize_payload(
        content="exportFieldNames", format_type=format_type
    )

    if field:
        payload["field"] = field

    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="exportFieldNames",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )

export_file(self, record, field, event=None, repeat_instance=None) inherited

Export the contents of a file stored for a particular record

!!! note Unlike other export methods, this only works on a single record.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| record | str | Record ID | required |
| field | str | Field name containing the file to be exported. | required |
| event | Optional[str] | For longitudinal projects, the unique event name | None |
| repeat_instance | Optional[int] | (Only for projects with repeating instruments/events) The repeat instance number of the repeating event (if longitudinal) or the repeating instrument (if classic or longitudinal). | None |

Returns:

| Type | Description |
| --- | --- |
| Tuple[bytes, dict] | Content of the file and content-type dictionary |

Exceptions:

| Type | Description |
| --- | --- |
| ValueError | Incorrect file field |
| RedcapError | Bad Request e.g. invalid record_id |

Examples:

If your project has events, then you must specify the event of interest. Otherwise, you can leave the event parameter blank.

>>> proj.export_file(record="1", field="upload_field", event="event_1_arm_1")
(b'test upload\n', {'name': 'test_upload.txt', 'charset': 'UTF-8'})
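Since the method returns the raw bytes plus the content-type metadata, a common follow-up is writing the file to disk. A sketch (the fallback file name is arbitrary):

>>> content, headers = proj.export_file(
...     record="1", field="upload_field", event="event_1_arm_1"
... )
>>> with open(headers.get("name", "exported_file.txt"), "wb") as file_obj:
...     _ = file_obj.write(content)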
Source code in redcap/project.py
def export_file(
    self,
    record: str,
    field: str,
    event: Optional[str] = None,
    repeat_instance: Optional[int] = None,
) -> FileMap:
    """
    Export the contents of a file stored for a particular record

    Note:
        Unlike other export methods, this only works on a single record.

    Args:
        record: Record ID
        field: Field name containing the file to be exported.
        event: For longitudinal projects, the unique event name
        repeat_instance:
            (Only for projects with repeating instruments/events)
            The repeat instance number of the repeating event (if longitudinal)
            or the repeating instrument (if classic or longitudinal).

    Returns:
        Content of the file and content-type dictionary

    Raises:
        ValueError: Incorrect file field
        RedcapError: Bad Request e.g. invalid record_id

    Examples:
        If your project has events, then you must specify the event of interest.
        Otherwise, you can leave the event parameter blank

        >>> proj.export_file(record="1", field="upload_field", event="event_1_arm_1")
        (b'test upload\\n', {'name': 'test_upload.txt', 'charset': 'UTF-8'})
    """
    self._check_file_field(field)
    # load up payload
    payload = self._initialize_payload(content="file")
    # there's no format field in this call
    payload["action"] = "export"
    payload["field"] = field
    payload["record"] = record
    if event:
        payload["event"] = event
    if repeat_instance:
        payload["repeat_instance"] = str(repeat_instance)
    content, headers = cast(
        FileMap, self._call_api(payload=payload, return_type="file_map")
    )
    # REDCap adds some useful things in content-type
    content_map = {}
    if "content-type" in headers:
        splat = [
            key_values.strip() for key_values in headers["content-type"].split(";")
        ]
        key_values = [
            (key_values.split("=")[0], key_values.split("=")[1].replace('"', ""))
            for key_values in splat
            if "=" in key_values
        ]
        content_map = dict(key_values)

    return content, content_map

export_instrument_event_mappings(self, format_type='json', arms=None, df_kwargs=None) inherited

Export the project's instrument to event mapping

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| format_type | Literal['json', 'csv', 'xml', 'df'] | Return the form event mappings in native objects, csv or xml. 'df' will return a pandas.DataFrame | 'json' |
| arms | Optional[List[str]] | Limit exported form event mappings to these arms | None |
| df_kwargs | Optional[Dict[str, Any]] | Passed to pandas.read_csv to control construction of returned DataFrame | None |

Returns:

| Type | Description |
| --- | --- |
| Union[str, List[Dict[str, Any]], pd.DataFrame] | Instrument-event mapping for the project |

Examples:

>>> proj.export_instrument_event_mappings()
[{'arm_num': 1, 'unique_event_name': 'event_1_arm_1', 'form': 'form_1'}]
Source code in redcap/project.py
def export_instrument_event_mappings(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    arms: Optional[List[str]] = None,
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    """
    Export the project's instrument to event mapping

    Args:
        format_type:
            Return the form event mappings in native objects,
            csv or xml, `'df'` will return a `pandas.DataFrame`
        arms: Limit exported form event mappings to these arms
        df_kwargs:
            Passed to pandas.read_csv to control construction of
            returned DataFrame

    Returns:
        Union[str, List[Dict[str, Any]], pd.DataFrame]: Instrument-event mapping for the project

    Examples:
        >>> proj.export_instrument_event_mappings()
        [{'arm_num': 1, 'unique_event_name': 'event_1_arm_1', 'form': 'form_1'}]
    """
    payload = self._initialize_payload(
        content="formEventMapping", format_type=format_type
    )

    if arms:
        for i, value in enumerate(arms):
            payload[f"arms[{ i }]"] = value

    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="formEventMapping",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )

export_instruments(self, format_type='json') inherited

Export the Instruments of the Project

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| format_type | Literal['json', 'csv', 'xml', 'df'] | Response return format | 'json' |

Returns:

| Type | Description |
| --- | --- |
| Union[List[Dict[str, Any]], str, pandas.DataFrame] | List of Instruments |

Examples:

>>> proj.export_instruments()
[{'instrument_name': 'form_1', 'instrument_label': 'Form 1'}]
Source code in redcap/project.py
def export_instruments(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
):
    """
    Export the Instruments of the Project

    Args:
        format_type:
            Response return format

    Returns:
        Union[List[Dict[str, Any]], str, pandas.DataFrame]: List of Instruments

    Examples:
        >>> proj.export_instruments()
        [{'instrument_name': 'form_1', 'instrument_label': 'Form 1'}]
    """
    payload = self._initialize_payload(
        content="instrument", format_type=format_type
    )
    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="instrument",
        format_type=format_type,
    )

export_logging(self, format_type='json', return_format_type=None, log_type=None, user=None, record=None, dag=None, begin_time=None, end_time=None, df_kwargs=None) inherited

Export the project's logs

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| format_type | Literal['json', 'csv', 'xml', 'df'] | Return the metadata in native objects, csv or xml. 'df' will return a pandas.DataFrame | 'json' |
| return_format_type | Optional[Literal['json', 'csv', 'xml']] | Response format. By default, response will be json-decoded. | None |
| log_type | Optional[Literal['export', 'manage', 'user', 'record', 'record_add', 'record_edit', 'record_delete', 'lock_record', 'page_view']] | Filter by specific event types | None |
| user | Optional[str] | Filter by events created by a certain user | None |
| record | Optional[str] | Filter by events created for a certain record | None |
| dag | Optional[str] | Filter by events created by a certain data access group (group ID) | None |
| begin_time | Optional[datetime.datetime] | Filter by events created after a given timestamp | None |
| end_time | Optional[datetime.datetime] | Filter by events created before a given timestamp | None |
| df_kwargs | Optional[Dict[str, Any]] | Passed to pandas.read_csv to control construction of returned DataFrame. | None |

Returns:

| Type | Description |
| --- | --- |
| Union[str, List[Dict[str, Any]], "pd.DataFrame"] | List of all changes made to this project, including data exports, data changes, and the creation or deletion of users |

Examples:

>>> proj.export_logging()
[{'timestamp': ..., 'username': ..., 'action': 'Manage/Design ',
'details': 'Create project ...'}, ...]
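The timestamp filters take datetime objects. A sketch that pulls only record-related events created after an arbitrary date:

>>> from datetime import datetime
>>> record_logs = proj.export_logging(
...     log_type="record",
...     begin_time=datetime(2023, 1, 1),
... )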
Source code in redcap/project.py
def export_logging(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    return_format_type: Optional[Literal["json", "csv", "xml"]] = None,
    log_type: Optional[
        Literal[
            "export",
            "manage",
            "user",
            "record",
            "record_add",
            "record_edit",
            "record_delete",
            "lock_record",
            "page_view",
        ]
    ] = None,
    user: Optional[str] = None,
    record: Optional[str] = None,
    dag: Optional[str] = None,
    begin_time: Optional[datetime] = None,
    end_time: Optional[datetime] = None,
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    """
    Export the project's logs

    Args:
        format_type:
            Return the metadata in native objects, csv or xml.
            `'df'` will return a `pandas.DataFrame`
        return_format_type:
            Response format. By default, response will be json-decoded.
        log_type:
            Filter by specific event types
        user:
            Filter by events created by a certain user
        record:
            Filter by events created for a certain record
        dag:
            Filter by events created by a certain data access group (group ID)
        begin_time:
            Filter by events created after a given timestamp
        end_time:
            Filter by events created before a given timestamp
        df_kwargs:
            Passed to `pandas.read_csv` to control construction of
            returned DataFrame.
    Returns:
        Union[str, List[Dict[str, Any]], "pd.DataFrame"]:
            List of all changes made to this project, including data exports,
            data changes, and the creation or deletion of users

    Examples:
        >>> proj.export_logging()
        [{'timestamp': ..., 'username': ..., 'action': 'Manage/Design ',
        'details': 'Create project ...'}, ...]
    """
    payload: Dict[str, Any] = self._initialize_payload(
        content="log", format_type=format_type
    )
    optional_args = [
        ("returnFormat", return_format_type),
        ("logtype", log_type),
        ("user", user),
        ("record", record),
        ("dag", dag),
        ("beginTime", begin_time),
        ("endTime", end_time),
    ]

    for arg in optional_args:
        arg_name, arg_value = arg
        if arg_value:
            if arg_name in ["beginTime", "endTime"]:
                arg_value = cast(datetime, arg_value)
                arg_value = arg_value.strftime("%Y-%m-%d %H:%M:%S")

            payload[arg_name] = arg_value

    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="log",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )
    # pylint: enable=too-many-locals

export_metadata(self, format_type='json', fields=None, forms=None, df_kwargs=None) inherited

Export the project's metadata

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| format_type | Literal['json', 'csv', 'xml', 'df'] | Return the metadata in native objects, csv, or xml. 'df' will return a pandas.DataFrame | 'json' |
| fields | Optional[List[str]] | Limit exported metadata to these fields | None |
| forms | Optional[List[str]] | Limit exported metadata to these forms | None |
| df_kwargs | Optional[Dict[str, Any]] | Passed to pandas.read_csv to control construction of returned DataFrame. By default {'index_col': 'field_name'} | None |

Returns:

| Type | Description |
| --- | --- |
| Union[str, List[Dict], pd.DataFrame] | Metadata structure for the project. |

Examples:

>>> proj.export_metadata(format_type="df")
               form_name  section_header  ... matrix_ranking field_annotation
field_name                                ...
record_id         form_1             NaN  ...            NaN              NaN
field_1           form_1             NaN  ...            NaN              NaN
checkbox_field    form_1             NaN  ...            NaN              NaN
upload_field      form_1             NaN  ...            NaN              NaN
...
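The fields and forms arguments limit the export. A sketch restricted to the single form shown in the export_instruments example:

>>> form_1_metadata = proj.export_metadata(forms=["form_1"])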
Source code in redcap/project.py
def export_metadata(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    fields: Optional[List[str]] = None,
    forms: Optional[List[str]] = None,
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    """
    Export the project's metadata

    Args:
        format_type:
            Return the metadata in native objects, csv, or xml.
            `'df'` will return a `pandas.DataFrame`
        fields: Limit exported metadata to these fields
        forms: Limit exported metadata to these forms
        df_kwargs:
            Passed to `pandas.read_csv` to control construction of
            returned DataFrame.
            By default `{'index_col': 'field_name'}`

    Returns:
        Union[str, List[Dict], pd.DataFrame]: Metadata structure for the project.

    Examples:
        >>> proj.export_metadata(format_type="df")
                       form_name  section_header  ... matrix_ranking field_annotation
        field_name                                ...
        record_id         form_1             NaN  ...            NaN              NaN
        field_1           form_1             NaN  ...            NaN              NaN
        checkbox_field    form_1             NaN  ...            NaN              NaN
        upload_field      form_1             NaN  ...            NaN              NaN
        ...
    """
    payload = self._initialize_payload(content="metadata", format_type=format_type)
    to_add = [fields, forms]
    str_add = ["fields", "forms"]
    for key, data in zip(str_add, to_add):
        if data:
            for i, value in enumerate(data):
                payload[f"{key}[{i}]"] = value

    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="metadata",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )

export_pdf(self, record=None, event=None, instrument=None, repeat_instance=None, all_records=None, compact_display=None) inherited

Export PDF file of instruments, either as blank or with data

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| record | Optional[str] | Record ID | None |
| event | Optional[str] | For longitudinal projects, the unique event name | None |
| instrument | Optional[str] | Unique instrument name | None |
| repeat_instance | Optional[int] | (Only for projects with repeating instruments/events) The repeat instance number of the repeating event (if longitudinal) or the repeating instrument (if classic or longitudinal). | None |
| all_records | Optional[bool] | If True, then all records will be exported as a single PDF file. Note: If this is True, then record, event, and instrument parameters are all ignored. | None |
| compact_display | Optional[bool] | If True, then the PDF will be exported in compact display mode. | None |

Returns:

| Type | Description |
| --- | --- |
| Tuple[bytes, dict] | Content of the file and dictionary of useful metadata |

Examples:

>>> proj.export_pdf()
(b'%PDF-1.3\n3 0 obj\n..., {...})
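As with export_file, the returned bytes can simply be written to disk. A sketch that saves one record's instrument (the record, instrument, and event values are taken from earlier examples; the output file name is arbitrary):

>>> pdf_bytes, _headers = proj.export_pdf(
...     record="1", instrument="form_1", event="event_1_arm_1"
... )
>>> with open("record_1_form_1.pdf", "wb") as pdf_file:
...     _ = pdf_file.write(pdf_bytes)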
Source code in redcap/project.py
def export_pdf(
    self,
    record: Optional[str] = None,
    event: Optional[str] = None,
    instrument: Optional[str] = None,
    repeat_instance: Optional[int] = None,
    all_records: Optional[bool] = None,
    compact_display: Optional[bool] = None,
) -> FileMap:
    """
    Export PDF file of instruments, either as blank or with data

    Args:
        record: Record ID
        event: For longitudinal projects, the unique event name
        instrument: Unique instrument name
        repeat_instance:
            (Only for projects with repeating instruments/events)
            The repeat instance number of the repeating event (if longitudinal)
            or the repeating instrument (if classic or longitudinal).
        all_records:
            If True, then all records will be exported as a single PDF file.
            Note: If this is True, then record, event, and instrument parameters
                  are all ignored.
        compact_display:
            If True, then the PDF will be exported in compact display mode.

    Returns:
        Content of the file and dictionary of useful metadata

    Examples:
        >>> proj.export_pdf()
        (b'%PDF-1.3\\n3 0 obj\\n..., {...})
    """
    # load up payload
    payload = self._initialize_payload(content="pdf", return_format_type="json")
    keys_to_add = (
        record,
        event,
        instrument,
        repeat_instance,
        all_records,
        compact_display,
    )
    str_keys = (
        "record",
        "event",
        "instrument",
        "repeat_instance",
        "allRecords",
        "compactDisplay",
    )
    for key, data in zip(str_keys, keys_to_add):
        data = cast(str, data)
        if data:
            payload[key] = data
    payload["action"] = "export"

    content, headers = cast(
        FileMap, self._call_api(payload=payload, return_type="file_map")
    )
    # REDCap adds some useful things in content-type
    content_map = {}
    if "content-type" in headers:
        splat = [
            key_values.strip() for key_values in headers["content-type"].split(";")
        ]
        key_values = [
            (key_values.split("=")[0], key_values.split("=")[1].replace('"', ""))
            for key_values in splat
            if "=" in key_values
        ]
        content_map = dict(key_values)

    return content, content_map

export_project_info(self, format_type='json', df_kwargs=None) inherited

Export Project Information

Parameters:

Name Type Description Default
format_type Literal['json', 'csv', 'xml', 'df']

Format of returned data

'json'
df_kwargs Optional[Dict[str, Any]]

Passed to pandas.read_csv to control construction of returned DataFrame. By default, nothing

None

Returns:

Type Description
Union[str, List[Dict[str, Any]], pandas.DataFrame]

Project information

Examples:

>>> proj.export_project_info()
{'project_id': ...
...
'in_production': 0,
'project_language': 'English',
'purpose': 0,
'purpose_other': '',
...
'project_grant_number': '',
'project_pi_firstname': '',
'project_pi_lastname': '',
...
 'bypass_branching_erase_field_prompt': 0}
Source code in redcap/project.py
def export_project_info(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    """
    Export Project Information

    Args:
        format_type: Format of returned data
        df_kwargs:
            Passed to `pandas.read_csv` to control construction of
            returned DataFrame. By default, nothing

    Returns:
        Union[str, List[Dict[str, Any]], pandas.DataFrame]: Project information

    Examples:
        >>> proj.export_project_info()
        {'project_id': ...
        ...
        'in_production': 0,
        'project_language': 'English',
        'purpose': 0,
        'purpose_other': '',
        ...
        'project_grant_number': '',
        'project_pi_firstname': '',
        'project_pi_lastname': '',
        ...
         'bypass_branching_erase_field_prompt': 0}
    """

    payload = self._initialize_payload(content="project", format_type=format_type)
    return_type = self._lookup_return_type(format_type, request_type="export")

    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="project",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )

export_records(self, format_type='json', records=None, fields=None, forms=None, events=None, raw_or_label='raw', raw_or_label_headers='raw', event_name='label', record_type='flat', export_survey_fields=False, export_data_access_groups=False, export_checkbox_labels=False, filter_logic=None, date_begin=None, date_end=None, decimal_character=None, export_blank_for_gray_form_status=None, df_kwargs=None) inherited

Export data from the REDCap project.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| format_type | Literal['json', 'csv', 'xml', 'df'] | Format of returned data. 'json' returns json-decoded objects while 'csv' and 'xml' return other formats. 'df' will attempt to return a pandas.DataFrame | 'json' |
| records | Optional[List[str]] | Array of record names specifying specific records to export. By default, all records are exported | None |
| fields | Union[List[str], str] | Single field name or array of field names specifying specific fields to pull. By default, all fields are exported | None |
| forms | Union[List[str], str] | Single form name or array of form names to export. If, in the web UI, the form name has a space in it, replace the space with an underscore. By default, all forms are exported | None |
| events | Optional[List[str]] | An array of unique event names from which to export records. Note: this only applies to longitudinal projects | None |
| raw_or_label | Literal['raw', 'label', 'both'] | Export the raw coded values or labels for the options of multiple choice fields, or both | 'raw' |
| raw_or_label_headers | Literal['raw', 'label'] | Export the column names of the instrument as their raw value or their labeled value | 'raw' |
| event_name | Literal['label', 'unique'] | Export the unique event name or the event label | 'label' |
| record_type | Literal['flat', 'eav'] | Database output structure type | 'flat' |
| export_survey_fields | bool | Specifies whether or not to export the survey identifier field (e.g., "redcap_survey_identifier") or survey timestamp fields (e.g., form_name+"_timestamp") when surveys are utilized in the project | False |
| export_data_access_groups | bool | Specifies whether or not to export the "redcap_data_access_group" field when data access groups are utilized in the project. Note: this flag is only viable if the user whose token is being used to make the API request is not in a data access group. If the user is in a group, then this flag will revert to its default value. | False |
| export_checkbox_labels | bool | Specify whether to export checkbox values as their label on export. | False |
| filter_logic | Optional[str] | Filter which records are returned using REDCap conditional syntax | None |
| date_begin | Optional[datetime.datetime] | Filter on records created after a date | None |
| date_end | Optional[datetime.datetime] | Filter on records created before a date | None |
| decimal_character | Optional[Literal[',', '.']] | Force all numbers into same decimal format | None |
| export_blank_for_gray_form_status | Optional[bool] | Whether or not to export blank values for instrument complete status fields that have a gray status icon | None |
| df_kwargs | Optional[Dict[str, Any]] | Passed to pandas.read_csv to control construction of returned DataFrame. By default, {'index_col': self.def_field} | None |

Returns:

| Type | Description |
| --- | --- |
| Union[List[Dict[str, Any]], str, pd.DataFrame] | Exported data |

Examples:

>>> proj.export_records()
[{'record_id': '1', 'redcap_event_name': 'event_1_arm_1', 'redcap_repeat_instrument': '',
'redcap_repeat_instance': 1, 'field_1': '1',
'checkbox_field___1': '0', 'checkbox_field___2': '1', 'upload_field': 'test_upload.txt',
'form_1_complete': '2'},
{'record_id': '2', 'redcap_event_name': 'event_1_arm_1', 'redcap_repeat_instrument': '',
'redcap_repeat_instance': 1, 'field_1': '0',
'checkbox_field___1': '0', 'checkbox_field___2': '0', 'upload_field': 'myupload.txt',
'form_1_complete': '0'}]
>>> proj.export_records(filter_logic="[field_1] = 1")
[{'record_id': '1', 'redcap_event_name': 'event_1_arm_1', 'redcap_repeat_instrument': '',
'redcap_repeat_instance': 1, 'field_1': '1',
'checkbox_field___1': '0', 'checkbox_field___2': '1', 'upload_field': 'test_upload.txt',
'form_1_complete': '2'}]
>>> proj.export_records(
...     format_type="csv",
...     fields=["field_1", "checkbox_field"],
...     raw_or_label="label"
... )
'record_id,redcap_event_name,redcap_repeat_instrument,redcap_repeat_instance,field_1,checkbox_field___1,checkbox_field___2\n1,"Event 1",,1,Yes,Unchecked,Checked\n2,"Event 1",,1,No,Unchecked,Unchecked\n'
>>> import pandas as pd
>>> pd.set_option("display.max_columns", 3)
>>> proj.export_records(format_type="df")
                             redcap_repeat_instrument  ...  form_1_complete
record_id redcap_event_name                            ...
1         event_1_arm_1                           NaN  ...                2
2         event_1_arm_1                           NaN  ...                0
...
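Filters can be combined. A sketch that narrows the export to selected records, a single form, and records created after an arbitrary date:

>>> from datetime import datetime
>>> subset = proj.export_records(
...     records=["1", "2"],
...     forms=["form_1"],
...     date_begin=datetime(2023, 1, 1),
... )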
Source code in redcap/project.py
def export_records(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    records: Optional[List[str]] = None,
    fields: Optional[Union[List[str], str]] = None,
    forms: Optional[Union[List[str], str]] = None,
    events: Optional[List[str]] = None,
    raw_or_label: Literal["raw", "label", "both"] = "raw",
    raw_or_label_headers: Literal["raw", "label"] = "raw",
    event_name: Literal["label", "unique"] = "label",
    record_type: Literal["flat", "eav"] = "flat",
    export_survey_fields: bool = False,
    export_data_access_groups: bool = False,
    export_checkbox_labels: bool = False,
    filter_logic: Optional[str] = None,
    date_begin: Optional[datetime] = None,
    date_end: Optional[datetime] = None,
    decimal_character: Optional[Literal[",", "."]] = None,
    export_blank_for_gray_form_status: Optional[bool] = None,
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    # pylint: disable=line-too-long
    r"""
    Export data from the REDCap project.

    Args:
        format_type:
            Format of returned data. `'json'` returns json-decoded
            objects while `'csv'` and `'xml'` return other formats.
            `'df'` will attempt to return a `pandas.DataFrame`
        records:
            Array of record names specifying specific records to export.
            By default, all records are exported
        fields:
            Single field name or array of field names specifying specific
            fields to pull.
            By default, all fields are exported
        forms:
            Single form name or array of form names to export. If in the
            web UI, the form name has a space in it, replace the space
            with an underscore.
            By default, all forms are exported
        events:
            An array of unique event names from which to export records
            Note:
                This only applies to longitudinal projects
        raw_or_label:
            Export the raw coded values or labels for the options of
            multiple choice fields, or both
        raw_or_label_headers:
            Export the column names of the instrument as their raw
            value or their labeled value
        event_name:
            Export the unique event name or the event label
        record_type:
            Database output structure type
        export_survey_fields:
            Specifies whether or not to export the survey identifier
            field (e.g., "redcap_survey_identifier") or survey timestamp
            fields (e.g., form_name+"_timestamp") when surveys are
            utilized in the project
        export_data_access_groups:
            Specifies whether or not to export the
            `"redcap_data_access_group"` field when data access groups
            are utilized in the project

            Note:
                This flag is only viable if the user whose token is
                being used to make the API request is *not* in a data
                access group. If the user is in a group, then this flag
                will revert to its default value.
        export_checkbox_labels:
            Specify whether to export checkbox values as their label on
            export.
        filter_logic:
            Filter which records are returned using REDCap conditional syntax
        date_begin:
            Filter on records created after a date
        date_end:
            Filter on records created before a date
        decimal_character:
            Force all numbers into the same decimal format
        export_blank_for_gray_form_status:
            Whether or not to export blank values for instrument complete status fields
            that have a gray status icon
        df_kwargs:
            Passed to `pandas.read_csv` to control construction of
            returned DataFrame.
            By default, `{'index_col': self.def_field}`
    Returns:
        Union[List[Dict[str, Any]], str, pd.DataFrame]: Exported data

    Examples:
        >>> proj.export_records()
        [{'record_id': '1', 'redcap_event_name': 'event_1_arm_1', 'redcap_repeat_instrument': '',
        'redcap_repeat_instance': 1, 'field_1': '1',
        'checkbox_field___1': '0', 'checkbox_field___2': '1', 'upload_field': 'test_upload.txt',
        'form_1_complete': '2'},
        {'record_id': '2', 'redcap_event_name': 'event_1_arm_1', 'redcap_repeat_instrument': '',
        'redcap_repeat_instance': 1, 'field_1': '0',
        'checkbox_field___1': '0', 'checkbox_field___2': '0', 'upload_field': 'myupload.txt',
        'form_1_complete': '0'}]

        >>> proj.export_records(filter_logic="[field_1] = 1")
        [{'record_id': '1', 'redcap_event_name': 'event_1_arm_1', 'redcap_repeat_instrument': '',
        'redcap_repeat_instance': 1, 'field_1': '1',
        'checkbox_field___1': '0', 'checkbox_field___2': '1', 'upload_field': 'test_upload.txt',
        'form_1_complete': '2'}]

        >>> proj.export_records(
        ...     format_type="csv",
        ...     fields=["field_1", "checkbox_field"],
        ...     raw_or_label="label"
        ... )
        'record_id,redcap_event_name,redcap_repeat_instrument,redcap_repeat_instance,field_1,checkbox_field___1,checkbox_field___2\n1,"Event 1",,1,Yes,Unchecked,Checked\n2,"Event 1",,1,No,Unchecked,Unchecked\n'

        >>> import pandas as pd
        >>> pd.set_option("display.max_columns", 3)
        >>> proj.export_records(format_type="df")
                                     redcap_repeat_instrument  ...  form_1_complete
        record_id redcap_event_name                            ...
        1         event_1_arm_1                           NaN  ...                2
        2         event_1_arm_1                           NaN  ...                0
        ...
    """
    # pylint: enable=line-too-long
    payload: Dict[str, Any] = self._initialize_payload(
        content="record", format_type=format_type, record_type=record_type
    )

    if isinstance(fields, str):
        fields = [fields]

    if isinstance(forms, str):
        forms = [forms]

    fields = self._backfill_fields(fields, forms)

    keys_to_add = (
        records,
        fields,
        forms,
        events,
        raw_or_label,
        raw_or_label_headers,
        event_name,
        export_survey_fields,
        export_data_access_groups,
        export_checkbox_labels,
        filter_logic,
        decimal_character,
        export_blank_for_gray_form_status,
    )

    str_keys = (
        "records",
        "fields",
        "forms",
        "events",
        "rawOrLabel",
        "rawOrLabelHeaders",
        "eventName",
        "exportSurveyFields",
        "exportDataAccessGroups",
        "exportCheckboxLabel",
        "filterLogic",
        "decimalCharacter",
        "exportBlankForGrayFormStatus",
    )

    for key, data in zip(str_keys, keys_to_add):
        if data:
            if key in ("fields", "records", "forms", "events"):
                data = cast(List[str], data)
                for i, value in enumerate(data):
                    payload[f"{ key }[{ i }]"] = value
            else:
                payload[key] = data

    if date_begin:
        payload["dateRangeBegin"] = date_begin.strftime("%Y-%m-%d %H:%M:%S")

    if date_end:
        payload["dateRangeEnd"] = date_end.strftime("%Y-%m-%d %H:%M:%S")

    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="record",
        format_type=format_type,
        df_kwargs=df_kwargs,
        record_type=record_type,
    )
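
The keyword arguments above combine freely. The sketch below is not part of the library docs: it assumes an existing proj Project instance and reuses the hypothetical form_1 and field_1 names from the doctests, pulling one labeled form into a DataFrame restricted to a date window and a filter expression.

from datetime import datetime

# Sketch only: `proj`, "form_1" and "field_1" are assumed to exist,
# mirroring the doctest fixtures above.
records_df = proj.export_records(
    format_type="df",                      # return a pandas.DataFrame
    forms=["form_1"],                      # restrict to a single instrument
    raw_or_label="label",                  # labels instead of raw codes
    filter_logic="[field_1] = 1",          # REDCap conditional syntax
    date_begin=datetime(2023, 1, 1),       # records created after this date
    date_end=datetime(2023, 12, 31),       # records created before this date
    df_kwargs={"index_col": "record_id"},  # override the default index column
)
print(records_df.head())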

export_repeating_instruments_events(self, format_type='json', df_kwargs=None) inherited

Export the project's repeating instruments and events settings

Parameters:

Name Type Description Default
format_type Literal['json', 'csv', 'xml', 'df']

Return the repeating instruments and events in native objects, csv or xml; 'df' will return a pandas.DataFrame

'json'
df_kwargs Optional[Dict[str, Any]]

Passed to pandas.read_csv to control construction of returned DataFrame

None

Returns:

Type Description
Union[str, List[Dict[str, Any]], pd.DataFrame]

Repeating instruments and events for the project

Examples:

>>> proj.export_repeating_instruments_events()
[{'event_name': 'event_1_arm_1', 'form_name': '', 'custom_form_label': ''}]
Source code in redcap/project.py
def export_repeating_instruments_events(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    """
    Export the project's repeating instruments and events settings

    Args:
        format_type:
            Return the repeating instruments and events in native objects,
            csv or xml, `'df'` will return a `pandas.DataFrame`
        df_kwargs:
            Passed to pandas.read_csv to control construction of
            returned DataFrame

    Returns:
        Union[str, List[Dict[str, Any]], pd.DataFrame]: Repeating instruments and events
         for the project

    Examples:
        >>> proj.export_repeating_instruments_events()
        [{'event_name': 'event_1_arm_1', 'form_name': '', 'custom_form_label': ''}]
    """
    payload = self._initialize_payload(
        content="repeatingFormsEvents", format_type=format_type
    )

    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="repeatingFormsEvents",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )

export_report(self, report_id, format_type='json', raw_or_label='raw', raw_or_label_headers='raw', export_checkbox_labels=False, csv_delimiter=',', df_kwargs=None) inherited

Export a report of the Project

Parameters:

Name Type Description Default
report_id str

The report ID number provided next to the report name on the report list page

required
format_type Literal['json', 'csv', 'xml', 'df']

Format of returned data. 'json' returns json-decoded objects while 'csv' and 'xml' return strings. 'df' will attempt to return a pandas.DataFrame.

'json'
raw_or_label Literal['raw', 'label']

Export the raw coded values or labels for the options of multiple choice fields

'raw'
raw_or_label_headers Literal['raw', 'label']

For the CSV headers, export the variable/field names (raw) or the field labels (label)

'raw'
export_checkbox_labels bool

Specifies the format of checkbox field values specifically when exporting the data as labels (i.e. when rawOrLabel=label). When exporting labels, by default (without providing the exportCheckboxLabel flag or if exportCheckboxLabel=false), all checkboxes will either have a value 'Checked' if they are checked or 'Unchecked' if not checked. But if exportCheckboxLabel is set to true, it will instead export the checkbox value as the checkbox option's label (e.g., 'Choice 1') if checked or it will be blank/empty (no value) if not checked

False
csv_delimiter Literal[',', 'tab', ';', '|', '^']

For the csv format, choose the delimiter used to separate values.

','

Exceptions:

Type Description
ValueError

Unsupported format specified

Returns:

Type Description
Union[List[Dict[str, Any]], str, pd.DataFrame]

Data from the report ordered by the record (primary key of project) and then by event id

Examples:

>>> proj.export_report(report_id="4292")
[{'record_id': '1', 'redcap_event_name': 'event_1_arm_1',
'checkbox_field___1': '0', 'checkbox_field___2': '1'}]
Source code in redcap/project.py
def export_report(
    self,
    report_id: str,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    raw_or_label: Literal["raw", "label"] = "raw",
    raw_or_label_headers: Literal["raw", "label"] = "raw",
    export_checkbox_labels: bool = False,
    csv_delimiter: Literal[",", "tab", ";", "|", "^"] = ",",
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    """
    Export a report of the Project

    Args:
        report_id:
            The report ID number provided next to the report name
            on the report list page
        format_type:
            Format of returned data. `'json'` returns json-decoded
            objects while `'csv'` and `'xml'` return strings.
            `'df'` will attempt to return a `pandas.DataFrame`.
        raw_or_label:
            Export the raw coded values or
            labels for the options of multiple choice fields
        raw_or_label_headers:
            For the CSV headers, export the variable/field names
            (raw) or the field labels (label)
        export_checkbox_labels:
            Specifies the format of
            checkbox field values specifically when exporting the data as labels
            (i.e. when `rawOrLabel=label`). When exporting labels, by default
            (without providing the exportCheckboxLabel flag or if
            exportCheckboxLabel=false), all checkboxes will either have a value
            'Checked' if they are checked or 'Unchecked' if not checked.
            But if exportCheckboxLabel is set to true, it will instead export
            the checkbox value as the checkbox option's label (e.g., 'Choice 1')
            if checked or it will be blank/empty (no value) if not checked
        csv_delimiter:
            For the csv format, choose the delimiter used to separate values.

    Raises:
        ValueError: Unsupported format specified

    Returns:
        Union[List[Dict[str, Any]], str, pd.DataFrame]: Data from the report ordered by
        the record (primary key of project) and then by event id

    Examples:
        >>> proj.export_report(report_id="4292") # doctest: +SKIP
        [{'record_id': '1', 'redcap_event_name': 'event_1_arm_1',
        'checkbox_field___1': '0', 'checkbox_field___2': '1'}]
    """
    payload = self._initialize_payload(content="report", format_type=format_type)
    keys_to_add = (
        report_id,
        raw_or_label,
        raw_or_label_headers,
        export_checkbox_labels,
        csv_delimiter,
    )
    str_keys = (
        "report_id",
        "rawOrLabel",
        "rawOrLabelHeaders",
        "exportCheckboxLabel",
        "csvDelimiter",
    )
    for key, data in zip(str_keys, keys_to_add):
        data = cast(str, data)
        if data:
            payload[key] = data

    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="report",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )
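
As with record exports, reports can be pulled straight into a DataFrame. A sketch, assuming an existing proj instance; the report ID is a placeholder you would replace with one from your project's report list page.

REPORT_ID = "4292"  # placeholder; use the ID shown next to the report name in the web UI

report_df = proj.export_report(
    report_id=REPORT_ID,
    format_type="df",
    raw_or_label="label",          # labels for multiple choice fields
    raw_or_label_headers="label",  # field labels as column headers
)
print(report_df.columns.tolist())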

export_survey_participant_list(self, instrument, format_type='json', event=None, df_kwargs=None) inherited

Export the Survey Participant List

!!! note The passed instrument must be set up as a survey instrument.

Parameters:

Name Type Description Default
instrument str

Name of instrument as seen in the Data Dictionary (metadata).

required
format_type Literal['json', 'csv', 'xml', 'df']

Format of returned data

'json'
event Optional[str]

Unique event name, only used in longitudinal projects

None
df_kwargs Optional[Dict[str, Any]]

Passed to pandas.read_csv to control construction of returned DataFrame. By default, nothing

None

Returns:

Type Description
Union[List[Dict[str, Any]], str, pandas.DataFrame]

List of survey participants, along with other useful metadata such as the record, response status, etc.

Examples:

>>> proj.export_survey_participant_list(instrument="form_1", event="event_1_arm_1")
[{'email': '',
...
'survey_access_code': ...},
{'email': '',
...
'survey_access_code': ...}]
Source code in redcap/project.py
def export_survey_participant_list(
    self,
    instrument: str,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    event: Optional[str] = None,
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    """
    Export the Survey Participant List

    Note:
        The passed instrument must be set up as a survey instrument.

    Args:
        instrument:
            Name of instrument as seen in the Data Dictionary (metadata).
        format_type:
            Format of returned data
        event:
            Unique event name, only used in longitudinal projects
        df_kwargs:
            Passed to `pandas.read_csv` to control construction of
            returned DataFrame. By default, nothing

    Returns:
        Union[List[Dict[str, Any]], str, pandas.DataFrame]:
            List of survey participants,
            along with other useful
            metadata such as the record, response status, etc.

    Examples:
        >>> proj.export_survey_participant_list(instrument="form_1", event="event_1_arm_1")
        [{'email': '',
        ...
        'survey_access_code': ...},
        {'email': '',
        ...
        'survey_access_code': ...}]
    """
    payload = self._initialize_payload(
        content="participantList", format_type=format_type
    )
    payload["instrument"] = instrument
    if event:
        payload["event"] = event

    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="participantList",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )
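
For example, to collect the participant email addresses on file for a survey, a sketch assuming form_1 is enabled as a survey and, for longitudinal projects, that event_1_arm_1 exists, as in the doctest above:

participants = proj.export_survey_participant_list(
    instrument="form_1",    # must be set up as a survey instrument
    event="event_1_arm_1",  # only needed for longitudinal projects
)

# Each entry includes keys such as 'email' and 'survey_access_code'
emails = [p["email"] for p in participants if p["email"]]
print(f"{len(emails)} participants have an email address on file")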

export_user_dag_assignment(self, format_type='json', df_kwargs=None) inherited

Export the User-DAG assignment of the Project

Parameters:

Name Type Description Default
format_type Literal['json', 'csv', 'xml', 'df']

Response return format

'json'
df_kwargs Optional[Dict[str, Any]]

Passed to pandas.read_csv to control construction of returned DataFrame. By default, nothing

None

Returns:

Type Description
Union[List[Dict[str, Any]], str, pandas.DataFrame]

List of User-DAGs assignments

Examples:

>>> proj.export_user_dag_assignment()
[{'username': ..., 'redcap_data_access_group': ''}]
Source code in redcap/project.py
def export_user_dag_assignment(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    """
    Export the User-DAG assignment of the Project

    Args:
        format_type:
            Response return format
        df_kwargs:
            Passed to `pandas.read_csv` to control construction of
            returned DataFrame. By default, nothing

    Returns:
        Union[List[Dict[str, Any]], str, pandas.DataFrame]:
            List of User-DAGs assignments

    Examples:
        >>> proj.export_user_dag_assignment()
        [{'username': ..., 'redcap_data_access_group': ''}]
    """
    payload = self._initialize_payload(
        content="userDagMapping", format_type=format_type
    )
    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="userDagMapping",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )

export_user_role_assignment(self, format_type='json', df_kwargs=None) inherited

Export the User-Role assignments of the Project

Parameters:

Name Type Description Default
format_type Literal['json', 'csv', 'xml', 'df']

Response return format

'json'
df_kwargs Optional[Dict[str, Any]]

Passed to pandas.read_csv to control construction of returned DataFrame. By default, nothing

None

Returns:

Type Description
Union[List[Dict[str, Any]], str, pandas.DataFrame]

List of user-role assignments

Examples:

>>> proj.export_user_role_assignment()
[{'username': ..., 'unique_role_name': '', 'data_access_group': ''}]
Source code in redcap/project.py
def export_user_role_assignment(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    """
    Export the User-Role assignments of the Project

    Args:
        format_type:
            Response return format
        df_kwargs:
            Passed to `pandas.read_csv` to control construction of
            returned DataFrame. By default, nothing

    Returns:
        Union[List[Dict[str, Any]], str, pandas.DataFrame]:
            List of user-role assignments

    Examples:
        >>> proj.export_user_role_assignment()
        [{'username': ..., 'unique_role_name': '', 'data_access_group': ''}]
    """
    payload = self._initialize_payload(
        content="userRoleMapping", format_type=format_type
    )
    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="userRoleMapping",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )

export_user_roles(self, format_type='json', df_kwargs=None) inherited

Export the user roles of the Project

Parameters:

Name Type Description Default
format_type Literal['json', 'csv', 'xml', 'df']

Response return format

'json'
df_kwargs Optional[Dict[str, Any]]

Passed to pandas.read_csv to control construction of returned DataFrame. By default, nothing

None

Returns:

Type Description
Union[List[Dict[str, Any]], str, pandas.DataFrame]

List of user roles with assigned user rights

Examples:

>>> proj.export_user_roles()
[{'unique_role_name': ..., 'role_label': 'Test role', 'design': '0', 'alerts': '0',
'user_rights': '0', 'data_access_groups': '0', 'reports': '0', 'stats_and_charts': '0',
'manage_survey_participants': '0', 'calendar': '0', 'data_import_tool': '0',
'data_comparison_tool': '0', 'logging': '0', 'file_repository': '0',
'data_quality_create': '0', 'data_quality_execute': '0', 'api_export': '0',
'api_import': '0', 'mobile_app': '0', 'mobile_app_download_data': '0',
'record_create': '0', 'record_rename': '0', 'record_delete': '0',
'lock_records_customization': '0', 'lock_records': '0', ...,
'forms': {'form_1': 2}, 'forms_export': {'form_1': 0}}]
Source code in redcap/project.py
def export_user_roles(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    """
    Export the user roles of the Project

    Args:
        format_type:
            Response return format
        df_kwargs:
            Passed to `pandas.read_csv` to control construction of
            returned DataFrame. By default, nothing

    Returns:
        Union[List[Dict[str, Any]], str, pandas.DataFrame]:
            List of user roles with assigned user rights

    Examples:
        >>> proj.export_user_roles()
        [{'unique_role_name': ..., 'role_label': 'Test role', 'design': '0', 'alerts': '0',
        'user_rights': '0', 'data_access_groups': '0', 'reports': '0', 'stats_and_charts': '0',
        'manage_survey_participants': '0', 'calendar': '0', 'data_import_tool': '0',
        'data_comparison_tool': '0', 'logging': '0', 'file_repository': '0',
        'data_quality_create': '0', 'data_quality_execute': '0', 'api_export': '0',
        'api_import': '0', 'mobile_app': '0', 'mobile_app_download_data': '0',
        'record_create': '0', 'record_rename': '0', 'record_delete': '0',
        'lock_records_customization': '0', 'lock_records': '0', ...,
        'forms': {'form_1': 2}, 'forms_export': {'form_1': 0}}]
    """
    payload = self._initialize_payload(content="userRole", format_type=format_type)
    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="userRole",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )
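
A quick audit of which roles grant API export rights, as a sketch built only on the keys shown in the example output (rights come back as '0'/'1' strings):

roles = proj.export_user_roles()

# Collect the labels of roles that allow API export
api_export_roles = [role["role_label"] for role in roles if role["api_export"] == "1"]
print("Roles with API export rights:", api_export_roles)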

export_users(self, format_type='json', df_kwargs=None) inherited

Export the users of the Project

Parameters:

Name Type Description Default
format_type Literal['json', 'csv', 'xml', 'df']

Response return format

'json'
df_kwargs Optional[Dict[str, Any]]

Passed to pandas.read_csv to control construction of returned DataFrame. By default, nothing

None

Returns:

Type Description
Union[List[Dict[str, Any]], str, pandas.DataFrame]

List of users with metadata

Examples:

>>> proj.export_users()
[{'username': ..., 'email': ..., 'expiration': '', 'data_access_group': '',
'data_access_group_id': '', 'design': 1, 'alerts': 1, 'user_rights': 1,
'data_access_groups': 1, 'reports': 1, ...}]
Source code in redcap/project.py
def export_users(
    self,
    format_type: Literal["json", "csv", "xml", "df"] = "json",
    df_kwargs: Optional[Dict[str, Any]] = None,
):
    """
    Export the users of the Project

    Args:
        format_type:
            Response return format
        df_kwargs:
            Passed to `pandas.read_csv` to control construction of
            returned DataFrame. By default, nothing

    Returns:
        Union[List[Dict[str, Any]], str, pandas.DataFrame]: List of users with metadata

    Examples:
        >>> proj.export_users()
        [{'username': ..., 'email': ..., 'expiration': '', 'data_access_group': '',
        'data_access_group_id': '', 'design': 1, 'alerts': 1, 'user_rights': 1,
        'data_access_groups': 1, 'reports': 1, ...}]
    """
    payload = self._initialize_payload(content="user", format_type=format_type)
    return_type = self._lookup_return_type(format_type, request_type="export")
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return self._return_data(
        response=response,
        content="user",
        format_type=format_type,
        df_kwargs=df_kwargs,
    )
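
The user-level exports can be joined on username. A sketch, using only the keys shown in the example outputs above, that lists users without a role assignment:

users = proj.export_users()
role_assignments = proj.export_user_role_assignment()

# Map each username to its role; an empty string means no role is assigned
role_by_user = {row["username"]: row["unique_role_name"] for row in role_assignments}

unassigned = [u["username"] for u in users if not role_by_user.get(u["username"])]
print("Users without a role:", unassigned)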

export_version(self) inherited

Get the REDCap version

Returns:

Type Description
Optional[semantic_version.base.Version]

REDCap version running on the url provided

Examples:

>>> import semantic_version
>>> redcap_version = proj.export_version()
>>> assert redcap_version >= semantic_version.Version("12.0.1")
Source code in redcap/project.py
def export_version(self) -> Optional[semantic_version.Version]:
    """
    Get the REDCap version

    Returns:
        REDCap version running on the url provided

    Examples:
        >>> import semantic_version
        >>> redcap_version = proj.export_version()
        >>> assert redcap_version >= semantic_version.Version("12.0.1")
    """
    payload = self._initialize_payload("version")
    resp = None

    redcap_version = self._call_api(payload, return_type="str")

    if semantic_version.validate(redcap_version):
        resp = semantic_version.Version(redcap_version)

    return resp
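
A common pattern is to gate behavior on the server version; a sketch (the 13.0.0 threshold is arbitrary and purely illustrative):

import semantic_version

version = proj.export_version()

# export_version returns None when the server response is not a valid version string
if version is not None and version >= semantic_version.Version("13.0.0"):
    print("Server is recent enough for the newer workflow")
else:
    print(f"Older or unknown REDCap version: {version}")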

generate_next_record_name(self) inherited

Get the next record name

Returns:

Type Description
str

The next record name for a project with auto-numbering records enabled

Examples:

>>> proj.generate_next_record_name()
'3'
Source code in redcap/project.py
def generate_next_record_name(self) -> str:
    """
    Get the next record name

    Returns:
        The next record name for a project with auto-numbering records enabled

    Examples:
        >>> proj.generate_next_record_name()
        '3'
    """
    # Force the csv format here: if the project uses data access groups
    # or non-standard record names, the result will not be JSON-compliant
    payload = self._initialize_payload(
        content="generateNextRecordName", format_type="csv"
    )

    return cast(str, self._call_api(payload, return_type="str"))
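
This pairs naturally with import_records on projects that use record auto-numbering. A sketch, reusing the hypothetical field_1 name from the doctests:

# Ask REDCap for the next available record name, then create a record under it
next_record = proj.generate_next_record_name()
new_record = [{"record_id": next_record, "field_1": "1"}]
print(proj.import_records(new_record))  # e.g. {'count': 1}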

import_arms(self, to_import, return_format_type='json', import_format='json', override=0) inherited

Import Arms into the REDCap Project

!!! note This only works for longitudinal projects.

Parameters:

Name Type Description Default
to_import Union[str, List[Dict[str, Any]], pd.DataFrame]

array of dicts, csv/xml string, pandas.DataFrame Note: If you pass a csv or xml string, you should use the import_format parameter appropriately.

required
return_format_type Literal['json', 'csv', 'xml']

Response format. By default, response will be json-decoded.

'json'
import_format Literal['json', 'csv', 'xml', 'df']

Format of incoming data. By default, to_import will be json-encoded

'json'
override Optional[int]

0 - false [default], 1 - true You may use override=1 as a 'delete all + import' action in order to erase all existing Arms in the project while importing new Arms. If override=0, then you can only add new Arms or rename existing ones.

0

Returns:

Type Description
Union[int, str]

Number of Arms added or updated

Examples:

Create a new arm

>>> new_arm = [{"arm_num": 2, "name": "Arm 2"}]
>>> proj.import_arms(new_arm)
1
Source code in redcap/project.py
def import_arms(
    self,
    to_import: Union[str, List[Dict[str, Any]], "pd.DataFrame"],
    return_format_type: Literal["json", "csv", "xml"] = "json",
    import_format: Literal["json", "csv", "xml", "df"] = "json",
    override: Optional[int] = 0,
):
    """
    Import Arms into the REDCap Project

    Note:
        This only works for longitudinal projects.

    Args:
        to_import: array of dicts, csv/xml string, `pandas.DataFrame`
            Note:
                If you pass a csv or xml string, you should use the
                `import_format` parameter appropriately.
        return_format_type:
            Response format. By default, response will be json-decoded.
        import_format:
            Format of incoming data. By default, to_import will be json-encoded
        override:
            0 - false [default], 1 - true
            You may use override=1 as a 'delete all + import' action in order to
            erase all existing Arms in the project while importing new Arms.
            If override=0, then you can only add new Arms or rename existing ones.

    Returns:
        Union[int, str]: Number of Arms added or updated

    Examples:
        Create a new arm
        >>> new_arm = [{"arm_num": 2, "name": "Arm 2"}]
        >>> proj.import_arms(new_arm)
        1
    """
    payload = self._initialize_import_payload(
        to_import=to_import,
        import_format=import_format,
        return_format_type=return_format_type,
        content="arm",
    )
    payload["action"] = "import"
    payload["override"] = override

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="import"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response

import_dags(self, to_import, return_format_type='json', import_format='json') inherited

Import DAGs into the REDCap Project

!!! note DAGs can be renamed by simply changing the group name (data_access_group_name). DAGs can be created by providing a group name value while leaving the unique group name blank.

Parameters:

Name Type Description Default
to_import Union[str, List[Dict[str, Any]], pd.DataFrame]

array of dicts, csv/xml string, pandas.DataFrame Note: If you pass a csv or xml string, you should use the import_format parameter appropriately.

required
return_format_type Literal['json', 'csv', 'xml']

Response format. By default, response will be json-decoded.

'json'
import_format Literal['json', 'csv', 'xml', 'df']

Format of incoming data. By default, to_import will be json-encoded

'json'

Returns:

Type Description
Union[int, str]

Number of DAGs added or updated

Examples:

Create a new data access group

>>> new_dag = [{"data_access_group_name": "New DAG", "unique_group_name": ""}]
>>> proj.import_dags(new_dag)
1
Source code in redcap/project.py
def import_dags(
    self,
    to_import: Union[str, List[Dict[str, Any]], "pd.DataFrame"],
    return_format_type: Literal["json", "csv", "xml"] = "json",
    import_format: Literal["json", "csv", "xml", "df"] = "json",
):
    """
    Import DAGs into the REDCap Project

    Note:
        DAGs can be renamed by simply changing the group name (data_access_group_name).
        DAGs can be created by providing a group name value while leaving the
        unique group name blank.

    Args:
        to_import: array of dicts, csv/xml string, `pandas.DataFrame`
            Note:
                If you pass a csv or xml string, you should use the
                `import_format` parameter appropriately.
        return_format_type:
            Response format. By default, response will be json-decoded.
        import_format:
            Format of incoming data. By default, to_import will be json-encoded

    Returns:
        Union[int, str]: Number of DAGs added or updated

    Examples:
        Create a new data access group
        >>> new_dag = [{"data_access_group_name": "New DAG", "unique_group_name": ""}]
        >>> proj.import_dags(new_dag)
        1
    """
    payload = self._initialize_import_payload(
        to_import=to_import,
        import_format=import_format,
        return_format_type=return_format_type,
        content="dag",
    )
    payload["action"] = "import"

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="import"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response

import_events(self, to_import, return_format_type='json', import_format='json', override=0) inherited

Import Events into the REDCap Project

!!! note This only works for longitudinal projects.

Parameters:

Name Type Description Default
to_import Union[str, List[Dict[str, Any]], pd.DataFrame]

array of dicts, csv/xml string, pandas.DataFrame Note: If you pass a csv or xml string, you should use the import_format parameter appropriately.

required
return_format_type Literal['json', 'csv', 'xml']

Response format. By default, response will be json-decoded.

'json'
import_format Literal['json', 'csv', 'xml', 'df']

Format of incoming data. By default, to_import will be json-encoded

'json'
override Optional[int]

0 - false [default], 1 - true You may use override=1 as a 'delete all + import' action in order to erase all existing Events in the project while importing new Events. If override=0, then you can only add new Events or rename existing ones.

0

Returns:

Type Description
Union[int, str]

Number of Events added or updated

Examples:

Create a new event

>>> new_event = [{"event_name": "Event 2", "arm_num": "1"}]
>>> proj.import_events(new_event)
1
Source code in redcap/project.py
def import_events(
    self,
    to_import: Union[str, List[Dict[str, Any]], "pd.DataFrame"],
    return_format_type: Literal["json", "csv", "xml"] = "json",
    import_format: Literal["json", "csv", "xml", "df"] = "json",
    override: Optional[int] = 0,
):
    """
    Import Events into the REDCap Project

    Note:
        This only works for longitudinal projects.

    Args:
        to_import: array of dicts, csv/xml string, `pandas.DataFrame`
            Note:
                If you pass a csv or xml string, you should use the
                `import_format` parameter appropriately.
        return_format_type:
            Response format. By default, response will be json-decoded.
        import_format:
            Format of incoming data. By default, to_import will be json-encoded
        override:
            0 - false [default], 1 - true
            You may use override=1 as a 'delete all + import' action in order to
            erase all existing Events in the project while importing new Events.
            If override=0, then you can only add new Events or rename existing ones.

    Returns:
        Union[int, str]: Number of Events added or updated

    Examples:
        Create a new event
        >>> new_event = [{"event_name": "Event 2", "arm_num": "1"}]
        >>> proj.import_events(new_event)
        1
    """
    payload = self._initialize_import_payload(
        to_import=to_import,
        import_format=import_format,
        return_format_type=return_format_type,
        content="event",
    )
    payload["action"] = "import"
    payload["override"] = override

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="import"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response

import_file(self, record, field, file_name, file_object, event=None, repeat_instance=None) inherited

Import the contents of a file represented by file_object to a particular record's field

Parameters:

Name Type Description Default
record str

Record ID

required
field str

Field name where the file will go

required
file_name str

File name visible in REDCap UI

required
file_object TextIOWrapper

File object as returned by open

required
event Optional[str]

For longitudinal projects, the unique event name

None
repeat_instance Union[int, str]

(Only for projects with repeating instruments/events) The repeat instance number of the repeating event (if longitudinal) or the repeating instrument (if classic or longitudinal).

None

Returns:

Type Description
List[dict]

Empty JSON object

Exceptions:

Type Description
ValueError

Incorrect file field

RedcapError

Bad Request e.g. invalid record_id

Examples:

If your project has events, then you must specify the event of interest. Otherwise, you can leave the event parameter blank

>>> import tempfile
>>> tmp_file = tempfile.TemporaryFile()
>>> proj.import_file(
...     record="2",
...     field="upload_field",
...     file_name="myupload.txt",
...     file_object=tmp_file,
...     event="event_1_arm_1",
... )
[{}]
Source code in redcap/project.py
def import_file(
    self,
    record: str,
    field: str,
    file_name: str,
    file_object: "TextIOWrapper",
    event: Optional[str] = None,
    repeat_instance: Optional[Union[int, str]] = None,
) -> EmptyJson:
    """
    Import the contents of a file represented by file_object to a
    particular record's field

    Args:
        record: Record ID
        field: Field name where the file will go
        file_name: File name visible in REDCap UI
        file_object: File object as returned by `open`
        event: For longitudinal projects, the unique event name
        repeat_instance:
            (Only for projects with repeating instruments/events)
            The repeat instance number of the repeating event (if longitudinal)
            or the repeating instrument (if classic or longitudinal).

    Returns:
        Empty JSON object

    Raises:
        ValueError: Incorrect file field
        RedcapError: Bad Request e.g. invalid record_id

    Examples:
        If your project has events, then you must specify the event of interest.
        Otherwise, you can leave the event parameter blank

        >>> import tempfile
        >>> tmp_file = tempfile.TemporaryFile()
        >>> proj.import_file(
        ...     record="2",
        ...     field="upload_field",
        ...     file_name="myupload.txt",
        ...     file_object=tmp_file,
        ...     event="event_1_arm_1",
        ... )
        [{}]
    """
    self._check_file_field(field)
    # load up payload
    payload: Dict[str, Any] = self._initialize_payload(content="file")
    payload["action"] = "import"
    payload["field"] = field
    payload["record"] = record
    if event:
        payload["event"] = event
    if repeat_instance:
        payload["repeat_instance"] = repeat_instance
    file_upload_dict: FileUpload = {"file": (file_name, file_object)}

    return cast(
        EmptyJson,
        self._call_api(
            payload=payload, return_type="empty_json", file=file_upload_dict
        ),
    )
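
To upload a real file from disk rather than a temporary file, open it and pass the handle. A sketch; the path is a placeholder, and the file is opened in binary mode so non-text files upload intact:

# "data/consent.pdf" is a placeholder path; the event argument is only
# needed for longitudinal projects.
with open("data/consent.pdf", "rb") as file_object:
    proj.import_file(
        record="2",
        field="upload_field",
        file_name="consent.pdf",
        file_object=file_object,
        event="event_1_arm_1",
    )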

import_instrument_event_mappings(self, to_import, return_format_type='json', import_format='json') inherited

Import the project's instrument to event mapping

!!! note This only works for longitudinal projects.

Parameters:

Name Type Description Default
to_import Union[str, List[Dict[str, Any]], pd.DataFrame]

array of dicts, csv/xml string, pandas.DataFrame Note: If you pass a csv or xml string, you should use the import_format parameter appropriately.

required
return_format_type Literal['json', 'csv', 'xml']

Response format. By default, response will be json-decoded.

'json'
import_format Literal['json', 'csv', 'xml', 'df']

Format of incoming data. By default, to_import will be json-encoded

'json'

Returns:

Type Description
Union[int, str]

Number of instrument-event mappings imported

Examples:

Import instrument-event mappings

>>> instrument_event_mappings = [{"arm_num": "1", "unique_event_name": "event_1_arm_1", "form": "form_1"}]
>>> proj.import_instrument_event_mappings(instrument_event_mappings)
1
Source code in redcap/project.py
def import_instrument_event_mappings(
    self,
    to_import: Union[str, List[Dict[str, Any]], "pd.DataFrame"],
    return_format_type: Literal["json", "csv", "xml"] = "json",
    import_format: Literal["json", "csv", "xml", "df"] = "json",
):
    # pylint: disable=line-too-long
    """
    Import the project's instrument to event mapping

    Note:
        This only works for longitudinal projects.

    Args:
        to_import: array of dicts, csv/xml string, `pandas.DataFrame`
            Note:
                If you pass a csv or xml string, you should use the
                `import_format` parameter appropriately.
        return_format_type:
            Response format. By default, response will be json-decoded.
        import_format:
            Format of incoming data. By default, to_import
            will be json-encoded

    Returns:
        Union[int, str]: Number of instrument-event mappings imported

    Examples:
        Import instrument-event mappings
        >>> instrument_event_mappings = [{"arm_num": "1", "unique_event_name": "event_1_arm_1", "form": "form_1"}]
        >>> proj.import_instrument_event_mappings(instrument_event_mappings)
        1
    """
    payload = self._initialize_import_payload(
        to_import=to_import,
        import_format=import_format,
        return_format_type=return_format_type,
        content="formEventMapping",
    )
    payload["action"] = "import"

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="import"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response
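
Taken together, import_arms, import_events, and import_instrument_event_mappings can stand up a longitudinal structure in one pass. A sketch reusing the hypothetical doctest names; "event_2_arm_2" is the unique event name REDCap typically generates for an event labeled "Event 2" in arm 2:

# Add an arm, add an event to it, then map an existing instrument to that event
proj.import_arms([{"arm_num": 2, "name": "Arm 2"}])
proj.import_events([{"event_name": "Event 2", "arm_num": "2"}])
proj.import_instrument_event_mappings(
    [{"arm_num": "2", "unique_event_name": "event_2_arm_2", "form": "form_1"}]
)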

import_metadata(self, to_import, return_format_type='json', import_format='json', date_format='YMD') inherited

Import metadata (Data Dictionary) into the REDCap Project

Parameters:

Name Type Description Default
to_import Union[str, List[Dict[str, Any]], pd.DataFrame]

array of dicts, csv/xml string, pandas.DataFrame Note: If you pass a csv or xml string, you should use the import_format parameter appropriately.

required
return_format_type Literal['json', 'csv', 'xml']

Response format. By default, response will be json-decoded.

'json'
import_format Literal['json', 'csv', 'xml', 'df']

Format of incoming data. By default, to_import will be json-encoded

'json'
date_format Literal['YMD', 'DMY', 'MDY']

Describes the formatting of dates. By default, date strings are formatted as 'YYYY-MM-DD' corresponding to 'YMD'. If date strings are formatted as 'MM/DD/YYYY' set this parameter as 'MDY' and if formatted as 'DD/MM/YYYY' set as 'DMY'. No other formattings are allowed.

'YMD'

Returns:

Type Description
Union[int, str]

The number of imported fields

Examples:

>>> metadata = proj.export_metadata(format_type="csv")
>>> proj.import_metadata(metadata, import_format="csv")
4
Source code in redcap/project.py
def import_metadata(
    self,
    to_import: Union[str, List[Dict[str, Any]], "pd.DataFrame"],
    return_format_type: Literal["json", "csv", "xml"] = "json",
    import_format: Literal["json", "csv", "xml", "df"] = "json",
    date_format: Literal["YMD", "DMY", "MDY"] = "YMD",
):
    """
    Import metadata (Data Dictionary) into the REDCap Project

    Args:
        to_import: array of dicts, csv/xml string, `pandas.DataFrame`
            Note:
                If you pass a csv or xml string, you should use the
                `import_format` parameter appropriately.
        return_format_type:
            Response format. By default, response will be json-decoded.
        import_format:
            Format of incoming data. By default, to_import will be json-encoded
        date_format:
            Describes the formatting of dates. By default, date strings
            are formatted as 'YYYY-MM-DD' corresponding to 'YMD'. If date
            strings are formatted as 'MM/DD/YYYY' set this parameter as
            'MDY' and if formatted as 'DD/MM/YYYY' set as 'DMY'. No
            other formattings are allowed.

    Returns:
        Union[int, str]: The number of imported fields

    Examples:
        >>> metadata = proj.export_metadata(format_type="csv")
        >>> proj.import_metadata(metadata, import_format="csv")
        4
    """
    payload = self._initialize_import_payload(
        to_import=to_import,
        import_format=import_format,
        return_format_type=return_format_type,
        content="metadata",
    )
    payload["dateFormat"] = date_format

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="import"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response
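
Because the data dictionary round-trips cleanly, small bulk edits are easy to script. A sketch that relabels one field; field_1 is the hypothetical doctest field, and the standard data dictionary keys field_name and field_label are assumed:

# Export the data dictionary, tweak one field label, and push it back
metadata = proj.export_metadata()  # list of dicts by default
for field in metadata:
    if field["field_name"] == "field_1":
        field["field_label"] = "Field 1 (updated)"
print(proj.import_metadata(metadata))  # number of fields in the imported dictionary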

import_records(self, to_import, return_format_type='json', return_content='count', overwrite='normal', import_format='json', date_format='YMD', force_auto_number=False) inherited

Import data into the REDCap Project

Parameters:

Name Type Description Default
to_import Union[str, List[Dict[str, Any]], pd.DataFrame]

Note: If you pass a df, csv, or xml string, you should use the import_format parameter appropriately. Note: Keys of the dictionaries should be a subset of the project's fields, but this isn't a requirement. If you provide keys that aren't defined fields, the returned response will contain an 'error' key.

required
return_format_type Literal['json', 'csv', 'xml']

Response format. By default, response will be json-decoded.

'json'
return_content Literal['count', 'ids', 'auto_ids', 'nothing']

By default, the response contains a 'count' key with the number of records just imported. By specifying 'ids', a list of ids imported will be returned. 'nothing' will only return the HTTP status code and no message.

'count'
overwrite Literal['normal', 'overwrite']

'overwrite' will erase values previously stored in the database if not specified in the to_import dictionaries.

'normal'
import_format Literal['json', 'csv', 'xml', 'df']

Format of incoming data. By default, to_import will be json-encoded

'json'
date_format Literal['YMD', 'DMY', 'MDY']

Describes the formatting of dates. By default, date strings are formatted as 'YYYY-MM-DD' corresponding to 'YMD'. If date strings are formatted as 'MM/DD/YYYY' set this parameter as 'MDY' and if formatted as 'DD/MM/YYYY' set as 'DMY'. No other formattings are allowed.

'YMD'
force_auto_number bool

Enables automatic assignment of record IDs of imported records by REDCap. If this is set to true, and auto-numbering for records is enabled for the project, auto-numbering of imported records will be enabled.

False

Exceptions:

Type Description
RedcapError

Bad request made, double check field names and other inputs

Returns:

Type Description
Union[Dict, str]

response from REDCap API, json-decoded if return_format_type == 'json'

Examples:

>>> new_record = [{"record_id": 3, "redcap_repeat_instance": 1, "field_1": 1}]
>>> proj.import_records(new_record)
{'count': 1}
Source code in redcap/project.py
def import_records(
    self,
    to_import: Union[str, List[Dict[str, Any]], "pd.DataFrame"],
    return_format_type: Literal["json", "csv", "xml"] = "json",
    return_content: Literal["count", "ids", "auto_ids", "nothing"] = "count",
    overwrite: Literal["normal", "overwrite"] = "normal",
    import_format: Literal["json", "csv", "xml", "df"] = "json",
    date_format: Literal["YMD", "DMY", "MDY"] = "YMD",
    force_auto_number: bool = False,
):
    """
    Import data into the REDCap Project

    Args:
        to_import:
            Note:
                If you pass a df, csv, or xml string, you should use the
                `import_format` parameter appropriately.
            Note:
                Keys of the dictionaries should be a subset of the project's
                fields, but this isn't a requirement. If you provide keys
                that aren't defined fields, the returned response will
                contain an `'error'` key.
        return_format_type:
            Response format. By default, response will be json-decoded.
        return_content:
            By default, the response contains a 'count' key with the number of
            records just imported. By specifying 'ids', a list of ids
            imported will be returned. 'nothing' will only return
            the HTTP status code and no message.
        overwrite:
            `'overwrite'` will erase values previously stored in the
            database if not specified in the to_import dictionaries.
        import_format:
            Format of incoming data. By default, to_import will be json-encoded
        date_format:
            Describes the formatting of dates. By default, date strings
            are formatted as 'YYYY-MM-DD' corresponding to 'YMD'. If date
            strings are formatted as 'MM/DD/YYYY' set this parameter as
            'MDY' and if formatted as 'DD/MM/YYYY' set as 'DMY'. No
            other formattings are allowed.
        force_auto_number:
            Enables automatic assignment of record IDs
            of imported records by REDCap. If this is set to true, and auto-numbering
            for records is enabled for the project, auto-numbering of imported records
            will be enabled.

    Raises:
        RedcapError: Bad request made, double check field names and other inputs

    Returns:
        Union[Dict, str]: response from REDCap API, json-decoded if `return_format_type` == `'json'`

    Examples:
        >>> new_record = [{"record_id": 3, "redcap_repeat_instance": 1, "field_1": 1}]
        >>> proj.import_records(new_record)
        {'count': 1}
    """
    payload = self._initialize_import_payload(
        to_import=to_import,
        import_format=import_format,
        return_format_type=return_format_type,
        content="record",
    )
    payload["overwriteBehavior"] = overwrite
    payload["returnContent"] = return_content
    payload["dateFormat"] = date_format
    payload["forceAutoNumber"] = force_auto_number

    return_type = self._lookup_return_type(
        format_type=return_format_type,
        request_type="import",
        import_records_format=return_content,
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response
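
Records can also be imported from CSV or XML text (or a DataFrame) by setting import_format accordingly. A sketch with inline CSV, reusing the hypothetical doctest fields:

# The header row holds the field names; import_format="csv" tells PyCap
# not to json-encode the payload
csv_data = "record_id,field_1\n10,1\n11,0\n"
print(proj.import_records(csv_data, import_format="csv"))  # e.g. {'count': 2}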

import_repeating_instruments_events(self, to_import, return_format_type='json', import_format='json') inherited

Import repeating instrument and event settings into the REDCap Project

Parameters:

Name Type Description Default
to_import Union[str, List[Dict[str, Any]], pd.DataFrame]

array of dicts, csv/xml string, pandas.DataFrame Note: If you pass a csv or xml string, you should use the import_format parameter appropriately.

required
return_format_type Literal['json', 'csv', 'xml']

Response format. By default, response will be json-decoded.

'json'
import_format Literal['json', 'csv', 'xml', 'df']

Format of incoming data. By default, to_import will be json-encoded

'json'

Returns:

Type Description
Union[int, str]

The number of repeated instruments activated

Examples:

>>> rep_instruments = proj.export_repeating_instruments_events(format_type="csv")
>>> proj.import_repeating_instruments_events(rep_instruments, import_format="csv")
1
Source code in redcap/project.py
def import_repeating_instruments_events(
    self,
    to_import: Union[str, List[Dict[str, Any]], "pd.DataFrame"],
    return_format_type: Literal["json", "csv", "xml"] = "json",
    import_format: Literal["json", "csv", "xml", "df"] = "json",
):
    """
    Import repeating instrument and event settings into the REDCap Project

    Args:
        to_import: array of dicts, csv/xml string, `pandas.DataFrame`
            Note:
                If you pass a csv or xml string, you should use the
                `import_format` parameter appropriately.
        return_format_type:
            Response format. By default, response will be json-decoded.
        import_format:
            Format of incoming data. By default, to_import will be json-encoded

    Returns:
        Union[int, str]: The number of repeated instruments activated

    Examples:
        >>> rep_instruments = proj.export_repeating_instruments_events(format_type="csv")
        >>> proj.import_repeating_instruments_events(rep_instruments, import_format="csv")
        1
    """
    payload = self._initialize_import_payload(
        to_import=to_import,
        import_format=import_format,
        return_format_type=return_format_type,
        content="repeatingFormsEvents",
    )

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="import"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response

import_user_dag_assignment(self, to_import, return_format_type='json', import_format='json') inherited

Import User-DAG assignments into the REDCap Project

Parameters:

Name Type Description Default
to_import Union[str, List[Dict[str, Any]], pd.DataFrame]

array of dicts, csv/xml string, pandas.DataFrame Note: If you pass a csv or xml string, you should use the import_format parameter appropriately.

required
return_format_type Literal['json', 'csv', 'xml']

Response format. By default, response will be json-decoded.

'json'
import_format Literal['json', 'csv', 'xml', 'df']

Format of incoming data. By default, to_import will be json-encoded

'json'

Returns:

Type Description
Union[int, str]

Number of User-DAGs assignments added or updated

Examples:

Create a new user

>>> new_user = "pandeharris@gmail.com"
>>> proj.import_users([{"username": new_user}])
1

Add that user to a DAG

>>> dag_mapping = [
...     {"username": new_user, "redcap_data_access_group": "test_dag"}
... ]
>>> proj.import_user_dag_assignment(dag_mapping)
1

New user-DAG mapping

>>> proj.export_user_dag_assignment()
[{'username': 'pandeharris@gmail.com', 'redcap_data_access_group': 'test_dag'},
{'username': ..., 'redcap_data_access_group': ''}]

Remove the user

>>> proj.delete_users([new_user])
1
Source code in redcap/project.py
def import_user_dag_assignment(
    self,
    to_import: Union[str, List[Dict[str, Any]], "pd.DataFrame"],
    return_format_type: Literal["json", "csv", "xml"] = "json",
    import_format: Literal["json", "csv", "xml", "df"] = "json",
):
    """
    Import User-DAG assignments into the REDCap Project

    Args:
        to_import: array of dicts, csv/xml string, `pandas.DataFrame`
            Note:
                If you pass a csv or xml string, you should use the
                `import_format` parameter appropriately.
        return_format_type:
            Response format. By default, response will be json-decoded.
        import_format:
            Format of incoming data. By default, to_import will be json-encoded

    Returns:
        Union[int, str]:
            Number of User-DAGs assignments added or updated

    Examples:
        Create a new user
        >>> new_user = "pandeharris@gmail.com"
        >>> proj.import_users([{"username": new_user}])
        1

        Add that user to a DAG
        >>> dag_mapping = [
        ...     {"username": new_user, "redcap_data_access_group": "test_dag"}
        ... ]
        >>> proj.import_user_dag_assignment(dag_mapping)
        1

        New user-DAG mapping
        >>> proj.export_user_dag_assignment()
        [{'username': 'pandeharris@gmail.com', 'redcap_data_access_group': 'test_dag'},
        {'username': ..., 'redcap_data_access_group': ''}]

        Remove the user
        >>> proj.delete_users([new_user])
        1
    """
    payload = self._initialize_import_payload(
        to_import=to_import,
        import_format=import_format,
        return_format_type=return_format_type,
        content="userDagMapping",
    )
    payload["action"] = "import"

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="import"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response

import_user_role_assignment(self, to_import, return_format_type='json', import_format='json') inherited

Import User-Role assignments into the REDCap Project

Parameters:

Name Type Description Default
to_import Union[str, List[Dict[str, Any]], pd.DataFrame]

array of dicts, csv/xml string, pandas.DataFrame Note: If you pass a csv or xml string, you should use the import_format parameter appropriately.

required
return_format_type Literal['json', 'csv', 'xml']

Response format. By default, response will be json-decoded.

'json'
import_format Literal['json', 'csv', 'xml', 'df']

Format of incoming data. By default, to_import will be json-encoded

'json'

Returns:

Type Description
Union[int, str]

Number of user-role assignments added or updated

Examples:

>>> user_role_assignments = proj.export_user_role_assignment()
>>> proj.import_user_role_assignment(user_role_assignments)
1
Source code in redcap/project.py
def import_user_role_assignment(
    self,
    to_import: Union[str, List[Dict[str, Any]], "pd.DataFrame"],
    return_format_type: Literal["json", "csv", "xml"] = "json",
    import_format: Literal["json", "csv", "xml", "df"] = "json",
):
    """
    Import User-Role assignments into the REDCap Project

    Args:
        to_import: array of dicts, csv/xml string, `pandas.DataFrame`
            Note:
                If you pass a csv or xml string, you should use the
                `import_format` parameter appropriately.
        return_format_type:
            Response format. By default, response will be json-decoded.
        import_format:
            Format of incoming data. By default, to_import will be json-encoded

    Returns:
        Union[int, str]: Number of user-role assignments added or updated

    Examples:
        >>> user_role_assignments = proj.export_user_role_assignment()
        >>> proj.import_user_role_assignment(user_role_assignments)
        1
    """
    payload = self._initialize_import_payload(
        to_import=to_import,
        import_format=import_format,
        return_format_type=return_format_type,
        content="userRoleMapping",
    )

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="import"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response

import_user_roles(self, to_import, return_format_type='json', import_format='json') inherited

Import user roles into the REDCap Project

Parameters:

Name Type Description Default
to_import Union[str, List[Dict[str, Any]], pd.DataFrame]

array of dicts, csv/xml string, pandas.DataFrame Note: If you pass a csv or xml string, you should use the import_format parameter appropriately.

required
return_format_type Literal['json', 'csv', 'xml']

Response format. By default, response will be json-decoded.

'json'
import_format Literal['json', 'csv', 'xml', 'df']

Format of incoming data. By default, to_import will be json-encoded

'json'

Returns:

Type Description
Union[int, str]

Number of user roles added or updated

Examples:

>>> roles = proj.export_user_roles()
>>> proj.import_user_roles(roles)
1
Source code in redcap/project.py
def import_user_roles(
    self,
    to_import: Union[str, List[Dict[str, Any]], "pd.DataFrame"],
    return_format_type: Literal["json", "csv", "xml"] = "json",
    import_format: Literal["json", "csv", "xml", "df"] = "json",
):
    """
    Import user roles into the REDCap Project

    Args:
        to_import: array of dicts, csv/xml string, `pandas.DataFrame`
            Note:
                If you pass a csv or xml string, you should use the
                `import_format` parameter appropriately.
        return_format_type:
            Response format. By default, response will be json-decoded.
        import_format:
            Format of incoming data. By default, to_import will be json-encoded

    Returns:
        Union[int, str]: Number of user roles added or updated

    Examples:
        >>> roles = proj.export_user_roles()
        >>> proj.import_user_roles(roles)
        1
    """
    payload = self._initialize_import_payload(
        to_import=to_import,
        import_format=import_format,
        return_format_type=return_format_type,
        content="userRole",
    )

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="import"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response
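
As the note above says, a csv or xml string must be paired with the matching import_format. A minimal sketch of a csv round trip, assuming export_user_roles accepts a format_type argument as PyCap export methods generally do, with the environment variable name as a placeholder:

import os

from redcap import Project

proj = Project("https://redcapdemo.vanderbilt.edu/api/", os.getenv("REDCAP_API_TOKEN"))

# Export the roles as a csv string, then re-import it;
# import_format must match the string's format.
roles_csv = proj.export_user_roles(format_type="csv")
num_updated = proj.import_user_roles(roles_csv, import_format="csv")
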

import_users(self, to_import, return_format_type='json', import_format='json') inherited

Import users/user rights into the REDCap Project

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| to_import | Union[str, List[Dict[str, Any]], pd.DataFrame] | Array of dicts, csv/xml string, or pandas.DataFrame. Note: if you pass a csv or xml string, set the import_format parameter accordingly. | required |
| return_format_type | Literal['json', 'csv', 'xml'] | Response format. By default, the response will be json-decoded. | 'json' |
| import_format | Literal['json', 'csv', 'xml', 'df'] | Format of the incoming data. By default, to_import will be json-encoded. | 'json' |

Returns:

| Type | Description |
| --- | --- |
| Union[int, str] | Number of users added or updated |

Examples:

Add test user. Only username is required

>>> test_user = [{"username": "pandeharris@gmail.com"}]
>>> proj.import_users(test_user)
1

All currently valid options for user rights

>>> test_user = [
...     {"username": "pandeharris@gmail.com", "email": "pandeharris@gmail.com",
...     "firstname": "REDCap Trial", "lastname": "User", "expiration": "",
...     "data_access_group": "", "data_access_group_id": "", "design": 0,
...     "user_rights": 0, "data_export": 2, "reports": 1, "stats_and_charts": 1,
...     "manage_survey_participants": 1, "calendar": 1, "data_access_groups": 0,
...     "data_import_tool": 0, "data_comparison_tool": 0, "logging": 0,
...     "file_repository": 1, "data_quality_create": 0, "data_quality_execute": 0,
...     "api_export": 0, "api_import": 0, "mobile_app": 0,
...     "mobile_app_download_data": 0, "record_create": 1, "record_rename": 0,
...     "record_delete": 0, "lock_records_all_forms": 0, "lock_records": 0,
...      "lock_records_customization": 0, "forms": {"form_1": 3}}
... ]
>>> proj.import_users(test_user)
1
Source code in redcap/project.py
def import_users(
    self,
    to_import: Union[str, List[Dict[str, Any]], "pd.DataFrame"],
    return_format_type: Literal["json", "csv", "xml"] = "json",
    import_format: Literal["json", "csv", "xml", "df"] = "json",
):
    """
    Import users/user rights into the REDCap Project

    Args:
        to_import: array of dicts, csv/xml string, `pandas.DataFrame`
            Note:
                If you pass a csv or xml string, you should use the
                `import_format` parameter appropriately.
        return_format_type:
            Response format. By default, response will be json-decoded.
        import_format:
            Format of incoming data. By default, to_import will be json-encoded

    Returns:
        Union[int, str]: Number of users added or updated

    Examples:
        Add test user. Only username is required
        >>> test_user = [{"username": "pandeharris@gmail.com"}]
        >>> proj.import_users(test_user)
        1

        All currently valid options for user rights
        >>> test_user = [
        ...     {"username": "pandeharris@gmail.com", "email": "pandeharris@gmail.com",
        ...     "firstname": "REDCap Trial", "lastname": "User", "expiration": "",
        ...     "data_access_group": "", "data_access_group_id": "", "design": 0,
        ...     "user_rights": 0, "data_export": 2, "reports": 1, "stats_and_charts": 1,
        ...     "manage_survey_participants": 1, "calendar": 1, "data_access_groups": 0,
        ...     "data_import_tool": 0, "data_comparison_tool": 0, "logging": 0,
        ...     "file_repository": 1, "data_quality_create": 0, "data_quality_execute": 0,
        ...     "api_export": 0, "api_import": 0, "mobile_app": 0,
        ...     "mobile_app_download_data": 0, "record_create": 1, "record_rename": 0,
        ...     "record_delete": 0, "lock_records_all_forms": 0, "lock_records": 0,
        ...      "lock_records_customization": 0, "forms": {"form_1": 3}}
        ... ]
        >>> proj.import_users(test_user)
        1
    """
    payload = self._initialize_import_payload(
        to_import=to_import,
        import_format=import_format,
        return_format_type=return_format_type,
        content="user",
    )

    return_type = self._lookup_return_type(
        format_type=return_format_type, request_type="import"
    )
    response = cast(Union[Json, str], self._call_api(payload, return_type))

    return response
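
The 'df' option in import_format lets you pass a pandas.DataFrame directly instead of a list of dicts. A minimal sketch, reusing the placeholder account and a couple of rights fields from the example above; the environment variable name is also a placeholder:

import os

import pandas as pd
from redcap import Project

proj = Project("https://redcapdemo.vanderbilt.edu/api/", os.getenv("REDCAP_API_TOKEN"))

# A DataFrame can be imported directly when import_format is set to "df".
new_users = pd.DataFrame(
    [{"username": "pandeharris@gmail.com", "record_create": 1, "data_export": 2}]
)
num_updated = proj.import_users(new_users, import_format="df")
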

switch_dag(self, dag) inherited

Allows the current API user to switch (assign/reassign/unassign) their current Data Access Group assignment.

The current user must have been assigned to multiple DAGs via the DAG Switcher page in the project

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| dag | str | The unique group name of the Data Access Group to which you wish to switch | required |

Returns:

| Type | Description |
| --- | --- |
| Literal['1'] | "1" if the user successfully switched DAGs |

Examples:

>>> proj.switch_dag("test_dag")
'1'
Source code in redcap/project.py
def switch_dag(
    self,
    dag: str,
) -> Literal["1"]:
    """
    Allows the current API user to switch (assign/reassign/unassign)
    their current Data Access Group assignment.

    The current user must have been assigned to multiple DAGs via the
    DAG Switcher page in the project

    Args:
        dag: The unique group name of the Data Access Group to which you wish to switch

    Returns:
        "1" if the user successfully switched DAGs

    Examples:
        >>> proj.switch_dag("test_dag") # doctest: +SKIP
        '1'
    """
    # API docs say that "1" is the only valid value
    payload = self._initialize_payload(content="dag", return_format_type="csv")
    payload["action"] = "switch"
    payload["dag"] = dag

    response = cast(Literal["1"], self._call_api(payload, return_type="str"))
    return response
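
Switching DAGs changes which records subsequent API calls can see. A minimal sketch, where "site_a" is a placeholder unique group name that the API user has already been granted via the DAG Switcher and the environment variable name is likewise a placeholder:

import os

from redcap import Project

proj = Project("https://redcapdemo.vanderbilt.edu/api/", os.getenv("REDCAP_API_TOKEN"))

# After switching, exports are scoped to the selected Data Access Group.
proj.switch_dag("site_a")
site_records = proj.export_records()
print(len(site_records))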