The full dataset viewer is not available; only a preview of the rows is shown below.
Error code: JobManagerExceededMaximumDurationError
hexsha (string) | repo (string) | path (string) | license (sequence) | language (string) | identifier (string) | original_docstring (string) | docstring (string) | docstring_tokens (sequence) | code (string) | code_tokens (sequence) | short_docstring (string) | short_docstring_tokens (sequence) | comment (sequence) | parameters (list) | docstring_params (dict)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
bf86d18d90c390905405a84287a89ad2f0b7c1bb | iSoron/Prescient | prescient/gosm/derivative_patterns/graph_utilities.py | [
"BSD-3-Clause"
] | Python | WeightedGraph |
This class represents a weighted graph for the purposes
of determining clusters via the Markov Clustering Algorithm.
To initialize an object of this class, pass in a dictionary
which maps pairs (tuples) of vertices to the corresponding weight.
Stores internally both an adjacency list and an adjacency matrix
This is fine as the number of expected vertices is small.
| This class represents a weighted graph for the purposes
of determining clusters via the Markov Clustering Algorithm.
To initialize an object of this class, pass in a dictionary
which maps pairs (tuples) of vertices to the corresponding weight.
Stores internally both an adjacency list and an adjacency matrix
This is fine as the number of expected vertices is small. | [
"This",
"class",
"represents",
"a",
"weighted",
"graph",
"for",
"the",
"purposes",
"of",
"determining",
"clusters",
"via",
"the",
"Markov",
"Clustering",
"Algorithm",
".",
"To",
"initialize",
"an",
"object",
"of",
"this",
"class",
"pass",
"in",
"a",
"dictionary",
"which",
"maps",
"pairs",
"(",
"tuples",
")",
"of",
"vertices",
"to",
"the",
"corresponding",
"weight",
".",
"Stores",
"internally",
"both",
"an",
"adjacency",
"list",
"and",
"an",
"adjacency",
"matrix",
"This",
"is",
"fine",
"as",
"the",
"number",
"of",
"expected",
"vertices",
"is",
"small",
"."
] | class WeightedGraph:
"""
This class represents a weighted graph for the purposes
of determining clusters via the Markov Clustering Algorithm.
To initialize an object of this class, pass in a dictionary
which maps pairs (tuples) of vertices to the corresponding weight.
Stores internally both an adjacency list and an adjacency matrix
This is fine as the number of expected vertices is small.
"""
def __init__(self, pair_weights):
self.adjacency_list = self._construct_adjacency_list(pair_weights)
self.vertices = list(self.adjacency_list.keys())
self.num_vertices = len(self.vertices)
self.adjacency_matrix = self._construct_adjacency_matrix()
def get_clusters(self, granularity):
"""
This method uses the Markov Clustering Algorithm
to cluster vertices together.
Args:
granularity: The granularity with which to inflate columns
Return:
A dictionary which maps a vertex to the set of vertices it is in a cluster with
"""
# Hardcoded in the expansion parameter, this reflects original implementation
# May wish to change this to have some option
e = 2
matrix = transform_matrix(self.adjacency_matrix)
matrix = normalize_columns(matrix)
error_convergence = np.linalg.norm(matrix)
while error_convergence > 10E-6:
# Store previous matrix
previous_matrix = matrix
matrix = np.linalg.matrix_power(matrix, e)
matrix = inflate_columns(matrix, granularity)
error_convergence = np.linalg.norm(matrix - previous_matrix)
return self._get_clusters(matrix)
def _get_clusters(self, matrix):
"""
Helper function to retrieve the list of clusters from the matrix
"""
# clusters is a set to have only unique sets in the partition of the vertices
clusters = set()
for i, v1 in enumerate(self.vertices):
# Already assigned a cluster
if np.sum(matrix[i, :]) < 10E-6: # If sum of row is essentially zero
continue
else:
cluster = []
for j, v2 in enumerate(self.vertices):
if matrix[i, j] > 10E-6:
cluster.append(v2)
clusters.add(frozenset(cluster))
clusters = [list(cluster) for cluster in clusters]
return clusters
def _construct_adjacency_list(self, pair_weights):
"""
Constructs an adjacency list representation of the graph as
a dictionary which maps vertices to a list of tuples (v, w) where
v is the adjacent vertex and w is the weight of the edge.
Args:
pair_weights: A dictionary mapping pairs of vertices to weights
Returns:
An adjacency list
"""
adjacency_list = {}
for v1, v2 in pair_weights:
weight = pair_weights[(v1, v2)]
if v1 in adjacency_list:
adjacency_list[v1].append((v2, weight))
else:
adjacency_list[v1] = [(v2, weight)]
if v2 in adjacency_list:
adjacency_list[v2].append((v1, weight))
else:
adjacency_list[v2] = [(v1, weight)]
return adjacency_list
def _construct_adjacency_matrix(self):
"""
Constructs an adjacency matrix from the internally stored adjacency list
Assigns M_ij to be the weight from vertex i to vertex j.
Returns:
The numpy matrix storing the weights
"""
adjacency_matrix = np.identity(self.num_vertices)
for i, v1 in enumerate(self.vertices):
for j, v2 in enumerate(self.vertices):
v1_v2_weight = 0
for vertex, weight in self.adjacency_list[v1]:
if v2 == vertex:
v1_v2_weight = weight
break
adjacency_matrix[i][j] = v1_v2_weight
return adjacency_matrix | [
"class",
"WeightedGraph",
":",
"def",
"__init__",
"(",
"self",
",",
"pair_weights",
")",
":",
"self",
".",
"adjacency_list",
"=",
"self",
".",
"_construct_adjacency_list",
"(",
"pair_weights",
")",
"self",
".",
"vertices",
"=",
"list",
"(",
"self",
".",
"adjacency_list",
".",
"keys",
"(",
")",
")",
"self",
".",
"num_vertices",
"=",
"len",
"(",
"self",
".",
"vertices",
")",
"self",
".",
"adjacency_matrix",
"=",
"self",
".",
"_construct_adjacency_matrix",
"(",
")",
"def",
"get_clusters",
"(",
"self",
",",
"granularity",
")",
":",
"\"\"\"\n This method uses the Markov Clustering Algorithm\n to cluster vertices together.\n Args:\n granularity: The granularity with which to inflate columns\n Return:\n A dictionary which maps a vertex to the set of vertices it is in a cluster with\n \"\"\"",
"e",
"=",
"2",
"matrix",
"=",
"transform_matrix",
"(",
"self",
".",
"adjacency_matrix",
")",
"matrix",
"=",
"normalize_columns",
"(",
"matrix",
")",
"error_convergence",
"=",
"np",
".",
"linalg",
".",
"norm",
"(",
"matrix",
")",
"while",
"error_convergence",
">",
"10E-6",
":",
"previous_matrix",
"=",
"matrix",
"matrix",
"=",
"np",
".",
"linalg",
".",
"matrix_power",
"(",
"matrix",
",",
"e",
")",
"matrix",
"=",
"inflate_columns",
"(",
"matrix",
",",
"granularity",
")",
"error_convergence",
"=",
"np",
".",
"linalg",
".",
"norm",
"(",
"matrix",
"-",
"previous_matrix",
")",
"return",
"self",
".",
"_get_clusters",
"(",
"matrix",
")",
"def",
"_get_clusters",
"(",
"self",
",",
"matrix",
")",
":",
"\"\"\"\n Helper function to retrieve the list of clusters from the matrix\n \"\"\"",
"clusters",
"=",
"set",
"(",
")",
"for",
"i",
",",
"v1",
"in",
"enumerate",
"(",
"self",
".",
"vertices",
")",
":",
"if",
"np",
".",
"sum",
"(",
"matrix",
"[",
"i",
",",
":",
"]",
")",
"<",
"10E-6",
":",
"continue",
"else",
":",
"cluster",
"=",
"[",
"]",
"for",
"j",
",",
"v2",
"in",
"enumerate",
"(",
"self",
".",
"vertices",
")",
":",
"if",
"matrix",
"[",
"i",
",",
"j",
"]",
">",
"10E-6",
":",
"cluster",
".",
"append",
"(",
"v2",
")",
"clusters",
".",
"add",
"(",
"frozenset",
"(",
"cluster",
")",
")",
"clusters",
"=",
"[",
"list",
"(",
"cluster",
")",
"for",
"cluster",
"in",
"clusters",
"]",
"return",
"clusters",
"def",
"_construct_adjacency_list",
"(",
"self",
",",
"pair_weights",
")",
":",
"\"\"\"\n Constructs an adjacency list representation of the graph as\n a dictionary which maps vertices to a list of tuples (v, w) where\n v is the adjacent vertex and w is the weight of the edge.\n\n Args:\n pair_weights: A dictionary mapping pairs of vertices to weights\n Returns:\n An adjacency list\n \"\"\"",
"adjacency_list",
"=",
"{",
"}",
"for",
"v1",
",",
"v2",
"in",
"pair_weights",
":",
"weight",
"=",
"pair_weights",
"[",
"(",
"v1",
",",
"v2",
")",
"]",
"if",
"v1",
"in",
"adjacency_list",
":",
"adjacency_list",
"[",
"v1",
"]",
".",
"append",
"(",
"(",
"v2",
",",
"weight",
")",
")",
"else",
":",
"adjacency_list",
"[",
"v1",
"]",
"=",
"[",
"(",
"v2",
",",
"weight",
")",
"]",
"if",
"v2",
"in",
"adjacency_list",
":",
"adjacency_list",
"[",
"v2",
"]",
".",
"append",
"(",
"(",
"v1",
",",
"weight",
")",
")",
"else",
":",
"adjacency_list",
"[",
"v2",
"]",
"=",
"[",
"(",
"v1",
",",
"weight",
")",
"]",
"return",
"adjacency_list",
"def",
"_construct_adjacency_matrix",
"(",
"self",
")",
":",
"\"\"\"\n Constructs an adjacency matrix from the internally stored adjacency list\n Assigns M_ij to be the weight from vertex i to vertex j.\n\n Returns:\n The numpy matrix storing the weights\n \"\"\"",
"adjacency_matrix",
"=",
"np",
".",
"identity",
"(",
"self",
".",
"num_vertices",
")",
"for",
"i",
",",
"v1",
"in",
"enumerate",
"(",
"self",
".",
"vertices",
")",
":",
"for",
"j",
",",
"v2",
"in",
"enumerate",
"(",
"self",
".",
"vertices",
")",
":",
"v1_v2_weight",
"=",
"0",
"for",
"vertex",
",",
"weight",
"in",
"self",
".",
"adjacency_list",
"[",
"v1",
"]",
":",
"if",
"v2",
"==",
"vertex",
":",
"v1_v2_weight",
"=",
"weight",
"break",
"adjacency_matrix",
"[",
"i",
"]",
"[",
"j",
"]",
"=",
"v1_v2_weight",
"return",
"adjacency_matrix"
] | This class represents a weighted graph for the purposes
of determining clusters via the Markov Clustering Algorithm. | [
"This",
"class",
"represents",
"a",
"weighted",
"graph",
"for",
"the",
"purposes",
"of",
"determining",
"clusters",
"via",
"the",
"Markov",
"Clustering",
"Algorithm",
"."
] | [
"\"\"\"\n This class represents a weighted graph for the purposes\n of determining clusters via the Markov Clustering Algorithm.\n To initialize an object of this class, pass in a dictionary\n which maps pairs (tuples) of vertices to the corresponding weight.\n\n Stores internally both an adjacency list and an adjacency matrix\n This is fine as the number of expected vertices is small.\n\n \"\"\"",
"\"\"\"\n This method uses the Markov Clustering Algorithm\n to cluster vertices together.\n Args:\n granularity: The granularity with which to inflate columns\n Return:\n A dictionary which maps a vertex to the set of vertices it is in a cluster with\n \"\"\"",
"# Hardcoded in the expansion parameter, this reflects original implementation",
"# May wish to change this to have some option",
"# Store previous matrix",
"\"\"\"\n Helper function to retrieve the list of clusters from the matrix\n \"\"\"",
"# clusters is a set to have only unique sets in the partition of the vertices",
"# Already assigned a cluster",
"# If sum of row is essentially zero",
"\"\"\"\n Constructs an adjacency list representation of the graph as\n a dictionary which maps vertices to a list of tuples (v, w) where\n v is the adjacent vertex and w is the weight of the edge.\n\n Args:\n pair_weights: A dictionary mapping pairs of vertices to weights\n Returns:\n An adjacency list\n \"\"\"",
"\"\"\"\n Constructs an adjacency matrix from the internally stored adjacency list\n Assigns M_ij to be the weight from vertex i to vertex j.\n\n Returns:\n The numpy matrix storing the weights\n \"\"\""
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
44d82532b0e3acb3453df191ceef6f20d3f8da3d | plusus/plus-tuto | tuto-file-handling/libtuto/zone.py | [
"CC-BY-3.0"
] | Python | Zone |
Zone with defined boundaries
| Zone with defined boundaries | [
"Zone",
"with",
"defined",
"boundaries"
] | class Zone:
"""
Zone with defined boundaries
"""
def topLeft(self):
"""
:rtype: (int, int)
"""
raise NotImplementedError()
def bottomRight(self):
"""
:rtype: (int, int)
"""
raise NotImplementedError()
def center(self):
"""
:rtype: (int, int)
"""
raise NotImplementedError() | [
"class",
"Zone",
":",
"def",
"topLeft",
"(",
"self",
")",
":",
"\"\"\"\n :rtype: (int, int)\n \"\"\"",
"raise",
"NotImplementedError",
"(",
")",
"def",
"bottomRight",
"(",
"self",
")",
":",
"\"\"\"\n :rtype: (int, int)\n \"\"\"",
"raise",
"NotImplementedError",
"(",
")",
"def",
"center",
"(",
"self",
")",
":",
"\"\"\"\n :rtype: (int, int)\n \"\"\"",
"raise",
"NotImplementedError",
"(",
")"
] | Zone with defined boundaries | [
"Zone",
"with",
"defined",
"boundaries"
] | [
"\"\"\"\n Zone with defined boundaries\n \"\"\"",
"\"\"\"\n :rtype: (int, int)\n \"\"\"",
"\"\"\"\n :rtype: (int, int)\n \"\"\"",
"\"\"\"\n :rtype: (int, int)\n \"\"\""
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
44f4e6ceee8ef657f109e2b9814db4070531e985 | OthmanEmpire/project_xcape | xcape/common/object.py | [
"MIT"
] | Python | GameObject |
The base class for all other classes.
| The base class for all other classes. | [
"The",
"base",
"class",
"for",
"all",
"other",
"classes",
"."
] | class GameObject:
"""
The base class for all other classes.
"""
MENU_EVENT = pg.USEREVENT + 1
SCENE_EVENT = pg.USEREVENT + 2
CUTSCENE_EVENT = pg.USEREVENT + 3
CATEGORIES_MENU = [
"screen",
"transition",
"complete",
"health",
"max_health"
]
CATEGORIES_SCENE = [
"screen",
"transition",
"complete",
"pause",
"unpause",
"no_mode",
"start_game",
"switch",
"door",
"death",
"revive"
]
CATEGORIES_CUTSCENE = [
"screen",
"transition"
]
def handleEvent(self, event):
"""
Handles the given event.
:param event: pygame.Event, allowing event-driven programming.
"""
raise NotImplementedError
def update(self):
"""
Updates the logic of the game object every game tick.
"""
raise NotImplementedError
def draw(self, camera=None):
"""
Renders the game object to the screen every game tick.
"""
raise NotImplementedError
def messageMenu(self, category, data=None):
"""
Creates an event that is posted for the menu engine.
:param category: String, the category of the message.
:param data: N-Tuple, containing the data for the relevant category.
"""
self._messageEngine(GameObject.CATEGORIES_MENU,
GameObject.MENU_EVENT,
self.__str__(),
category,
data)
def messageScene(self, category, data=None):
"""
Creates an event that is posted for the scene engine.
:param sender: String, the sender of the message.
:param category: String, the category of the message.
:param data: N-Tuple, containing the data for the relevant category.
"""
self._messageEngine(GameObject.CATEGORIES_SCENE,
GameObject.SCENE_EVENT,
self.__str__(),
category,
data)
def messageCutScene(self, category, data=None):
"""
Creates an event that is posted for the cutscene engine.
:param category: String, the category of the message.
:param data: N-Tuple, containing the data for the relevant category.
"""
self._messageEngine(GameObject.CATEGORIES_CUTSCENE,
GameObject.CUTSCENE_EVENT,
self.__str__(),
category,
data)
def _messageEngine(self, CATEGORIES, EVENT, sender, category, data=None):
"""
Creates an event that is posted to an engine.
:param CATEGORIES: List, containing strings of valid categories.
:param EVENT: pygame.event, the event that the engine handles.
:param sender: String, the sender of the message.
:param category: String, the category of the message.
:param data: N-Tuple, containing the data for the relevant category.
"""
if category not in CATEGORIES:
raise KeyError("'{}' is an invalid category! The categories allowed "
"are {}!".format(category, CATEGORIES))
contents = \
{
"sender": sender,
"category": category,
"data": data
}
message = pg.event.Event(EVENT, contents)
pg.event.post(message) | [
"class",
"GameObject",
":",
"MENU_EVENT",
"=",
"pg",
".",
"USEREVENT",
"+",
"1",
"SCENE_EVENT",
"=",
"pg",
".",
"USEREVENT",
"+",
"2",
"CUTSCENE_EVENT",
"=",
"pg",
".",
"USEREVENT",
"+",
"3",
"CATEGORIES_MENU",
"=",
"[",
"\"screen\"",
",",
"\"transition\"",
",",
"\"complete\"",
",",
"\"health\"",
",",
"\"max_health\"",
"]",
"CATEGORIES_SCENE",
"=",
"[",
"\"screen\"",
",",
"\"transition\"",
",",
"\"complete\"",
",",
"\"pause\"",
",",
"\"unpause\"",
",",
"\"no_mode\"",
",",
"\"start_game\"",
",",
"\"switch\"",
",",
"\"door\"",
",",
"\"death\"",
",",
"\"revive\"",
"]",
"CATEGORIES_CUTSCENE",
"=",
"[",
"\"screen\"",
",",
"\"transition\"",
"]",
"def",
"handleEvent",
"(",
"self",
",",
"event",
")",
":",
"\"\"\"\n Handles the given event.\n\n :param event: pygame.Event, allowing event-driven programming.\n \"\"\"",
"raise",
"NotImplementedError",
"def",
"update",
"(",
"self",
")",
":",
"\"\"\"\n Updates the logic of the game object every game tick.\n \"\"\"",
"raise",
"NotImplementedError",
"def",
"draw",
"(",
"self",
",",
"camera",
"=",
"None",
")",
":",
"\"\"\"\n Renders the game object to the screen every game tick.\n \"\"\"",
"raise",
"NotImplementedError",
"def",
"messageMenu",
"(",
"self",
",",
"category",
",",
"data",
"=",
"None",
")",
":",
"\"\"\"\n Creates an event that is posted for the menu engine.\n\n :param category: String, the category of the message.\n :param data: N-Tuple, containing the data for the relevant category.\n \"\"\"",
"self",
".",
"_messageEngine",
"(",
"GameObject",
".",
"CATEGORIES_MENU",
",",
"GameObject",
".",
"MENU_EVENT",
",",
"self",
".",
"__str__",
"(",
")",
",",
"category",
",",
"data",
")",
"def",
"messageScene",
"(",
"self",
",",
"category",
",",
"data",
"=",
"None",
")",
":",
"\"\"\"\n Creates an event that is posted for the scene engine.\n\n :param sender: String, the sender of the message.\n :param category: String, the category of the message.\n :param data: N-Tuple, containing the data for the relevant category.\n \"\"\"",
"self",
".",
"_messageEngine",
"(",
"GameObject",
".",
"CATEGORIES_SCENE",
",",
"GameObject",
".",
"SCENE_EVENT",
",",
"self",
".",
"__str__",
"(",
")",
",",
"category",
",",
"data",
")",
"def",
"messageCutScene",
"(",
"self",
",",
"category",
",",
"data",
"=",
"None",
")",
":",
"\"\"\"\n Creates an event that is posted for the cutscene engine.\n\n :param category: String, the category of the message.\n :param data: N-Tuple, containing the data for the relevant category.\n \"\"\"",
"self",
".",
"_messageEngine",
"(",
"GameObject",
".",
"CATEGORIES_CUTSCENE",
",",
"GameObject",
".",
"CUTSCENE_EVENT",
",",
"self",
".",
"__str__",
"(",
")",
",",
"category",
",",
"data",
")",
"def",
"_messageEngine",
"(",
"self",
",",
"CATEGORIES",
",",
"EVENT",
",",
"sender",
",",
"category",
",",
"data",
"=",
"None",
")",
":",
"\"\"\"\n Creates an event that is posted to an engine.\n\n :param CATEGORIES: List, containing strings of valid categories.\n :param EVENT: pygame.event, the event that the engine handles.\n :param sender: String, the sender of the message.\n :param category: String, the category of the message.\n :param data: N-Tuple, containing the data for the relevant category.\n \"\"\"",
"if",
"category",
"not",
"in",
"CATEGORIES",
":",
"raise",
"KeyError",
"(",
"\"'{}' is an invalid category! The categories allowed \"",
"\"are {}!\"",
".",
"format",
"(",
"category",
",",
"CATEGORIES",
")",
")",
"contents",
"=",
"{",
"\"sender\"",
":",
"sender",
",",
"\"category\"",
":",
"category",
",",
"\"data\"",
":",
"data",
"}",
"message",
"=",
"pg",
".",
"event",
".",
"Event",
"(",
"EVENT",
",",
"contents",
")",
"pg",
".",
"event",
".",
"post",
"(",
"message",
")"
] | The base class for all other classes. | [
"The",
"base",
"class",
"for",
"all",
"other",
"classes",
"."
] | [
"\"\"\"\n The base class for all other classes.\n \"\"\"",
"\"\"\"\n Handles the given event.\n\n :param event: pygame.Event, allowing event-driven programming.\n \"\"\"",
"\"\"\"\n Updates the logic of the game object every game tick.\n \"\"\"",
"\"\"\"\n Renders the game object to the screen every game tick.\n \"\"\"",
"\"\"\"\n Creates an event that is posted for the menu engine.\n\n :param category: String, the category of the message.\n :param data: N-Tuple, containing the data for the relevant category.\n \"\"\"",
"\"\"\"\n Creates an event that is posted for the scene engine.\n\n :param sender: String, the sender of the message.\n :param category: String, the category of the message.\n :param data: N-Tuple, containing the data for the relevant category.\n \"\"\"",
"\"\"\"\n Creates an event that is posted for the cutscene engine.\n\n :param category: String, the category of the message.\n :param data: N-Tuple, containing the data for the relevant category.\n \"\"\"",
"\"\"\"\n Creates an event that is posted to an engine.\n\n :param CATEGORIES: List, containing strings of valid categories.\n :param EVENT: pygame.event, the event that the engine handles.\n :param sender: String, the sender of the message.\n :param category: String, the category of the message.\n :param data: N-Tuple, containing the data for the relevant category.\n \"\"\""
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
780191505bf40f3e172386d4238c7acea61b8ff1 | gustavo-bordin/scrapy | scrapy/core/http2/stream.py | [
"BSD-3-Clause"
] | Python | Stream | Represents a single HTTP/2 Stream.
Stream is a bidirectional flow of bytes within an established connection,
which may carry one or more messages. Handles the transfer of HTTP Headers
and Data frames.
Role of this class is to
1. Combine all the data frames
| Represents a single HTTP/2 Stream.
Stream is a bidirectional flow of bytes within an established connection,
which may carry one or more messages. Handles the transfer of HTTP Headers
and Data frames.
Role of this class is to
1. Combine all the data frames | [
"Represents",
"a",
"single",
"HTTP",
"/",
"2",
"Stream",
".",
"Stream",
"is",
"a",
"bidirectional",
"flow",
"of",
"bytes",
"within",
"an",
"established",
"connection",
"which",
"may",
"carry",
"one",
"or",
"more",
"messages",
".",
"Handles",
"the",
"transfer",
"of",
"HTTP",
"Headers",
"and",
"Data",
"frames",
".",
"Role",
"of",
"this",
"class",
"is",
"to",
"1",
".",
"Combine",
"all",
"the",
"data",
"frames"
] | class Stream:
"""Represents a single HTTP/2 Stream.
Stream is a bidirectional flow of bytes within an established connection,
which may carry one or more messages. Handles the transfer of HTTP Headers
and Data frames.
Role of this class is to
1. Combine all the data frames
"""
def __init__(
self,
stream_id: int,
request: Request,
protocol: "H2ClientProtocol",
download_maxsize: int = 0,
download_warnsize: int = 0,
) -> None:
"""
Arguments:
stream_id -- Unique identifier for the stream within a single HTTP/2 connection
request -- The HTTP request associated to the stream
protocol -- Parent H2ClientProtocol instance
"""
self.stream_id: int = stream_id
self._request: Request = request
self._protocol: "H2ClientProtocol" = protocol
self._download_maxsize = self._request.meta.get('download_maxsize', download_maxsize)
self._download_warnsize = self._request.meta.get('download_warnsize', download_warnsize)
# Metadata of an HTTP/2 connection stream
# initialized when stream is instantiated
self.metadata: Dict = {
'request_content_length': 0 if self._request.body is None else len(self._request.body),
# Flag to keep track whether the stream has initiated the request
'request_sent': False,
# Flag to track whether we have logged about exceeding download warnsize
'reached_warnsize': False,
# Each time we send a data frame, we will decrease value by the amount send.
'remaining_content_length': 0 if self._request.body is None else len(self._request.body),
# Flag to keep track whether client (self) have closed this stream
'stream_closed_local': False,
# Flag to keep track whether the server has closed the stream
'stream_closed_server': False,
}
# Private variable used to build the response
# this response is then converted to appropriate Response class
# passed to the response deferred callback
self._response: Dict = {
# Data received frame by frame from the server is appended
# and passed to the response Deferred when completely received.
'body': BytesIO(),
# The amount of data received that counts against the
# flow control window
'flow_controlled_size': 0,
# Headers received after sending the request
'headers': Headers({}),
}
def _cancel(_) -> None:
# Close this stream as gracefully as possible
# If the associated request is initiated we reset this stream
# else we directly call close() method
if self.metadata['request_sent']:
self.reset_stream(StreamCloseReason.CANCELLED)
else:
self.close(StreamCloseReason.CANCELLED)
self._deferred_response = Deferred(_cancel)
def __repr__(self):
return f'Stream(id={self.stream_id!r})'
@property
def _log_warnsize(self) -> bool:
"""Checks if we have received data which exceeds the download warnsize
and whether we have not already logged about it.
Returns:
True if both the above conditions hold true
False if any of the conditions is false
"""
content_length_header = int(self._response['headers'].get(b'Content-Length', -1))
return (
self._download_warnsize
and (
self._response['flow_controlled_size'] > self._download_warnsize
or content_length_header > self._download_warnsize
)
and not self.metadata['reached_warnsize']
)
def get_response(self) -> Deferred:
"""Simply return a Deferred which fires when response
from the asynchronous request is available
"""
return self._deferred_response
def check_request_url(self) -> bool:
# Make sure that we are sending the request to the correct URL
url = urlparse(self._request.url)
return (
url.netloc == str(self._protocol.metadata['uri'].host, 'utf-8')
or url.netloc == str(self._protocol.metadata['uri'].netloc, 'utf-8')
or url.netloc == f'{self._protocol.metadata["ip_address"]}:{self._protocol.metadata["uri"].port}'
)
def _get_request_headers(self) -> List[Tuple[str, str]]:
url = urlparse(self._request.url)
path = url.path
if url.query:
path += '?' + url.query
# This pseudo-header field MUST NOT be empty for "http" or "https"
# URIs; "http" or "https" URIs that do not contain a path component
# MUST include a value of '/'. The exception to this rule is an
# OPTIONS request for an "http" or "https" URI that does not include
# a path component; these MUST include a ":path" pseudo-header field
# with a value of '*' (refer RFC 7540 - Section 8.1.2.3)
if not path:
path = '*' if self._request.method == 'OPTIONS' else '/'
# Make sure pseudo-headers comes before all the other headers
headers = [
(':method', self._request.method),
(':authority', url.netloc),
]
# The ":scheme" and ":path" pseudo-header fields MUST
# be omitted for CONNECT method (refer RFC 7540 - Section 8.3)
if self._request.method != 'CONNECT':
headers += [
(':scheme', self._protocol.metadata['uri'].scheme),
(':path', path),
]
content_length = str(len(self._request.body))
headers.append(('Content-Length', content_length))
content_length_name = self._request.headers.normkey(b'Content-Length')
for name, values in self._request.headers.items():
for value in values:
value = str(value, 'utf-8')
if name == content_length_name:
if value != content_length:
logger.warning(
'Ignoring bad Content-Length header %r of request %r, '
'sending %r instead',
value,
self._request,
content_length,
)
continue
headers.append((str(name, 'utf-8'), value))
return headers
def initiate_request(self) -> None:
if self.check_request_url():
headers = self._get_request_headers()
self._protocol.conn.send_headers(self.stream_id, headers, end_stream=False)
self.metadata['request_sent'] = True
self.send_data()
else:
# Close this stream calling the response errback
# Note that we have not sent any headers
self.close(StreamCloseReason.INVALID_HOSTNAME)
def send_data(self) -> None:
"""Called immediately after the headers are sent. Here we send all the
data as part of the request.
If the content length is 0 initially then we end the stream immediately and
wait for response data.
Warning: Only call this method when stream not closed from client side
and has initiated request already by sending HEADER frame. If not then
stream will raise ProtocolError (raise by h2 state machine).
"""
if self.metadata['stream_closed_local']:
raise StreamClosedError(self.stream_id)
# Firstly, check what the flow control window is for current stream.
window_size = self._protocol.conn.local_flow_control_window(stream_id=self.stream_id)
# Next, check what the maximum frame size is.
max_frame_size = self._protocol.conn.max_outbound_frame_size
# We will send no more than the window size or the remaining file size
# of data in this call, whichever is smaller.
bytes_to_send_size = min(window_size, self.metadata['remaining_content_length'])
# We now need to send a number of data frames.
while bytes_to_send_size > 0:
chunk_size = min(bytes_to_send_size, max_frame_size)
data_chunk_start_id = self.metadata['request_content_length'] - self.metadata['remaining_content_length']
data_chunk = self._request.body[data_chunk_start_id:data_chunk_start_id + chunk_size]
self._protocol.conn.send_data(self.stream_id, data_chunk, end_stream=False)
bytes_to_send_size = bytes_to_send_size - chunk_size
self.metadata['remaining_content_length'] = self.metadata['remaining_content_length'] - chunk_size
self.metadata['remaining_content_length'] = max(0, self.metadata['remaining_content_length'])
# End the stream if no more data needs to be send
if self.metadata['remaining_content_length'] == 0:
self._protocol.conn.end_stream(self.stream_id)
# Q. What about the rest of the data?
# Ans: Remaining Data frames will be sent when we get a WindowUpdate frame
def receive_window_update(self) -> None:
"""Flow control window size was changed.
Send data that earlier could not be sent as we were
blocked behind the flow control.
"""
if (
self.metadata['remaining_content_length']
and not self.metadata['stream_closed_server']
and self.metadata['request_sent']
):
self.send_data()
def receive_data(self, data: bytes, flow_controlled_length: int) -> None:
self._response['body'].write(data)
self._response['flow_controlled_size'] += flow_controlled_length
# We check maxsize here in case the Content-Length header was not received
if self._download_maxsize and self._response['flow_controlled_size'] > self._download_maxsize:
self.reset_stream(StreamCloseReason.MAXSIZE_EXCEEDED)
return
if self._log_warnsize:
self.metadata['reached_warnsize'] = True
warning_msg = (
f'Received more ({self._response["flow_controlled_size"]}) bytes than download '
f'warn size ({self._download_warnsize}) in request {self._request}'
)
logger.warning(warning_msg)
# Acknowledge the data received
self._protocol.conn.acknowledge_received_data(
self._response['flow_controlled_size'],
self.stream_id
)
def receive_headers(self, headers: List[HeaderTuple]) -> None:
for name, value in headers:
self._response['headers'][name] = value
# Check if we exceed the allowed max data size which can be received
expected_size = int(self._response['headers'].get(b'Content-Length', -1))
if self._download_maxsize and expected_size > self._download_maxsize:
self.reset_stream(StreamCloseReason.MAXSIZE_EXCEEDED)
return
if self._log_warnsize:
self.metadata['reached_warnsize'] = True
warning_msg = (
f'Expected response size ({expected_size}) larger than '
f'download warn size ({self._download_warnsize}) in request {self._request}'
)
logger.warning(warning_msg)
def reset_stream(self, reason: StreamCloseReason = StreamCloseReason.RESET) -> None:
"""Close this stream by sending a RST_FRAME to the remote peer"""
if self.metadata['stream_closed_local']:
raise StreamClosedError(self.stream_id)
# Clear buffer earlier to avoid keeping data in memory for a long time
self._response['body'].truncate(0)
self.metadata['stream_closed_local'] = True
self._protocol.conn.reset_stream(self.stream_id, ErrorCodes.REFUSED_STREAM)
self.close(reason)
def close(
self,
reason: StreamCloseReason,
errors: Optional[List[BaseException]] = None,
from_protocol: bool = False,
) -> None:
"""Based on the reason sent we will handle each case.
"""
if self.metadata['stream_closed_server']:
raise StreamClosedError(self.stream_id)
if not isinstance(reason, StreamCloseReason):
raise TypeError(f'Expected StreamCloseReason, received {reason.__class__.__qualname__}')
# Have default value of errors as an empty list as
# some cases can add a list of exceptions
errors = errors or []
if not from_protocol:
self._protocol.pop_stream(self.stream_id)
self.metadata['stream_closed_server'] = True
# We do not check for Content-Length or Transfer-Encoding in response headers
# and add `partial` flag as in HTTP/1.1 as 'A request or response that includes
# a payload body can include a content-length header field' (RFC 7540 - Section 8.1.2.6)
# NOTE: Order of handling the events is important here
# As we immediately cancel the request when maxsize is exceeded while
# receiving DATA_FRAME's when we have received the headers (not
# having Content-Length)
if reason is StreamCloseReason.MAXSIZE_EXCEEDED:
expected_size = int(self._response['headers'].get(
b'Content-Length',
self._response['flow_controlled_size'])
)
error_msg = (
f'Cancelling download of {self._request.url}: received response '
f'size ({expected_size}) larger than download max size ({self._download_maxsize})'
)
logger.error(error_msg)
self._deferred_response.errback(CancelledError(error_msg))
elif reason is StreamCloseReason.ENDED:
self._fire_response_deferred()
# Stream was abruptly ended here
elif reason is StreamCloseReason.CANCELLED:
# Client has cancelled the request. Remove all the data
# received and fire the response deferred with no flags set
# NOTE: The data is already flushed in Stream.reset_stream() called
# immediately when the stream needs to be cancelled
# There maybe no :status in headers, we make
# HTTP Status Code: 499 - Client Closed Request
self._response['headers'][':status'] = '499'
self._fire_response_deferred()
elif reason is StreamCloseReason.RESET:
self._deferred_response.errback(ResponseFailed([
Failure(
f'Remote peer {self._protocol.metadata["ip_address"]} sent RST_STREAM',
ProtocolError
)
]))
elif reason is StreamCloseReason.CONNECTION_LOST:
self._deferred_response.errback(ResponseFailed(errors))
elif reason is StreamCloseReason.INACTIVE:
errors.insert(0, InactiveStreamClosed(self._request))
self._deferred_response.errback(ResponseFailed(errors))
else:
assert reason is StreamCloseReason.INVALID_HOSTNAME
self._deferred_response.errback(InvalidHostname(
self._request,
str(self._protocol.metadata['uri'].host, 'utf-8'),
f'{self._protocol.metadata["ip_address"]}:{self._protocol.metadata["uri"].port}'
))
def _fire_response_deferred(self) -> None:
"""Builds response from the self._response dict
and fires the response deferred callback with the
generated response instance"""
body = self._response['body'].getvalue()
response_cls = responsetypes.from_args(
headers=self._response['headers'],
url=self._request.url,
body=body,
)
response = response_cls(
url=self._request.url,
status=int(self._response['headers'][':status']),
headers=self._response['headers'],
body=body,
request=self._request,
certificate=self._protocol.metadata['certificate'],
ip_address=self._protocol.metadata['ip_address'],
protocol='h2',
)
self._deferred_response.callback(response) | [
"class",
"Stream",
":",
"def",
"__init__",
"(",
"self",
",",
"stream_id",
":",
"int",
",",
"request",
":",
"Request",
",",
"protocol",
":",
"\"H2ClientProtocol\"",
",",
"download_maxsize",
":",
"int",
"=",
"0",
",",
"download_warnsize",
":",
"int",
"=",
"0",
",",
")",
"->",
"None",
":",
"\"\"\"\n Arguments:\n stream_id -- Unique identifier for the stream within a single HTTP/2 connection\n request -- The HTTP request associated to the stream\n protocol -- Parent H2ClientProtocol instance\n \"\"\"",
"self",
".",
"stream_id",
":",
"int",
"=",
"stream_id",
"self",
".",
"_request",
":",
"Request",
"=",
"request",
"self",
".",
"_protocol",
":",
"\"H2ClientProtocol\"",
"=",
"protocol",
"self",
".",
"_download_maxsize",
"=",
"self",
".",
"_request",
".",
"meta",
".",
"get",
"(",
"'download_maxsize'",
",",
"download_maxsize",
")",
"self",
".",
"_download_warnsize",
"=",
"self",
".",
"_request",
".",
"meta",
".",
"get",
"(",
"'download_warnsize'",
",",
"download_warnsize",
")",
"self",
".",
"metadata",
":",
"Dict",
"=",
"{",
"'request_content_length'",
":",
"0",
"if",
"self",
".",
"_request",
".",
"body",
"is",
"None",
"else",
"len",
"(",
"self",
".",
"_request",
".",
"body",
")",
",",
"'request_sent'",
":",
"False",
",",
"'reached_warnsize'",
":",
"False",
",",
"'remaining_content_length'",
":",
"0",
"if",
"self",
".",
"_request",
".",
"body",
"is",
"None",
"else",
"len",
"(",
"self",
".",
"_request",
".",
"body",
")",
",",
"'stream_closed_local'",
":",
"False",
",",
"'stream_closed_server'",
":",
"False",
",",
"}",
"self",
".",
"_response",
":",
"Dict",
"=",
"{",
"'body'",
":",
"BytesIO",
"(",
")",
",",
"'flow_controlled_size'",
":",
"0",
",",
"'headers'",
":",
"Headers",
"(",
"{",
"}",
")",
",",
"}",
"def",
"_cancel",
"(",
"_",
")",
"->",
"None",
":",
"if",
"self",
".",
"metadata",
"[",
"'request_sent'",
"]",
":",
"self",
".",
"reset_stream",
"(",
"StreamCloseReason",
".",
"CANCELLED",
")",
"else",
":",
"self",
".",
"close",
"(",
"StreamCloseReason",
".",
"CANCELLED",
")",
"self",
".",
"_deferred_response",
"=",
"Deferred",
"(",
"_cancel",
")",
"def",
"__repr__",
"(",
"self",
")",
":",
"return",
"f'Stream(id={self.stream_id!r})'",
"@",
"property",
"def",
"_log_warnsize",
"(",
"self",
")",
"->",
"bool",
":",
"\"\"\"Checks if we have received data which exceeds the download warnsize\n and whether we have not already logged about it.\n\n Returns:\n True if both the above conditions hold true\n False if any of the conditions is false\n \"\"\"",
"content_length_header",
"=",
"int",
"(",
"self",
".",
"_response",
"[",
"'headers'",
"]",
".",
"get",
"(",
"b'Content-Length'",
",",
"-",
"1",
")",
")",
"return",
"(",
"self",
".",
"_download_warnsize",
"and",
"(",
"self",
".",
"_response",
"[",
"'flow_controlled_size'",
"]",
">",
"self",
".",
"_download_warnsize",
"or",
"content_length_header",
">",
"self",
".",
"_download_warnsize",
")",
"and",
"not",
"self",
".",
"metadata",
"[",
"'reached_warnsize'",
"]",
")",
"def",
"get_response",
"(",
"self",
")",
"->",
"Deferred",
":",
"\"\"\"Simply return a Deferred which fires when response\n from the asynchronous request is available\n \"\"\"",
"return",
"self",
".",
"_deferred_response",
"def",
"check_request_url",
"(",
"self",
")",
"->",
"bool",
":",
"url",
"=",
"urlparse",
"(",
"self",
".",
"_request",
".",
"url",
")",
"return",
"(",
"url",
".",
"netloc",
"==",
"str",
"(",
"self",
".",
"_protocol",
".",
"metadata",
"[",
"'uri'",
"]",
".",
"host",
",",
"'utf-8'",
")",
"or",
"url",
".",
"netloc",
"==",
"str",
"(",
"self",
".",
"_protocol",
".",
"metadata",
"[",
"'uri'",
"]",
".",
"netloc",
",",
"'utf-8'",
")",
"or",
"url",
".",
"netloc",
"==",
"f'{self._protocol.metadata[\"ip_address\"]}:{self._protocol.metadata[\"uri\"].port}'",
")",
"def",
"_get_request_headers",
"(",
"self",
")",
"->",
"List",
"[",
"Tuple",
"[",
"str",
",",
"str",
"]",
"]",
":",
"url",
"=",
"urlparse",
"(",
"self",
".",
"_request",
".",
"url",
")",
"path",
"=",
"url",
".",
"path",
"if",
"url",
".",
"query",
":",
"path",
"+=",
"'?'",
"+",
"url",
".",
"query",
"if",
"not",
"path",
":",
"path",
"=",
"'*'",
"if",
"self",
".",
"_request",
".",
"method",
"==",
"'OPTIONS'",
"else",
"'/'",
"headers",
"=",
"[",
"(",
"':method'",
",",
"self",
".",
"_request",
".",
"method",
")",
",",
"(",
"':authority'",
",",
"url",
".",
"netloc",
")",
",",
"]",
"if",
"self",
".",
"_request",
".",
"method",
"!=",
"'CONNECT'",
":",
"headers",
"+=",
"[",
"(",
"':scheme'",
",",
"self",
".",
"_protocol",
".",
"metadata",
"[",
"'uri'",
"]",
".",
"scheme",
")",
",",
"(",
"':path'",
",",
"path",
")",
",",
"]",
"content_length",
"=",
"str",
"(",
"len",
"(",
"self",
".",
"_request",
".",
"body",
")",
")",
"headers",
".",
"append",
"(",
"(",
"'Content-Length'",
",",
"content_length",
")",
")",
"content_length_name",
"=",
"self",
".",
"_request",
".",
"headers",
".",
"normkey",
"(",
"b'Content-Length'",
")",
"for",
"name",
",",
"values",
"in",
"self",
".",
"_request",
".",
"headers",
".",
"items",
"(",
")",
":",
"for",
"value",
"in",
"values",
":",
"value",
"=",
"str",
"(",
"value",
",",
"'utf-8'",
")",
"if",
"name",
"==",
"content_length_name",
":",
"if",
"value",
"!=",
"content_length",
":",
"logger",
".",
"warning",
"(",
"'Ignoring bad Content-Length header %r of request %r, '",
"'sending %r instead'",
",",
"value",
",",
"self",
".",
"_request",
",",
"content_length",
",",
")",
"continue",
"headers",
".",
"append",
"(",
"(",
"str",
"(",
"name",
",",
"'utf-8'",
")",
",",
"value",
")",
")",
"return",
"headers",
"def",
"initiate_request",
"(",
"self",
")",
"->",
"None",
":",
"if",
"self",
".",
"check_request_url",
"(",
")",
":",
"headers",
"=",
"self",
".",
"_get_request_headers",
"(",
")",
"self",
".",
"_protocol",
".",
"conn",
".",
"send_headers",
"(",
"self",
".",
"stream_id",
",",
"headers",
",",
"end_stream",
"=",
"False",
")",
"self",
".",
"metadata",
"[",
"'request_sent'",
"]",
"=",
"True",
"self",
".",
"send_data",
"(",
")",
"else",
":",
"self",
".",
"close",
"(",
"StreamCloseReason",
".",
"INVALID_HOSTNAME",
")",
"def",
"send_data",
"(",
"self",
")",
"->",
"None",
":",
"\"\"\"Called immediately after the headers are sent. Here we send all the\n data as part of the request.\n\n If the content length is 0 initially then we end the stream immediately and\n wait for response data.\n\n Warning: Only call this method when stream not closed from client side\n and has initiated request already by sending HEADER frame. If not then\n stream will raise ProtocolError (raise by h2 state machine).\n \"\"\"",
"if",
"self",
".",
"metadata",
"[",
"'stream_closed_local'",
"]",
":",
"raise",
"StreamClosedError",
"(",
"self",
".",
"stream_id",
")",
"window_size",
"=",
"self",
".",
"_protocol",
".",
"conn",
".",
"local_flow_control_window",
"(",
"stream_id",
"=",
"self",
".",
"stream_id",
")",
"max_frame_size",
"=",
"self",
".",
"_protocol",
".",
"conn",
".",
"max_outbound_frame_size",
"bytes_to_send_size",
"=",
"min",
"(",
"window_size",
",",
"self",
".",
"metadata",
"[",
"'remaining_content_length'",
"]",
")",
"while",
"bytes_to_send_size",
">",
"0",
":",
"chunk_size",
"=",
"min",
"(",
"bytes_to_send_size",
",",
"max_frame_size",
")",
"data_chunk_start_id",
"=",
"self",
".",
"metadata",
"[",
"'request_content_length'",
"]",
"-",
"self",
".",
"metadata",
"[",
"'remaining_content_length'",
"]",
"data_chunk",
"=",
"self",
".",
"_request",
".",
"body",
"[",
"data_chunk_start_id",
":",
"data_chunk_start_id",
"+",
"chunk_size",
"]",
"self",
".",
"_protocol",
".",
"conn",
".",
"send_data",
"(",
"self",
".",
"stream_id",
",",
"data_chunk",
",",
"end_stream",
"=",
"False",
")",
"bytes_to_send_size",
"=",
"bytes_to_send_size",
"-",
"chunk_size",
"self",
".",
"metadata",
"[",
"'remaining_content_length'",
"]",
"=",
"self",
".",
"metadata",
"[",
"'remaining_content_length'",
"]",
"-",
"chunk_size",
"self",
".",
"metadata",
"[",
"'remaining_content_length'",
"]",
"=",
"max",
"(",
"0",
",",
"self",
".",
"metadata",
"[",
"'remaining_content_length'",
"]",
")",
"if",
"self",
".",
"metadata",
"[",
"'remaining_content_length'",
"]",
"==",
"0",
":",
"self",
".",
"_protocol",
".",
"conn",
".",
"end_stream",
"(",
"self",
".",
"stream_id",
")",
"def",
"receive_window_update",
"(",
"self",
")",
"->",
"None",
":",
"\"\"\"Flow control window size was changed.\n Send data that earlier could not be sent as we were\n blocked behind the flow control.\n \"\"\"",
"if",
"(",
"self",
".",
"metadata",
"[",
"'remaining_content_length'",
"]",
"and",
"not",
"self",
".",
"metadata",
"[",
"'stream_closed_server'",
"]",
"and",
"self",
".",
"metadata",
"[",
"'request_sent'",
"]",
")",
":",
"self",
".",
"send_data",
"(",
")",
"def",
"receive_data",
"(",
"self",
",",
"data",
":",
"bytes",
",",
"flow_controlled_length",
":",
"int",
")",
"->",
"None",
":",
"self",
".",
"_response",
"[",
"'body'",
"]",
".",
"write",
"(",
"data",
")",
"self",
".",
"_response",
"[",
"'flow_controlled_size'",
"]",
"+=",
"flow_controlled_length",
"if",
"self",
".",
"_download_maxsize",
"and",
"self",
".",
"_response",
"[",
"'flow_controlled_size'",
"]",
">",
"self",
".",
"_download_maxsize",
":",
"self",
".",
"reset_stream",
"(",
"StreamCloseReason",
".",
"MAXSIZE_EXCEEDED",
")",
"return",
"if",
"self",
".",
"_log_warnsize",
":",
"self",
".",
"metadata",
"[",
"'reached_warnsize'",
"]",
"=",
"True",
"warning_msg",
"=",
"(",
"f'Received more ({self._response[\"flow_controlled_size\"]}) bytes than download '",
"f'warn size ({self._download_warnsize}) in request {self._request}'",
")",
"logger",
".",
"warning",
"(",
"warning_msg",
")",
"self",
".",
"_protocol",
".",
"conn",
".",
"acknowledge_received_data",
"(",
"self",
".",
"_response",
"[",
"'flow_controlled_size'",
"]",
",",
"self",
".",
"stream_id",
")",
"def",
"receive_headers",
"(",
"self",
",",
"headers",
":",
"List",
"[",
"HeaderTuple",
"]",
")",
"->",
"None",
":",
"for",
"name",
",",
"value",
"in",
"headers",
":",
"self",
".",
"_response",
"[",
"'headers'",
"]",
"[",
"name",
"]",
"=",
"value",
"expected_size",
"=",
"int",
"(",
"self",
".",
"_response",
"[",
"'headers'",
"]",
".",
"get",
"(",
"b'Content-Length'",
",",
"-",
"1",
")",
")",
"if",
"self",
".",
"_download_maxsize",
"and",
"expected_size",
">",
"self",
".",
"_download_maxsize",
":",
"self",
".",
"reset_stream",
"(",
"StreamCloseReason",
".",
"MAXSIZE_EXCEEDED",
")",
"return",
"if",
"self",
".",
"_log_warnsize",
":",
"self",
".",
"metadata",
"[",
"'reached_warnsize'",
"]",
"=",
"True",
"warning_msg",
"=",
"(",
"f'Expected response size ({expected_size}) larger than '",
"f'download warn size ({self._download_warnsize}) in request {self._request}'",
")",
"logger",
".",
"warning",
"(",
"warning_msg",
")",
"def",
"reset_stream",
"(",
"self",
",",
"reason",
":",
"StreamCloseReason",
"=",
"StreamCloseReason",
".",
"RESET",
")",
"->",
"None",
":",
"\"\"\"Close this stream by sending a RST_FRAME to the remote peer\"\"\"",
"if",
"self",
".",
"metadata",
"[",
"'stream_closed_local'",
"]",
":",
"raise",
"StreamClosedError",
"(",
"self",
".",
"stream_id",
")",
"self",
".",
"_response",
"[",
"'body'",
"]",
".",
"truncate",
"(",
"0",
")",
"self",
".",
"metadata",
"[",
"'stream_closed_local'",
"]",
"=",
"True",
"self",
".",
"_protocol",
".",
"conn",
".",
"reset_stream",
"(",
"self",
".",
"stream_id",
",",
"ErrorCodes",
".",
"REFUSED_STREAM",
")",
"self",
".",
"close",
"(",
"reason",
")",
"def",
"close",
"(",
"self",
",",
"reason",
":",
"StreamCloseReason",
",",
"errors",
":",
"Optional",
"[",
"List",
"[",
"BaseException",
"]",
"]",
"=",
"None",
",",
"from_protocol",
":",
"bool",
"=",
"False",
",",
")",
"->",
"None",
":",
"\"\"\"Based on the reason sent we will handle each case.\n \"\"\"",
"if",
"self",
".",
"metadata",
"[",
"'stream_closed_server'",
"]",
":",
"raise",
"StreamClosedError",
"(",
"self",
".",
"stream_id",
")",
"if",
"not",
"isinstance",
"(",
"reason",
",",
"StreamCloseReason",
")",
":",
"raise",
"TypeError",
"(",
"f'Expected StreamCloseReason, received {reason.__class__.__qualname__}'",
")",
"errors",
"=",
"errors",
"or",
"[",
"]",
"if",
"not",
"from_protocol",
":",
"self",
".",
"_protocol",
".",
"pop_stream",
"(",
"self",
".",
"stream_id",
")",
"self",
".",
"metadata",
"[",
"'stream_closed_server'",
"]",
"=",
"True",
"if",
"reason",
"is",
"StreamCloseReason",
".",
"MAXSIZE_EXCEEDED",
":",
"expected_size",
"=",
"int",
"(",
"self",
".",
"_response",
"[",
"'headers'",
"]",
".",
"get",
"(",
"b'Content-Length'",
",",
"self",
".",
"_response",
"[",
"'flow_controlled_size'",
"]",
")",
")",
"error_msg",
"=",
"(",
"f'Cancelling download of {self._request.url}: received response '",
"f'size ({expected_size}) larger than download max size ({self._download_maxsize})'",
")",
"logger",
".",
"error",
"(",
"error_msg",
")",
"self",
".",
"_deferred_response",
".",
"errback",
"(",
"CancelledError",
"(",
"error_msg",
")",
")",
"elif",
"reason",
"is",
"StreamCloseReason",
".",
"ENDED",
":",
"self",
".",
"_fire_response_deferred",
"(",
")",
"elif",
"reason",
"is",
"StreamCloseReason",
".",
"CANCELLED",
":",
"self",
".",
"_response",
"[",
"'headers'",
"]",
"[",
"':status'",
"]",
"=",
"'499'",
"self",
".",
"_fire_response_deferred",
"(",
")",
"elif",
"reason",
"is",
"StreamCloseReason",
".",
"RESET",
":",
"self",
".",
"_deferred_response",
".",
"errback",
"(",
"ResponseFailed",
"(",
"[",
"Failure",
"(",
"f'Remote peer {self._protocol.metadata[\"ip_address\"]} sent RST_STREAM'",
",",
"ProtocolError",
")",
"]",
")",
")",
"elif",
"reason",
"is",
"StreamCloseReason",
".",
"CONNECTION_LOST",
":",
"self",
".",
"_deferred_response",
".",
"errback",
"(",
"ResponseFailed",
"(",
"errors",
")",
")",
"elif",
"reason",
"is",
"StreamCloseReason",
".",
"INACTIVE",
":",
"errors",
".",
"insert",
"(",
"0",
",",
"InactiveStreamClosed",
"(",
"self",
".",
"_request",
")",
")",
"self",
".",
"_deferred_response",
".",
"errback",
"(",
"ResponseFailed",
"(",
"errors",
")",
")",
"else",
":",
"assert",
"reason",
"is",
"StreamCloseReason",
".",
"INVALID_HOSTNAME",
"self",
".",
"_deferred_response",
".",
"errback",
"(",
"InvalidHostname",
"(",
"self",
".",
"_request",
",",
"str",
"(",
"self",
".",
"_protocol",
".",
"metadata",
"[",
"'uri'",
"]",
".",
"host",
",",
"'utf-8'",
")",
",",
"f'{self._protocol.metadata[\"ip_address\"]}:{self._protocol.metadata[\"uri\"].port}'",
")",
")",
"def",
"_fire_response_deferred",
"(",
"self",
")",
"->",
"None",
":",
"\"\"\"Builds response from the self._response dict\n and fires the response deferred callback with the\n generated response instance\"\"\"",
"body",
"=",
"self",
".",
"_response",
"[",
"'body'",
"]",
".",
"getvalue",
"(",
")",
"response_cls",
"=",
"responsetypes",
".",
"from_args",
"(",
"headers",
"=",
"self",
".",
"_response",
"[",
"'headers'",
"]",
",",
"url",
"=",
"self",
".",
"_request",
".",
"url",
",",
"body",
"=",
"body",
",",
")",
"response",
"=",
"response_cls",
"(",
"url",
"=",
"self",
".",
"_request",
".",
"url",
",",
"status",
"=",
"int",
"(",
"self",
".",
"_response",
"[",
"'headers'",
"]",
"[",
"':status'",
"]",
")",
",",
"headers",
"=",
"self",
".",
"_response",
"[",
"'headers'",
"]",
",",
"body",
"=",
"body",
",",
"request",
"=",
"self",
".",
"_request",
",",
"certificate",
"=",
"self",
".",
"_protocol",
".",
"metadata",
"[",
"'certificate'",
"]",
",",
"ip_address",
"=",
"self",
".",
"_protocol",
".",
"metadata",
"[",
"'ip_address'",
"]",
",",
"protocol",
"=",
"'h2'",
",",
")",
"self",
".",
"_deferred_response",
".",
"callback",
"(",
"response",
")"
] | Represents a single HTTP/2 Stream. | [
"Represents",
"a",
"single",
"HTTP",
"/",
"2",
"Stream",
"."
] | [
"\"\"\"Represents a single HTTP/2 Stream.\n\n Stream is a bidirectional flow of bytes within an established connection,\n which may carry one or more messages. Handles the transfer of HTTP Headers\n and Data frames.\n\n Role of this class is to\n 1. Combine all the data frames\n \"\"\"",
"\"\"\"\n Arguments:\n stream_id -- Unique identifier for the stream within a single HTTP/2 connection\n request -- The HTTP request associated to the stream\n protocol -- Parent H2ClientProtocol instance\n \"\"\"",
"# Metadata of an HTTP/2 connection stream",
"# initialized when stream is instantiated",
"# Flag to keep track whether the stream has initiated the request",
"# Flag to track whether we have logged about exceeding download warnsize",
"# Each time we send a data frame, we will decrease value by the amount send.",
"# Flag to keep track whether client (self) have closed this stream",
"# Flag to keep track whether the server has closed the stream",
"# Private variable used to build the response",
"# this response is then converted to appropriate Response class",
"# passed to the response deferred callback",
"# Data received frame by frame from the server is appended",
"# and passed to the response Deferred when completely received.",
"# The amount of data received that counts against the",
"# flow control window",
"# Headers received after sending the request",
"# Close this stream as gracefully as possible",
"# If the associated request is initiated we reset this stream",
"# else we directly call close() method",
"\"\"\"Checks if we have received data which exceeds the download warnsize\n and whether we have not already logged about it.\n\n Returns:\n True if both the above conditions hold true\n False if any of the conditions is false\n \"\"\"",
"\"\"\"Simply return a Deferred which fires when response\n from the asynchronous request is available\n \"\"\"",
"# Make sure that we are sending the request to the correct URL",
"# This pseudo-header field MUST NOT be empty for \"http\" or \"https\"",
"# URIs; \"http\" or \"https\" URIs that do not contain a path component",
"# MUST include a value of '/'. The exception to this rule is an",
"# OPTIONS request for an \"http\" or \"https\" URI that does not include",
"# a path component; these MUST include a \":path\" pseudo-header field",
"# with a value of '*' (refer RFC 7540 - Section 8.1.2.3)",
"# Make sure pseudo-headers comes before all the other headers",
"# The \":scheme\" and \":path\" pseudo-header fields MUST",
"# be omitted for CONNECT method (refer RFC 7540 - Section 8.3)",
"# Close this stream calling the response errback",
"# Note that we have not sent any headers",
"\"\"\"Called immediately after the headers are sent. Here we send all the\n data as part of the request.\n\n If the content length is 0 initially then we end the stream immediately and\n wait for response data.\n\n Warning: Only call this method when stream not closed from client side\n and has initiated request already by sending HEADER frame. If not then\n stream will raise ProtocolError (raise by h2 state machine).\n \"\"\"",
"# Firstly, check what the flow control window is for current stream.",
"# Next, check what the maximum frame size is.",
"# We will send no more than the window size or the remaining file size",
"# of data in this call, whichever is smaller.",
"# We now need to send a number of data frames.",
"# End the stream if no more data needs to be send",
"# Q. What about the rest of the data?",
"# Ans: Remaining Data frames will be sent when we get a WindowUpdate frame",
"\"\"\"Flow control window size was changed.\n Send data that earlier could not be sent as we were\n blocked behind the flow control.\n \"\"\"",
"# We check maxsize here in case the Content-Length header was not received",
"# Acknowledge the data received",
"# Check if we exceed the allowed max data size which can be received",
"\"\"\"Close this stream by sending a RST_FRAME to the remote peer\"\"\"",
"# Clear buffer earlier to avoid keeping data in memory for a long time",
"\"\"\"Based on the reason sent we will handle each case.\n \"\"\"",
"# Have default value of errors as an empty list as",
"# some cases can add a list of exceptions",
"# We do not check for Content-Length or Transfer-Encoding in response headers",
"# and add `partial` flag as in HTTP/1.1 as 'A request or response that includes",
"# a payload body can include a content-length header field' (RFC 7540 - Section 8.1.2.6)",
"# NOTE: Order of handling the events is important here",
"# As we immediately cancel the request when maxsize is exceeded while",
"# receiving DATA_FRAME's when we have received the headers (not",
"# having Content-Length)",
"# Stream was abruptly ended here",
"# Client has cancelled the request. Remove all the data",
"# received and fire the response deferred with no flags set",
"# NOTE: The data is already flushed in Stream.reset_stream() called",
"# immediately when the stream needs to be cancelled",
"# There maybe no :status in headers, we make",
"# HTTP Status Code: 499 - Client Closed Request",
"\"\"\"Builds response from the self._response dict\n and fires the response deferred callback with the\n generated response instance\"\"\""
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
783035b4b9b3f5d8991fd6a577b630ff9a4e1914 | PickBas/meta-social | chat/forms.py | [
"MIT"
] | Python | Meta |
Meta class. Getting fields.
| Meta class. Getting fields. | [
"Meta",
"class",
".",
"Getting",
"fields",
"."
] | class Meta:
"""
Meta class. Getting fields.
"""
model = Chat
fields = ('base_image', ) | [
"class",
"Meta",
":",
"model",
"=",
"Chat",
"fields",
"=",
"(",
"'base_image'",
",",
")"
] | Meta class. | [
"Meta",
"class",
"."
] | [
"\"\"\"\n Meta class. Getting fields.\n \"\"\""
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
783035b4b9b3f5d8991fd6a577b630ff9a4e1914 | PickBas/meta-social | chat/forms.py | [
"MIT"
] | Python | Meta |
Meta class. Getting fields.
| Meta class. Getting fields. | [
"Meta",
"class",
".",
"Getting",
"fields",
"."
] | class Meta:
"""
Meta class. Getting fields.
"""
model = MessageImages
fields = ('image',) | [
"class",
"Meta",
":",
"model",
"=",
"MessageImages",
"fields",
"=",
"(",
"'image'",
",",
")"
] | Meta class. | [
"Meta",
"class",
"."
] | [
"\"\"\"\n Meta class. Getting fields.\n \"\"\""
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
hexsha: 786957ff51f0aade241bd7bd3541659d809c48dd | repo: pwo/irrd | path: irrd/scopefilter/validators.py | license: ["BSD-2-Clause"] | language: Python | identifier: ScopeFilterValidator
docstring: The scope filter validator validates whether prefixes, ASNs or RPSL objects fall within the configured scope filter.
code:
class ScopeFilterValidator:
"""
The scope filter validator validates whether prefixes, ASNs or RPSL
objects fall within the configured scope filter.
"""
def __init__(self):
self.load_filters()
def load_filters(self):
"""
(Re)load the local cache of the configured filters.
Also called by __init__
"""
prefixes = get_setting('scopefilter.prefixes', [])
self.filtered_prefixes = [IP(prefix) for prefix in prefixes]
self.filtered_asns = set()
self.filtered_asn_ranges = set()
asn_filters = get_setting('scopefilter.asns', [])
for asn_filter in asn_filters:
if '-' in str(asn_filter):
start, end = asn_filter.split('-')
self.filtered_asn_ranges.add((int(start), int(end)))
else:
self.filtered_asns.add(int(asn_filter))
def validate(self, source: str, prefix: Optional[IP]=None, asn: Optional[int]=None) -> ScopeFilterStatus:
"""
Validate a prefix and/or ASN, for a particular source.
Returns a tuple of a ScopeFilterStatus and an explanation string.
"""
if not prefix and asn is None:
raise ValueError('Scope Filter validator must be provided asn or prefix')
if get_setting(f'sources.{source}.scopefilter_excluded'):
return ScopeFilterStatus.in_scope
if prefix:
for filtered_prefix in self.filtered_prefixes:
if prefix.version() == filtered_prefix.version() and filtered_prefix.overlaps(prefix):
return ScopeFilterStatus.out_scope_prefix
if asn is not None:
if asn in self.filtered_asns:
return ScopeFilterStatus.out_scope_as
for range_start, range_end in self.filtered_asn_ranges:
if range_start <= asn <= range_end:
return ScopeFilterStatus.out_scope_as
return ScopeFilterStatus.in_scope
def _validate_rpsl_data(self, source: str, object_class: str, prefix: Optional[IP],
asn_first: Optional[int]) -> Tuple[ScopeFilterStatus, str]:
"""
Validate whether a particular set of RPSL data is in scope.
Depending on object_class, members and mp_members are also validated.
Returns a ScopeFilterStatus.
"""
out_of_scope = [ScopeFilterStatus.out_scope_prefix, ScopeFilterStatus.out_scope_as]
if object_class not in ['route', 'route6']:
return ScopeFilterStatus.in_scope, ''
if prefix:
prefix_state = self.validate(source, prefix)
if prefix_state in out_of_scope:
return prefix_state, f'prefix {prefix} is out of scope'
if asn_first is not None:
asn_state = self.validate(source, asn=asn_first)
if asn_state in out_of_scope:
return asn_state, f'ASN {asn_first} is out of scope'
return ScopeFilterStatus.in_scope, ''
def validate_rpsl_object(self, rpsl_object: RPSLObject) -> Tuple[ScopeFilterStatus, str]:
"""
Validate whether an RPSLObject is in scope.
Returns a tuple of a ScopeFilterStatus and an explanation string.
"""
return self._validate_rpsl_data(
rpsl_object.source(),
rpsl_object.rpsl_object_class,
rpsl_object.prefix,
rpsl_object.asn_first,
)
def validate_all_rpsl_objects(self, database_handler: DatabaseHandler) -> \
Tuple[List[Dict[str, str]], List[Dict[str, str]], List[Dict[str, str]]]:
"""
Apply the scope filter to all relevant objects.
Retrieves all routes from the DB, and aggregates the validation results.
Returns a tuple of three sets:
- one with routes that should be set to status in_scope, but are not now
- one with routes that should be set to status out_scope_as, but are not now
- one with routes that should be set to status out_scope_prefix, but are not now
Each object is recorded as a dict, which has the fields shown
in "columns" below.
Objects where their current status in the DB matches the new
validation result, are not included in the return value.
"""
columns = ['rpsl_pk', 'ip_first', 'prefix_length', 'asn_first', 'source', 'object_class',
'object_text', 'scopefilter_status']
objs_changed: Dict[ScopeFilterStatus, List[Dict[str, str]]] = defaultdict(list)
q = RPSLDatabaseQuery(column_names=columns, enable_ordering=False)
q = q.object_classes(['route', 'route6'])
results = database_handler.execute_query(q)
for result in results:
current_status = result['scopefilter_status']
result['old_status'] = current_status
prefix = None
if result['ip_first']:
prefix = IP(result['ip_first'] + '/' + str(result['prefix_length']))
new_status, _ = self._validate_rpsl_data(
result['source'],
result['object_class'],
prefix,
result['asn_first'],
)
if new_status != current_status:
result['scopefilter_status'] = new_status
objs_changed[new_status].append(result)
return (objs_changed[ScopeFilterStatus.in_scope],
objs_changed[ScopeFilterStatus.out_scope_as],
                objs_changed[ScopeFilterStatus.out_scope_prefix])
short_docstring: The scope filter validator validates whether prefixes, ASNs or RPSL objects fall within the configured scope filter. | parameters: [] | docstring_params: {returns: [], raises: [], params: [], outlier_params: [], others: []}
hexsha: 789fcdd940662c0ac1e593a1b04be96bee4e4bce | repo: Romansko/MessageU | path: server/database.py | license: ["MIT"] | language: Python | identifier: Client
docstring: Represents a client entry
code:
class Client:
""" Represents a client entry """
def __init__(self, cid, cname, public_key, last_seen):
self.ID = bytes.fromhex(cid) # Unique client ID, 16 bytes.
self.Name = cname # Client's name, null terminated ascii string, 255 bytes.
self.PublicKey = public_key # Client's public key, 160 bytes.
self.LastSeen = last_seen # The Date & time of client's last request.
def validate(self):
""" Validate Client attributes according to the requirements """
if not self.ID or len(self.ID) != protocol.CLIENT_ID_SIZE:
return False
if not self.Name or len(self.Name) >= protocol.NAME_SIZE:
return False
if not self.PublicKey or len(self.PublicKey) != protocol.PUBLIC_KEY_SIZE:
return False
if not self.LastSeen:
return False
        return True
short_docstring: Represents a client entry | parameters: [] | docstring_params: {returns: [], raises: [], params: [], outlier_params: [], others: []}
hexsha: 789fcdd940662c0ac1e593a1b04be96bee4e4bce | repo: Romansko/MessageU | path: server/database.py | license: ["MIT"] | language: Python | identifier: Message
docstring: Represents a message entry
code:
class Message:
""" Represents a message entry """
def __init__(self, to_client, from_client, mtype, content):
self.ID = 0 # Message ID, 4 bytes.
self.ToClient = to_client # Receiver's unique ID, 16 bytes.
self.FromClient = from_client # Sender's unique ID, 16 bytes.
self.Type = mtype # Message type, 1 byte.
self.Content = content # Message's content, Blob.
def validate(self):
""" Validate Message attributes according to the requirements """
if not self.ToClient or len(self.ToClient) != protocol.CLIENT_ID_SIZE:
return False
if not self.FromClient or len(self.FromClient) != protocol.CLIENT_ID_SIZE:
return False
if not self.Type or self.Type > protocol.MSG_TYPE_MAX:
return False
        return True
short_docstring: Represents a message entry | parameters: [] | docstring_params: {returns: [], raises: [], params: [], outlier_params: [], others: []}
hexsha: 78af814f45a2e5db390cc508ffe44e7889cbd00d | repo: ur1ove/rl_algorithms | path: algorithms/her.py | license: ["MIT"] | language: Python | identifier: HER
docstring: HER (final strategy). Attributes: desired_states (np.ndarray): desired states; reward_func (Callable): returns reward from state, action, next_state
code:
class HER:
"""HER (final strategy).
Attributes:
desired_states (np.ndarray): desired states
reward_func (Callable): returns reward from state, action, next_state
"""
def __init__(self, demo_path: str, reward_func: Callable = default_reward_func):
"""Initialization.
Args:
demo_path (str): path of demonstration including desired states
reward_func (Callable): returns reward from state, action, next_state
"""
self.desired_states, self.demo_goal_indices = fetch_desired_states_from_demo(
demo_path
)
self.reward_func = reward_func
def sample_desired_state(self) -> np.ndarray:
"""Sample one of the desired states."""
return np.random.choice(self.desired_states, 1)[0]
def generate_demo_transitions(self, demo: list) -> list:
"""Return generated demo transitions for HER."""
new_demo: list = list()
# generate demo transitions
prev_idx = 0
for idx in self.demo_goal_indices:
demo_final_state = demo[idx][0]
transitions = [demo[i] for i in range(prev_idx, idx + 1)]
prev_idx = idx + 1
transitions = self.generate_transitions(
transitions, demo_final_state, demo=True
)
new_demo.extend(transitions)
return new_demo
def generate_transitions(
self, transitions: list, desired_state: np.ndarray, demo: bool = False
) -> list:
"""Generate new transitions concatenated with desired states."""
new_transitions = list()
final_state = transitions[-1][0]
for transition in transitions:
# process transitions with the initial goal state
new_transitions.append(self.__get_transition(transition, desired_state))
if not demo:
new_transitions.append(self.__get_transition(transition, final_state))
return new_transitions
def __get_transition(self, transition: tuple, goal_state: np.ndarray):
"""Get a single transition concatenated with a goal state."""
state, action, _, next_state, done = transition
done = np.array_equal(state, goal_state)
reward = self.reward_func(state, action, goal_state)
state = np.concatenate((state, goal_state), axis=-1)
next_state = np.concatenate((next_state, goal_state), axis=-1)
        return (state, action, reward, next_state, done)
short_docstring: HER (final strategy). | parameters: [] | docstring_params: {returns: [], raises: [], params: [], outlier_params: [{identifier: desired_states, type: null, docstring: null}, {identifier: reward_func, type: null, docstring: "returns reward from state, action, next_state"}], others: []}
hexsha: 78b41cbc059a33010d5dfcf14fdde10feb7c0be0 | repo: mwanakijiji/lbti_altair_fizeau | path: modules/host_removal.py | license: ["MIT"] | language: Python | identifier: HostRemoval
docstring: PCA-decompose a saturated host star PSF and remove it
code:
class HostRemoval:
'''
PCA-decompose a saturated host star PSF and remove it
'''
def __init__(self,
n_PCA,
outdir,
abs_PCA_name,
config_data = config):
'''
INPUTS:
n_PCA: number of principal components to use
outdir: directory to deposit the host-subtracted images in (this has to be
defined at the function call because the images may or may not
contain fake planet PSFs, and I want to keep them separate)
abs_PCA_name: absolute file name of the PCA cube to reconstruct the host star
for making a fake planet (i.e., without saturation effects)
config_data: configuration data, as usual
'''
self.n_PCA = n_PCA
self.outdir = outdir
self.abs_PCA_name = abs_PCA_name
self.config_data = config_data
# read in the PCA vector cube for this series of frames
# (note the PCA needs to correspond to saturated PSFs, since I am subtracting
# saturated PSFs away)
self.pca_basis_cube_sat, self.header_pca_basis_cube_sat = fits.getdata(self.abs_PCA_name, 0, header=True)
##########
def __call__(self,
abs_sci_name):
'''
Reconstruct and inject, for a single frame so as to parallelize the job
INPUTS:
abs_sci_name: the absolute path of the science frame into which we want to inject a planet
'''
print(abs_sci_name)
# read in the cutout science frame
# (there should be no masking of this frame downstream)
sci, header_sci = fits.getdata(abs_sci_name, 0, header=True)
# define the mask of this science frame
## ## fine-tune this step later!
mask_weird = np.ones(np.shape(sci))
no_mask = np.copy(mask_weird) # a non-mask for reconstructing saturated PSFs
#mask_weird[sci > 1e8] = np.nan # mask saturating region
## TEST: WRITE OUT
#hdu = fits.PrimaryHDU(mask_weird)
#hdulist = fits.HDUList([hdu])
#hdu.writeto("junk_mask.fits", clobber=True)
## END TEST
###########################################
# PCA-decompose the host star PSF
# (note no de-rotation of the image here)
# do the PCA fit of masked host star
# returns dict: 'pca_vector': the PCA best-fit vector; and 'recon_2d': the 2D reconstructed PSF
# N.b. PCA reconstruction will be to get an UN-sat PSF; note PCA basis cube involves unsat PSFs
fit_unsat = fit_pca_star(self.pca_basis_cube_sat, sci, no_mask, n_PCA=100)
# subtract the PCA-reconstructed host star
image_host_removed = np.subtract(sci,fit_unsat["recon_2d"])
# pickle the PCA vector
pickle_stuff = {"pca_cube_file_name": self.abs_PCA_name,
"pca_vector": fit_unsat["pca_vector"],
"recons_2d_psf_unsat": fit_unsat["recon_2d"],
"sci_image_name": abs_sci_name}
print(pickle_stuff)
pca_fit_pickle_write_name = str(self.config_data["data_dirs"]["DIR_PICKLE"]) \
+ "pickle_pca_sat_psf_info_" + str(os.path.basename(abs_sci_name).split(".")[0]) + ".pkl"
print(pca_fit_pickle_write_name)
with open(pca_fit_pickle_write_name, "wb") as f:
pickle.dump(pickle_stuff, f)
# add info to the header indicating last reduction step, and PCA info
header_sci["RED_STEP"] = "host_removed"
# write FITS file out, with fake planet params in file name
## ## do I actually want to write out a separate FITS file for each fake planet?
abs_image_host_removed_name = str(self.outdir + os.path.basename(abs_sci_name))
fits.writeto(filename = abs_image_host_removed_name,
data = image_host_removed,
header = header_sci,
overwrite = True)
        print("Writing out host_removed frame " + os.path.basename(abs_sci_name))
short_docstring: PCA-decompose a saturated host star PSF and remove it | parameters: [] | docstring_params: {returns: [], raises: [], params: [], outlier_params: [], others: []}
hexsha: 78b961a6673ec1e12f8d95c33ef081f75561a87c | repo: AIS-Bonn/sl-cutscenes | path: sl_cutscenes/object_models.py | license: ["MIT"] | language: Python | identifier: MeshLoader
docstring: Class to load the meshes for the objects in a scene.
code:
class MeshLoader:
"""
Class to load the meshes for the objects in a scene.
"""
def __init__(self):
"""Module initializer"""
self.base_dir = CONSTANTS.MESH_BASE_DIR
self.text_dir = CONSTANTS.TEXT_BASE_DIR
self.reset()
def reset(self):
self.loaded_meshes = []
def get_meshes(self):
""" """
extract_singular = lambda x: x[0] if len(x) == 1 else x
return [extract_singular(item) for item in self.loaded_meshes]
def load_meshes(self, obj_info: List[object_info.ObjectInfo], **kwargs):
"""
Loads the meshes whose information is given in parameter 'obj_info.
Each call of this method APPENDS a list to the loaded_meshes attribute.
:param obj_info: The object information of the meshes to be loaded.
:param kwargs: additional mesh modifiers such as scale, specified with a leading 'mod_'
"""
paths = []
for obj in obj_info:
path = self.text_dir if obj.name.endswith("_floor") or obj.name.endswith("_wall") else self.base_dir
paths.append((path / obj.mesh_fp).resolve())
scales = [obj.scale for obj in obj_info]
class_ids = [obj.class_id for obj in obj_info]
mod_scales = kwargs.get("mod_scale", [1.0] * len(scales))
scales = [s * ms for (s, ms) in zip(scales, mod_scales)]
flags = [mesh_flags(obj) for obj in obj_info]
meshes = sl.Mesh.load_threaded(filenames=paths, flags=flags)
# Setup class IDs
for _, (mesh, scale, class_id) in enumerate(zip(meshes, scales, class_ids)):
pt = torch.eye(4)
pt[:3, :3] *= scale
mesh.pretransform = pt
mesh.class_index = class_id
info_mesh_tuples = list(zip(obj_info, meshes))
        self.loaded_meshes.append(info_mesh_tuples)
short_docstring: Class to load the meshes for the objects in a scene. | parameters: [] | docstring_params: {returns: [], raises: [], params: [], outlier_params: [], others: []}
hexsha: 78b961a6673ec1e12f8d95c33ef081f75561a87c | repo: AIS-Bonn/sl-cutscenes | path: sl_cutscenes/object_models.py | license: ["MIT"] | language: Python | identifier: ObjectLoader
docstring: Class to load the objects in a scene
code:
class ObjectLoader:
"""
Class to load the objects in a scene
"""
def __init__(self):
"""Module initializer"""
self.reset()
def reset(self):
self.instance_idx = 0
self.loaded_objects = dict()
@property
def static_objects(self):
return [obj for obj in self.loaded_objects.values() if obj.static]
@property
def dynamic_objects(self):
return [obj for obj in self.loaded_objects.values() if not obj.static]
def create_object(self, object_info: object_info.ObjectInfo, mesh: sl.Mesh, is_static: bool, **obj_mod):
"""
Proper object setup
:param mesh:
:param object_info:
:param is_static:
:param obj_mod: Optional object modifiers, specified with a leading 'mod_'.
IMPORTANT: scaling is done during mesh loading!!!
:return:
"""
ins_idx = self.instance_idx + 1
self.instance_idx += 1
obj = sl.Object(mesh)
mod_weight = obj_mod.get("mod_weight", obj_mod.get("mod_scale", 1.0) ** 3)
obj.mass = object_info.weight * mod_weight
obj.metallic = object_info.metallic
obj.roughness = object_info.roughness
obj.restitution = object_info.restitution
obj.static_friction = object_info.static_friction
obj.dynamic_friction = object_info.dynamic_friction
pose = obj_mod.get("mod_pose", torch.eye(4))
mod_R = obj_mod.get("mod_R", torch.eye(3))
pose[:3, :3] = torch.mm(mod_R, pose[:3, :3])
mod_t = obj_mod.get("mod_t", torch.tensor([obj_mod.get("mod_x", 0.0),
obj_mod.get("mod_y", 0.0),
obj_mod.get("mod_z", 0.0)]))
pose[:3, 3] += mod_t
obj.set_pose(pose)
obj.linear_velocity = obj_mod.get("mod_v_linear", torch.tensor([0.0, 0.0, 0.0]))
obj.angular_velocity = obj_mod.get("mod_v_angular", torch.tensor([0.0, 0.0, 0.0]))
obj.static = is_static
obj.instance_index = ins_idx
self.loaded_objects[ins_idx] = obj
return obj
def remove_object(self, instance_id, decrement_ins_idx=True):
obj = self.loaded_objects.pop(instance_id, None)
if decrement_ins_idx and obj is not None:
self.instance_idx -= 1
        return obj
short_docstring: Class to load the objects in a scene | parameters: [] | docstring_params: {returns: [], raises: [], params: [], outlier_params: [], others: []}
hexsha: 78b961a6673ec1e12f8d95c33ef081f75561a87c | repo: AIS-Bonn/sl-cutscenes | path: sl_cutscenes/object_models.py | license: ["MIT"] | language: Python | identifier: DecoratorLoader
docstring: Class to add random decorative objects to the scene, which do not participate of the scene dynamics. It is based on creating an occupancy matrix of the scene, finding empty locations and placing stuff there
code:
class DecoratorLoader:
"""
Class to add random decorative objects to the scene, which do not participate of the scene dynamics.
It is based on creating an occupancy matrix of the scene, finding empty locations and placing stuff there
"""
def __init__(self, scene):
""" Object initializer """
self.config = SCENARIO_DEFAULTS["decorator"]
decorations = self.config["decorations"]
bounds = self.config["bounds"]
self.bounds = bounds
self.pi = torch.acos(torch.zeros(1))
self.scene = scene
self.mesh_loader = MeshLoader()
self.mesh_loader.load_meshes(decorations),
self.meshes = self.mesh_loader.get_meshes()[0]
self.x_vect = torch.arange(bounds["min_x"], bounds["max_x"] + bounds["res"], bounds["res"])
self.y_vect = torch.arange(bounds["min_y"], bounds["max_y"] + bounds["res"], bounds["res"])
return
def add_object(self, object_loader, object_id):
""" Loading an object and adding to the loader """
obj_info, obj_mesh = self.meshes[object_id]
pose = torch.eye(4)
obj_mod = {"mod_pose": pose}
obj = object_loader.create_object(obj_info, obj_mesh, True, **obj_mod)
self.scene.add_object(obj)
# shifting object to a free position and adjusting z-coord to be aligned with the table
position = self.occ_matrix.find_free_spot(obj=obj)
pose[:2, -1] = position if position is not None else torch.ones(2)
pose[2, -1] += obj.mesh.bbox.max[-1]
# Rotating object in yaw direction
yaw_angle = random.choice([torch.tensor([i * CONSTANTS.PI / 2]) for i in range(4)])
angles = torch.cat([yaw_angle, torch.zeros(2)])
rot_matrix = utils.get_rot_matrix(angles=angles)
pose[:3, :3] = pose[:3, :3] @ rot_matrix
obj.set_pose(pose)
self.occ_matrix.update_occupancy_matrix(obj)
self.occ_matrix.add_object_margings()
return
def decorate_scene(self, object_loader):
""" Randomly adding some decoderation to a scene """
# initializing occupancy matrix
self.occ_matrix = OccupancyMatrix(bounds=self.bounds, objects=self.scene.objects)
# iteratively placing objects while avoiding collision
N = torch.randint(low=self.config["min_objs"], high=self.config["max_objs"], size=(1,))
for i in range(N):
id = torch.randint(low=0, high=len(self.meshes), size=(1,))
self.add_object(object_loader, object_id=id)
        return
short_docstring: Class to add random decorative objects to the scene, which do not participate of the scene dynamics. | parameters: [] | docstring_params: {returns: [], raises: [], params: [], outlier_params: [], others: []}
hexsha: 78c6d327eeab5125863f64f7fe72c2cd35fb66b0 | repo: p-koskey/news-sources | path: app/models.py | license: ["MIT"] | language: Python | identifier: Source
docstring: Source class to define source objects
code:
class Source:
'''
Source class to define source objects
'''
def __init__(self,id,name,category):
self.id = id
self.name = name
        self.category = category
short_docstring: Source class to define source objects | parameters: [] | docstring_params: {returns: [], raises: [], params: [], outlier_params: [], others: []}
hexsha: 78c6d327eeab5125863f64f7fe72c2cd35fb66b0 | repo: p-koskey/news-sources | path: app/models.py | license: ["MIT"] | language: Python | identifier: Article
docstring: Article class to define article objects
code:
class Article:
'''
Article class to define article objects
'''
def __init__(self, name, author, title, description, link, image, publishDate):
self.name = name
self.author = author
self.title = title
self.description = description
self.link = link
self.image = image
        self.publishDate = publishDate
short_docstring: Article class to define article objects | parameters: [] | docstring_params: {returns: [], raises: [], params: [], outlier_params: [], others: []}
hexsha: 78c6d327eeab5125863f64f7fe72c2cd35fb66b0 | repo: p-koskey/news-sources | path: app/models.py | license: ["MIT"] | language: Python | identifier: Top
docstring: Top headlines class to define headlines objects
code:
class Top:
'''
Top headlines class to define headlines objects
'''
def __init__(self, source, author, title, description, link, image):
self.source = source
self.author = author
self.title = title
self.description = description
self.link = link
        self.image = image
short_docstring: Top headlines class to define headlines objects | parameters: [] | docstring_params: {returns: [], raises: [], params: [], outlier_params: [], others: []}
hexsha: 78cef279b1537217c1a5435db8a4394af5e49d17 | repo: lrivallain/openfaas-fn | path: veba-to-argo-fn/handler/function/handler.py | license: ["MIT"] | language: Python | identifier: ArgoWorflow
docstring: The ArgoWorflow provide a way to start an argo WF based on an existing template.
code:
class ArgoWorflow:
"""The ArgoWorflow provide a way to start an argo WF based on an existing template.
"""
def __init__(self):
"""Initialize the ArgoWorflow
"""
logger.info("Reading configuration files")
logger.info(f"Argo config file > {ARGO_CONFIG}")
try:
with open(ARGO_CONFIG, 'r') as configfile:
argoconfig = yaml.load(configfile, Loader=yaml.SafeLoader)
# read mandatory parameters
self.server = argoconfig['argoserver']['server']
self.ns = argoconfig['argoserver']['namespace']
self.sa = argoconfig['argoserver']['serviceaccount']
self.template = argoconfig['argoserver']['template']
except OSError as err:
raise Exception(f'Could not read argo configuration: {err}')
except KeyError as err:
raise Exception(f'Missing mandatory configuration key: {err}')
except Exception as err:
raise Exception(f'Unknown error when reading settings: {err}')
# read non-mandatory parameters
self.proto = argoconfig['argoserver'].get('protocol', 'http')
self.param_name = argoconfig['argoserver'].get('event_param_name', 'event')
self.base64_encode = argoconfig['argoserver'].get('base64_encode', False)
self.raw_labels = argoconfig['argoserver'].get('labels', [])
# set a from:veba label
self.labels = ["from=veba"]
# add configured labels
for label in self.raw_labels:
self.labels.append(f"{label}={self.raw_labels[label]}")
def submit(self, event: dict):
"""Submit the workflow
Args:
event (dict): event data
"""
logger.debug("Preparing request data")
uri = f"{self.proto}://{self.server}/api/v1/workflows/{self.ns}/submit"
self.labels.append(f"event_id={event.get('id')}")
self.labels.append(f"event_subject={event.get('subject')}")
# base64 convertion
if self.base64_encode:
event_data = base64.b64encode(
json.dumps(event).encode('utf-8')
).decode()
else:
event_data = json.dumps(event)
# prepare the workflow data
data = {
"resourceKind": "WorkflowTemplate",
"resourceName": self.template,
"submitOptions": {
"serviceaccount": self.sa,
"parameters": [
f"{self.param_name}={event_data}"
],
"labels": ','.join(self.labels)
}
}
logger.debug(json.dumps(data, indent=4, sort_keys=True))
headers = { "Content-Type": "application/json" }
logger.info("Submiting workflow")
try:
r = requests.post(uri, json=data, headers=headers)
logger.debug(r.text)
r.raise_for_status()
except requests.exceptions.HTTPError:
return f"Invalid status code returned: {r.status_code}"
except Exception as err:
return f"Unable to make request to argo server {self.server}: {err}", 500
return "Argo workflow was successfully submited", 200 | [
"class",
"ArgoWorflow",
":",
"def",
"__init__",
"(",
"self",
")",
":",
"\"\"\"Initialize the ArgoWorflow\n \"\"\"",
"logger",
".",
"info",
"(",
"\"Reading configuration files\"",
")",
"logger",
".",
"info",
"(",
"f\"Argo config file > {ARGO_CONFIG}\"",
")",
"try",
":",
"with",
"open",
"(",
"ARGO_CONFIG",
",",
"'r'",
")",
"as",
"configfile",
":",
"argoconfig",
"=",
"yaml",
".",
"load",
"(",
"configfile",
",",
"Loader",
"=",
"yaml",
".",
"SafeLoader",
")",
"self",
".",
"server",
"=",
"argoconfig",
"[",
"'argoserver'",
"]",
"[",
"'server'",
"]",
"self",
".",
"ns",
"=",
"argoconfig",
"[",
"'argoserver'",
"]",
"[",
"'namespace'",
"]",
"self",
".",
"sa",
"=",
"argoconfig",
"[",
"'argoserver'",
"]",
"[",
"'serviceaccount'",
"]",
"self",
".",
"template",
"=",
"argoconfig",
"[",
"'argoserver'",
"]",
"[",
"'template'",
"]",
"except",
"OSError",
"as",
"err",
":",
"raise",
"Exception",
"(",
"f'Could not read argo configuration: {err}'",
")",
"except",
"KeyError",
"as",
"err",
":",
"raise",
"Exception",
"(",
"f'Missing mandatory configuration key: {err}'",
")",
"except",
"Exception",
"as",
"err",
":",
"raise",
"Exception",
"(",
"f'Unknown error when reading settings: {err}'",
")",
"self",
".",
"proto",
"=",
"argoconfig",
"[",
"'argoserver'",
"]",
".",
"get",
"(",
"'protocol'",
",",
"'http'",
")",
"self",
".",
"param_name",
"=",
"argoconfig",
"[",
"'argoserver'",
"]",
".",
"get",
"(",
"'event_param_name'",
",",
"'event'",
")",
"self",
".",
"base64_encode",
"=",
"argoconfig",
"[",
"'argoserver'",
"]",
".",
"get",
"(",
"'base64_encode'",
",",
"False",
")",
"self",
".",
"raw_labels",
"=",
"argoconfig",
"[",
"'argoserver'",
"]",
".",
"get",
"(",
"'labels'",
",",
"[",
"]",
")",
"self",
".",
"labels",
"=",
"[",
"\"from=veba\"",
"]",
"for",
"label",
"in",
"self",
".",
"raw_labels",
":",
"self",
".",
"labels",
".",
"append",
"(",
"f\"{label}={self.raw_labels[label]}\"",
")",
"def",
"submit",
"(",
"self",
",",
"event",
":",
"dict",
")",
":",
"\"\"\"Submit the workflow\n\n Args:\n event (dict): event data\n \"\"\"",
"logger",
".",
"debug",
"(",
"\"Preparing request data\"",
")",
"uri",
"=",
"f\"{self.proto}://{self.server}/api/v1/workflows/{self.ns}/submit\"",
"self",
".",
"labels",
".",
"append",
"(",
"f\"event_id={event.get('id')}\"",
")",
"self",
".",
"labels",
".",
"append",
"(",
"f\"event_subject={event.get('subject')}\"",
")",
"if",
"self",
".",
"base64_encode",
":",
"event_data",
"=",
"base64",
".",
"b64encode",
"(",
"json",
".",
"dumps",
"(",
"event",
")",
".",
"encode",
"(",
"'utf-8'",
")",
")",
".",
"decode",
"(",
")",
"else",
":",
"event_data",
"=",
"json",
".",
"dumps",
"(",
"event",
")",
"data",
"=",
"{",
"\"resourceKind\"",
":",
"\"WorkflowTemplate\"",
",",
"\"resourceName\"",
":",
"self",
".",
"template",
",",
"\"submitOptions\"",
":",
"{",
"\"serviceaccount\"",
":",
"self",
".",
"sa",
",",
"\"parameters\"",
":",
"[",
"f\"{self.param_name}={event_data}\"",
"]",
",",
"\"labels\"",
":",
"','",
".",
"join",
"(",
"self",
".",
"labels",
")",
"}",
"}",
"logger",
".",
"debug",
"(",
"json",
".",
"dumps",
"(",
"data",
",",
"indent",
"=",
"4",
",",
"sort_keys",
"=",
"True",
")",
")",
"headers",
"=",
"{",
"\"Content-Type\"",
":",
"\"application/json\"",
"}",
"logger",
".",
"info",
"(",
"\"Submiting workflow\"",
")",
"try",
":",
"r",
"=",
"requests",
".",
"post",
"(",
"uri",
",",
"json",
"=",
"data",
",",
"headers",
"=",
"headers",
")",
"logger",
".",
"debug",
"(",
"r",
".",
"text",
")",
"r",
".",
"raise_for_status",
"(",
")",
"except",
"requests",
".",
"exceptions",
".",
"HTTPError",
":",
"return",
"f\"Invalid status code returned: {r.status_code}\"",
"except",
"Exception",
"as",
"err",
":",
"return",
"f\"Unable to make request to argo server {self.server}: {err}\"",
",",
"500",
"return",
"\"Argo workflow was successfully submited\"",
",",
"200"
] | The ArgoWorflow provide a way to start an argo WF based on an existing template. | [
"The",
"ArgoWorflow",
"provide",
"a",
"way",
"to",
"start",
"an",
"argo",
"WF",
"based",
"on",
"an",
"existing",
"template",
"."
] | [
"\"\"\"The ArgoWorflow provide a way to start an argo WF based on an existing template.\n \"\"\"",
"\"\"\"Initialize the ArgoWorflow\n \"\"\"",
"# read mandatory parameters",
"# read non-mandatory parameters",
"# set a from:veba label",
"# add configured labels",
"\"\"\"Submit the workflow\n\n Args:\n event (dict): event data\n \"\"\"",
"# base64 convertion",
"# prepare the workflow data"
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
15ab59507209a9b7264e3945a7277176ff94124e | sbernasek/flyeye | flyeye/dynamics/visualization.py | [
"MIT"
] | Python | TimeseriesPlot |
Object describes a 1D timeseries.
Attributes:
x (np.ndarray) - independent variable
y (np.ndarray) - dependent variable
ax (matplotlib.axes.AxesSubplot)
| Object describes a 1D timeseries.
Attributes.
x (np.ndarray) - independent variable
y (np.ndarray) - dependent variable
| [
"Object",
"describes",
"a",
"1D",
"timeseries",
".",
"Attributes",
".",
"x",
"(",
"np",
".",
"ndarray",
")",
"-",
"independent",
"variable",
"y",
"(",
"np",
".",
"ndarray",
")",
"-",
"dependent",
"variable"
] | class TimeseriesPlot:
"""
Object describes a 1D timeseries.
Attributes:
x (np.ndarray) - independent variable
y (np.ndarray) - dependent variable
ax (matplotlib.axes.AxesSubplot)
"""
def __init__(self, x, y, ax=None):
"""
Instantiate a 1D timeseries.
Args:
x (np.ndarray) - independent variable
y (np.ndarray) - dependent variable
ax (matplotlib.axes.AxesSubplot)
"""
self.x = x
self.y = y
# set axis
if ax is None:
ax = self.create_figure()
self.ax = ax
def create_figure(self):
""" Instantiate figure. """
fig, ax = plt.subplots(ncols=1, figsize=(3, 2))
ax.set_xlim(self.x.min(), self.x.max())
ax.set_ylim(0, 1.1*self.y.max())
ax.set_xlabel('Time (h)'),
ax.set_ylabel('Expression (a.u.)')
return ax
def scatter(self,
color='k',
alpha=1,
s=1,
rasterized=False,
**additional):
"""
Scatterplot markers for x and y data.
Args:
color (str) - marker color
alpha (float) - marker alpha
s (float) - marker size
rasterized (bool) - if True, rasterize markers
"""
marker_kw = dict(color=color, s=s, alpha=alpha, lw=0, rasterized=rasterized)
_ = self.ax.scatter(self.x, self.y, **marker_kw, **additional)
def average(self,
ma_type='savgol',
window_size=100,
resolution=1,
smooth=True,
color='k',
alpha=1,
lw=1,
linestyle=None,
**additional
):
"""
Plot moving average of x and y data.
Args:
ma_type (str) - type of average, 'savgol', 'sliding', or 'binned'
window_size (int) - size of sliding window or bin (num of cells)
resolution (int) - sampling resolution for confidence interval
smooth (bool) - if True, apply secondary savgol filter
color, alpha, lw, linestyle - formatting parameters
"""
ma_kw = dict(ma_type=ma_type, window_size=window_size, resolution=resolution, smooth=smooth)
line_kw = dict(line_color=color, line_alpha=alpha, line_width=lw, linestyle=linestyle)
if len(self.y) > window_size:
_ = plot_mean(self.x, self.y, ax=self.ax, **ma_kw, **line_kw, **additional)
def interval(self,
ma_type='sliding',
window_size=100,
resolution=25,
nbootstraps=1000,
confidence=95,
color='k',
alpha=0.5,
**additional):
"""
Plot confidence interval for moving average of x and y data.
Args:
ma_type (str) - type of moving average, 'sliding' or 'binned'
window_size (int) - size of sliding window or bin (num of cells)
resolution (int) - sampling resolution for confidence interval
nbootstraps (int) - number of bootstraps
confidence (float) - confidence interval, between 0 and 100
color, alpha - formatting parameters
"""
# define moving average keyword arguments
ma_kw = dict(ma_type=ma_type,
window_size=window_size,
resolution=resolution,
nbootstraps=nbootstraps,
confidence=confidence)
# define interval shading keyword arguments
shade_kw = dict(color=color, alpha=alpha)
# plot confidence interval
if len(self.y) > window_size:
plot_mean_interval(self.x,
self.y,
ax=self.ax,
**ma_kw,
**shade_kw)
def plot(self,
scatter=False,
average=True,
interval=False,
marker_kw={},
line_kw={},
interval_kw={},
ma_kw={}):
"""
Plot timeseries data.
Args:
scatter (bool) - if True, add datapoints
average (bool) - if True, add moving average
interval (bool) - if True, add moving average interval
marker_kw (dict) - keyword arguments for marker formatting
line_kw (dict) - keyword arguments for line formatting
interval_kw (dict) - keyword arguments for interval formatting
ma_kw (dict) - keyword arguments for moving average
"""
# add scattered data
if scatter:
self.scatter(**marker_kw)
# add moving average
if average:
self.average(**ma_kw, **line_kw)
# add confidence interval for moving average
if interval:
self.interval(**ma_kw, **interval_kw) | [
"class",
"TimeseriesPlot",
":",
"def",
"__init__",
"(",
"self",
",",
"x",
",",
"y",
",",
"ax",
"=",
"None",
")",
":",
"\"\"\"\n Instantiate a 1D timeseries.\n\n Args:\n\n x (np.ndarray) - independent variable\n\n y (np.ndarray) - dependent variable\n\n ax (matplotlib.axes.AxesSubplot)\n\n \"\"\"",
"self",
".",
"x",
"=",
"x",
"self",
".",
"y",
"=",
"y",
"if",
"ax",
"is",
"None",
":",
"ax",
"=",
"self",
".",
"create_figure",
"(",
")",
"self",
".",
"ax",
"=",
"ax",
"def",
"create_figure",
"(",
"self",
")",
":",
"\"\"\" Instantiate figure. \"\"\"",
"fig",
",",
"ax",
"=",
"plt",
".",
"subplots",
"(",
"ncols",
"=",
"1",
",",
"figsize",
"=",
"(",
"3",
",",
"2",
")",
")",
"ax",
".",
"set_xlim",
"(",
"self",
".",
"x",
".",
"min",
"(",
")",
",",
"self",
".",
"x",
".",
"max",
"(",
")",
")",
"ax",
".",
"set_ylim",
"(",
"0",
",",
"1.1",
"*",
"self",
".",
"y",
".",
"max",
"(",
")",
")",
"ax",
".",
"set_xlabel",
"(",
"'Time (h)'",
")",
",",
"ax",
".",
"set_ylabel",
"(",
"'Expression (a.u.)'",
")",
"return",
"ax",
"def",
"scatter",
"(",
"self",
",",
"color",
"=",
"'k'",
",",
"alpha",
"=",
"1",
",",
"s",
"=",
"1",
",",
"rasterized",
"=",
"False",
",",
"**",
"additional",
")",
":",
"\"\"\"\n Scatterplot markers for x and y data.\n\n Args:\n\n color (str) - marker color\n\n alpha (float) - marker alpha\n\n s (float) - marker size\n\n rasterized (bool) - if True, rasterize markers\n\n \"\"\"",
"marker_kw",
"=",
"dict",
"(",
"color",
"=",
"color",
",",
"s",
"=",
"s",
",",
"alpha",
"=",
"alpha",
",",
"lw",
"=",
"0",
",",
"rasterized",
"=",
"rasterized",
")",
"_",
"=",
"self",
".",
"ax",
".",
"scatter",
"(",
"self",
".",
"x",
",",
"self",
".",
"y",
",",
"**",
"marker_kw",
",",
"**",
"additional",
")",
"def",
"average",
"(",
"self",
",",
"ma_type",
"=",
"'savgol'",
",",
"window_size",
"=",
"100",
",",
"resolution",
"=",
"1",
",",
"smooth",
"=",
"True",
",",
"color",
"=",
"'k'",
",",
"alpha",
"=",
"1",
",",
"lw",
"=",
"1",
",",
"linestyle",
"=",
"None",
",",
"**",
"additional",
")",
":",
"\"\"\"\n Plot moving average of x and y data.\n\n Args:\n\n ma_type (str) - type of average, 'savgol', 'sliding', or 'binned'\n\n window_size (int) - size of sliding window or bin (num of cells)\n\n resolution (int) - sampling resolution for confidence interval\n\n smooth (bool) - if True, apply secondary savgol filter\n\n color, alpha, lw, linestyle - formatting parameters\n\n \"\"\"",
"ma_kw",
"=",
"dict",
"(",
"ma_type",
"=",
"ma_type",
",",
"window_size",
"=",
"window_size",
",",
"resolution",
"=",
"resolution",
",",
"smooth",
"=",
"smooth",
")",
"line_kw",
"=",
"dict",
"(",
"line_color",
"=",
"color",
",",
"line_alpha",
"=",
"alpha",
",",
"line_width",
"=",
"lw",
",",
"linestyle",
"=",
"linestyle",
")",
"if",
"len",
"(",
"self",
".",
"y",
")",
">",
"window_size",
":",
"_",
"=",
"plot_mean",
"(",
"self",
".",
"x",
",",
"self",
".",
"y",
",",
"ax",
"=",
"self",
".",
"ax",
",",
"**",
"ma_kw",
",",
"**",
"line_kw",
",",
"**",
"additional",
")",
"def",
"interval",
"(",
"self",
",",
"ma_type",
"=",
"'sliding'",
",",
"window_size",
"=",
"100",
",",
"resolution",
"=",
"25",
",",
"nbootstraps",
"=",
"1000",
",",
"confidence",
"=",
"95",
",",
"color",
"=",
"'k'",
",",
"alpha",
"=",
"0.5",
",",
"**",
"additional",
")",
":",
"\"\"\"\n Plot confidence interval for moving average of x and y data.\n\n Args:\n\n ma_type (str) - type of moving average, 'sliding' or 'binned'\n\n window_size (int) - size of sliding window or bin (num of cells)\n\n resolution (int) - sampling resolution for confidence interval\n\n nbootstraps (int) - number of bootstraps\n\n confidence (float) - confidence interval, between 0 and 100\n\n color, alpha - formatting parameters\n\n \"\"\"",
"ma_kw",
"=",
"dict",
"(",
"ma_type",
"=",
"ma_type",
",",
"window_size",
"=",
"window_size",
",",
"resolution",
"=",
"resolution",
",",
"nbootstraps",
"=",
"nbootstraps",
",",
"confidence",
"=",
"confidence",
")",
"shade_kw",
"=",
"dict",
"(",
"color",
"=",
"color",
",",
"alpha",
"=",
"alpha",
")",
"if",
"len",
"(",
"self",
".",
"y",
")",
">",
"window_size",
":",
"plot_mean_interval",
"(",
"self",
".",
"x",
",",
"self",
".",
"y",
",",
"ax",
"=",
"self",
".",
"ax",
",",
"**",
"ma_kw",
",",
"**",
"shade_kw",
")",
"def",
"plot",
"(",
"self",
",",
"scatter",
"=",
"False",
",",
"average",
"=",
"True",
",",
"interval",
"=",
"False",
",",
"marker_kw",
"=",
"{",
"}",
",",
"line_kw",
"=",
"{",
"}",
",",
"interval_kw",
"=",
"{",
"}",
",",
"ma_kw",
"=",
"{",
"}",
")",
":",
"\"\"\"\n Plot timeseries data.\n\n Args:\n\n scatter (bool) - if True, add datapoints\n\n average (bool) - if True, add moving average\n\n interval (bool) - if True, add moving average interval\n\n marker_kw (dict) - keyword arguments for marker formatting\n\n line_kw (dict) - keyword arguments for line formatting\n\n interval_kw (dict) - keyword arguments for interval formatting\n\n ma_kw (dict) - keyword arguments for moving average\n\n \"\"\"",
"if",
"scatter",
":",
"self",
".",
"scatter",
"(",
"**",
"marker_kw",
")",
"if",
"average",
":",
"self",
".",
"average",
"(",
"**",
"ma_kw",
",",
"**",
"line_kw",
")",
"if",
"interval",
":",
"self",
".",
"interval",
"(",
"**",
"ma_kw",
",",
"**",
"interval_kw",
")"
] | Object describes a 1D timeseries. | [
"Object",
"describes",
"a",
"1D",
"timeseries",
"."
] | [
"\"\"\"\n Object describes a 1D timeseries.\n\n Attributes:\n\n x (np.ndarray) - independent variable\n\n y (np.ndarray) - dependent variable\n\n ax (matplotlib.axes.AxesSubplot)\n\n \"\"\"",
"\"\"\"\n Instantiate a 1D timeseries.\n\n Args:\n\n x (np.ndarray) - independent variable\n\n y (np.ndarray) - dependent variable\n\n ax (matplotlib.axes.AxesSubplot)\n\n \"\"\"",
"# set axis",
"\"\"\" Instantiate figure. \"\"\"",
"\"\"\"\n Scatterplot markers for x and y data.\n\n Args:\n\n color (str) - marker color\n\n alpha (float) - marker alpha\n\n s (float) - marker size\n\n rasterized (bool) - if True, rasterize markers\n\n \"\"\"",
"\"\"\"\n Plot moving average of x and y data.\n\n Args:\n\n ma_type (str) - type of average, 'savgol', 'sliding', or 'binned'\n\n window_size (int) - size of sliding window or bin (num of cells)\n\n resolution (int) - sampling resolution for confidence interval\n\n smooth (bool) - if True, apply secondary savgol filter\n\n color, alpha, lw, linestyle - formatting parameters\n\n \"\"\"",
"\"\"\"\n Plot confidence interval for moving average of x and y data.\n\n Args:\n\n ma_type (str) - type of moving average, 'sliding' or 'binned'\n\n window_size (int) - size of sliding window or bin (num of cells)\n\n resolution (int) - sampling resolution for confidence interval\n\n nbootstraps (int) - number of bootstraps\n\n confidence (float) - confidence interval, between 0 and 100\n\n color, alpha - formatting parameters\n\n \"\"\"",
"# define moving average keyword arguments",
"# define interval shading keyword arguments",
"# plot confidence interval",
"\"\"\"\n Plot timeseries data.\n\n Args:\n\n scatter (bool) - if True, add datapoints\n\n average (bool) - if True, add moving average\n\n interval (bool) - if True, add moving average interval\n\n marker_kw (dict) - keyword arguments for marker formatting\n\n line_kw (dict) - keyword arguments for line formatting\n\n interval_kw (dict) - keyword arguments for interval formatting\n\n ma_kw (dict) - keyword arguments for moving average\n\n \"\"\"",
"# add scattered data",
"# add moving average",
"# add confidence interval for moving average"
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
ec6a7aa96ee85f520989ee524c59400757ee3b89 | fourstix/Sparkfun_CircuitPython_QwiicJoystick | sparkfun_qwiicjoystick.py | [
"MIT"
] | Python | Sparkfun_QwiicJoystick | CircuitPython class for the Sparkfun QwiicJoystick
Usage:
# import the CircuitPython board and busio libraries
import board
import busio
# Create bus object using the board's I2C port
i2c = busio.I2C(board.SCL, board.SDA)
joystick = QwiicJoystick(i2c) # default address is 0x20
# use QwiicJoystick(i2c, address) for a different address
# joystick = QwiicJoystick(i2c, 0x21) | CircuitPython class for the Sparkfun QwiicJoystick
Usage.
import the CircuitPython board and busio libraries
import board
import busio
Create bus object using the board's I2C port
i2c = busio.I2C(board.SCL, board.SDA)
joystick = QwiicJoystick(i2c) # default address is 0x20
use QwiicJoystick(i2c, address) for a different address
joystick = QwiicJoystick(i2c, 0x21) | [
"CircuitPython",
"class",
"for",
"the",
"Sparkfun",
"QwiicJoystick",
"Usage",
".",
"import",
"the",
"CircuitPython",
"board",
"and",
"busio",
"libraries",
"import",
"board",
"import",
"busio",
"Create",
"bus",
"object",
"using",
"the",
"board",
"'",
"s",
"I2C",
"port",
"i2c",
"=",
"busio",
".",
"I2C",
"(",
"board",
".",
"SCL",
"board",
".",
"SDA",
")",
"joystick",
"=",
"QwiicJoystick",
"(",
"i2c",
")",
"#",
"default",
"address",
"is",
"0x20",
"use",
"QwiicJoystick",
"(",
"i2c",
"address",
")",
"for",
"a",
"different",
"address",
"joystick",
"=",
"QwiicJoystick",
"(",
"i2c",
"0x21",
")"
] | class Sparkfun_QwiicJoystick:
"""CircuitPython class for the Sparkfun QwiicJoystick
Usage:
# import the CircuitPython board and busio libraries
import board
import busio
# Create bus object using the board's I2C port
i2c = busio.I2C(board.SCL, board.SDA)
joystick = QwiicJoystick(i2c) # default address is 0x20
# use QwiicJoystick(i2c, address) for a different address
# joystick = QwiicJoystick(i2c, 0x21)"""
def __init__(self, i2c, address=QWIIC_JOYSTICK_ADDR, debug=False):
"""Initialize Qwiic Joystick for i2c communication."""
self._device = I2CDevice(i2c, address)
# save handle to i2c bus in case address is changed
self._i2c = i2c
self._debug = debug
# public properites
@property
def connected(self):
"""True if the Joystick is connected and a valid id is successful read."""
try:
# Attempt to read the id and see if we get an error
self._read_register(_JOYSTICK_ID)
except ValueError:
return False
return True
@property
def version(self):
"""Firmware version string for joystick."""
major = self._read_register(_JOYSTICK_VERSION1)
minor = self._read_register(_JOYSTICK_VERSION2)
return "v" + str(major) + "." + str(minor)
@property
def horizontal(self):
"""X value from 0 - 1023 of the joystick postion."""
# Read MSB for horizontal joystick position
x_msb = self._read_register(_JOYSTICK_X_MSB)
# Read LSB for horizontal joystick position
x_lsb = self._read_register(_JOYSTICK_X_LSB)
# mask off bytes and combine into 10-bit integer
x = ((x_msb & 0xFF) << 8 | (x_lsb & 0xFF)) >> 6
return x
@property
def vertical(self):
"""Y value from 0 to 1023 of the joystick postion."""
# Read MSB for veritical joystick position
y_msb = self._read_register(_JOYSTICK_Y_MSB)
# Read LSB for vertical joystick position
y_lsb = self._read_register(_JOYSTICK_Y_LSB)
# mask off bytes and combine into 10-bit integer
y = ((y_msb & 0xFF) << 8 | (y_lsb & 0xFF)) >> 6
return y
@property
def button(self):
"""0 if button is down, 1 if button is up."""
button = self._read_register(_JOYSTICK_BUTTON)
return button
# Issue: register 0x08 always contains 1 for some reason, even when cleared
@property
def button_status(self):
"""1 if button pressed between reads, cleared after read."""
# read button status (since last check)
status = self._read_register(_JOYSTICK_STATUS)
# clear button status
self._write_register(_JOYSTICK_STATUS, 0x00)
return status & 0xFF
# public functions
def set_i2c_address(self, new_address):
"""Change the i2c address of Joystick snd return True if successful."""
# check range of new address
if new_address < 8 or new_address > 119:
print("ERROR: Address outside 8-119 range")
return False
# write magic number 0x13 to lock register, to unlock address for update
self._write_register(_JOYSTICK_I2C_LOCK, 0x13)
# write new address
self._write_register(_JOYSTICK_CHANGE_ADDRESS, new_address)
# wait a second for joystick to settle after change
sleep(1)
# try to re-create new i2c device at new address
try:
self._device = I2CDevice(self._i2c, new_address)
except ValueError as err:
print("Address Change Failure")
print(err)
return False
# if we made it here, everything went fine
return True
# No i2c begin function is needed since I2Cdevice class takes care of that
# private functions
def _read_register(self, addr):
# Read and return a byte from the specified 8-bit register address.
with self._device as device:
device.write(bytes([addr & 0xFF]))
result = bytearray(1)
device.readinto(result)
# For some reason, write_then_readinto returns invalid data
# device.write_then_readinto(bytes([addr & 0xFF]), result)
if self._debug:
print("$%02X => %s" % (addr, [hex(i) for i in result]))
return result[0]
def _write_register(self, addr, value):
# Write a byte to the specified 8-bit register address
with self._device as device:
device.write(bytes([addr & 0xFF, value & 0xFF]))
if self._debug:
print("$%02X <= 0x%02X" % (addr, value)) | [
"class",
"Sparkfun_QwiicJoystick",
":",
"def",
"__init__",
"(",
"self",
",",
"i2c",
",",
"address",
"=",
"QWIIC_JOYSTICK_ADDR",
",",
"debug",
"=",
"False",
")",
":",
"\"\"\"Initialize Qwiic Joystick for i2c communication.\"\"\"",
"self",
".",
"_device",
"=",
"I2CDevice",
"(",
"i2c",
",",
"address",
")",
"self",
".",
"_i2c",
"=",
"i2c",
"self",
".",
"_debug",
"=",
"debug",
"@",
"property",
"def",
"connected",
"(",
"self",
")",
":",
"\"\"\"True if the Joystick is connected and a valid id is successful read.\"\"\"",
"try",
":",
"self",
".",
"_read_register",
"(",
"_JOYSTICK_ID",
")",
"except",
"ValueError",
":",
"return",
"False",
"return",
"True",
"@",
"property",
"def",
"version",
"(",
"self",
")",
":",
"\"\"\"Firmware version string for joystick.\"\"\"",
"major",
"=",
"self",
".",
"_read_register",
"(",
"_JOYSTICK_VERSION1",
")",
"minor",
"=",
"self",
".",
"_read_register",
"(",
"_JOYSTICK_VERSION2",
")",
"return",
"\"v\"",
"+",
"str",
"(",
"major",
")",
"+",
"\".\"",
"+",
"str",
"(",
"minor",
")",
"@",
"property",
"def",
"horizontal",
"(",
"self",
")",
":",
"\"\"\"X value from 0 - 1023 of the joystick postion.\"\"\"",
"x_msb",
"=",
"self",
".",
"_read_register",
"(",
"_JOYSTICK_X_MSB",
")",
"x_lsb",
"=",
"self",
".",
"_read_register",
"(",
"_JOYSTICK_X_LSB",
")",
"x",
"=",
"(",
"(",
"x_msb",
"&",
"0xFF",
")",
"<<",
"8",
"|",
"(",
"x_lsb",
"&",
"0xFF",
")",
")",
">>",
"6",
"return",
"x",
"@",
"property",
"def",
"vertical",
"(",
"self",
")",
":",
"\"\"\"Y value from 0 to 1023 of the joystick postion.\"\"\"",
"y_msb",
"=",
"self",
".",
"_read_register",
"(",
"_JOYSTICK_Y_MSB",
")",
"y_lsb",
"=",
"self",
".",
"_read_register",
"(",
"_JOYSTICK_Y_LSB",
")",
"y",
"=",
"(",
"(",
"y_msb",
"&",
"0xFF",
")",
"<<",
"8",
"|",
"(",
"y_lsb",
"&",
"0xFF",
")",
")",
">>",
"6",
"return",
"y",
"@",
"property",
"def",
"button",
"(",
"self",
")",
":",
"\"\"\"0 if button is down, 1 if button is up.\"\"\"",
"button",
"=",
"self",
".",
"_read_register",
"(",
"_JOYSTICK_BUTTON",
")",
"return",
"button",
"@",
"property",
"def",
"button_status",
"(",
"self",
")",
":",
"\"\"\"1 if button pressed between reads, cleared after read.\"\"\"",
"status",
"=",
"self",
".",
"_read_register",
"(",
"_JOYSTICK_STATUS",
")",
"self",
".",
"_write_register",
"(",
"_JOYSTICK_STATUS",
",",
"0x00",
")",
"return",
"status",
"&",
"0xFF",
"def",
"set_i2c_address",
"(",
"self",
",",
"new_address",
")",
":",
"\"\"\"Change the i2c address of Joystick snd return True if successful.\"\"\"",
"if",
"new_address",
"<",
"8",
"or",
"new_address",
">",
"119",
":",
"print",
"(",
"\"ERROR: Address outside 8-119 range\"",
")",
"return",
"False",
"self",
".",
"_write_register",
"(",
"_JOYSTICK_I2C_LOCK",
",",
"0x13",
")",
"self",
".",
"_write_register",
"(",
"_JOYSTICK_CHANGE_ADDRESS",
",",
"new_address",
")",
"sleep",
"(",
"1",
")",
"try",
":",
"self",
".",
"_device",
"=",
"I2CDevice",
"(",
"self",
".",
"_i2c",
",",
"new_address",
")",
"except",
"ValueError",
"as",
"err",
":",
"print",
"(",
"\"Address Change Failure\"",
")",
"print",
"(",
"err",
")",
"return",
"False",
"return",
"True",
"def",
"_read_register",
"(",
"self",
",",
"addr",
")",
":",
"with",
"self",
".",
"_device",
"as",
"device",
":",
"device",
".",
"write",
"(",
"bytes",
"(",
"[",
"addr",
"&",
"0xFF",
"]",
")",
")",
"result",
"=",
"bytearray",
"(",
"1",
")",
"device",
".",
"readinto",
"(",
"result",
")",
"if",
"self",
".",
"_debug",
":",
"print",
"(",
"\"$%02X => %s\"",
"%",
"(",
"addr",
",",
"[",
"hex",
"(",
"i",
")",
"for",
"i",
"in",
"result",
"]",
")",
")",
"return",
"result",
"[",
"0",
"]",
"def",
"_write_register",
"(",
"self",
",",
"addr",
",",
"value",
")",
":",
"with",
"self",
".",
"_device",
"as",
"device",
":",
"device",
".",
"write",
"(",
"bytes",
"(",
"[",
"addr",
"&",
"0xFF",
",",
"value",
"&",
"0xFF",
"]",
")",
")",
"if",
"self",
".",
"_debug",
":",
"print",
"(",
"\"$%02X <= 0x%02X\"",
"%",
"(",
"addr",
",",
"value",
")",
")"
] | CircuitPython class for the Sparkfun QwiicJoystick
Usage: | [
"CircuitPython",
"class",
"for",
"the",
"Sparkfun",
"QwiicJoystick",
"Usage",
":"
] | [
"\"\"\"CircuitPython class for the Sparkfun QwiicJoystick\n Usage:\n\n # import the CircuitPython board and busio libraries\n\n import board\n import busio\n\n # Create bus object using the board's I2C port\n i2c = busio.I2C(board.SCL, board.SDA)\n\n joystick = QwiicJoystick(i2c) # default address is 0x20\n\n # use QwiicJoystick(i2c, address) for a different address\n # joystick = QwiicJoystick(i2c, 0x21)\"\"\"",
"\"\"\"Initialize Qwiic Joystick for i2c communication.\"\"\"",
"# save handle to i2c bus in case address is changed",
"# public properites",
"\"\"\"True if the Joystick is connected and a valid id is successful read.\"\"\"",
"# Attempt to read the id and see if we get an error",
"\"\"\"Firmware version string for joystick.\"\"\"",
"\"\"\"X value from 0 - 1023 of the joystick postion.\"\"\"",
"# Read MSB for horizontal joystick position",
"# Read LSB for horizontal joystick position",
"# mask off bytes and combine into 10-bit integer",
"\"\"\"Y value from 0 to 1023 of the joystick postion.\"\"\"",
"# Read MSB for veritical joystick position",
"# Read LSB for vertical joystick position",
"# mask off bytes and combine into 10-bit integer",
"\"\"\"0 if button is down, 1 if button is up.\"\"\"",
"# Issue: register 0x08 always contains 1 for some reason, even when cleared",
"\"\"\"1 if button pressed between reads, cleared after read.\"\"\"",
"# read button status (since last check)",
"# clear button status",
"# public functions",
"\"\"\"Change the i2c address of Joystick snd return True if successful.\"\"\"",
"# check range of new address",
"# write magic number 0x13 to lock register, to unlock address for update",
"# write new address",
"# wait a second for joystick to settle after change",
"# try to re-create new i2c device at new address",
"# if we made it here, everything went fine",
"# No i2c begin function is needed since I2Cdevice class takes care of that",
"# private functions",
"# Read and return a byte from the specified 8-bit register address.",
"# For some reason, write_then_readinto returns invalid data",
"# device.write_then_readinto(bytes([addr & 0xFF]), result)",
"# Write a byte to the specified 8-bit register address"
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
ecaac8c49f868073f9f6b89fcb22c74fdb9c6a04 | benhid/d4s | d4s/storage.py | [
"MIT"
] | Python | Item |
Class representing store hub files.
| Class representing store hub files. | [
"Class",
"representing",
"store",
"hub",
"files",
"."
] | class Item:
"""
Class representing store hub files.
"""
def __init__(self, id: str, base_url: str):
self.id = id
self.base_url = base_url
@cached_property
def public_url(self):
""" Get public url from item in workspace.
"""
url = f'{self.base_url}/workspace/items/{self.id}/publiclink?gcube-token={self.token}'
x = requests.get(url)
# for some reason, the response returns an url with surrounding quote marks
return x.text[1:-1]
@property
def token(self):
return context.token | [
"class",
"Item",
":",
"def",
"__init__",
"(",
"self",
",",
"id",
":",
"str",
",",
"base_url",
":",
"str",
")",
":",
"self",
".",
"id",
"=",
"id",
"self",
".",
"base_url",
"=",
"base_url",
"@",
"cached_property",
"def",
"public_url",
"(",
"self",
")",
":",
"\"\"\" Get public url from item in workspace.\n \"\"\"",
"url",
"=",
"f'{self.base_url}/workspace/items/{self.id}/publiclink?gcube-token={self.token}'",
"x",
"=",
"requests",
".",
"get",
"(",
"url",
")",
"return",
"x",
".",
"text",
"[",
"1",
":",
"-",
"1",
"]",
"@",
"property",
"def",
"token",
"(",
"self",
")",
":",
"return",
"context",
".",
"token"
] | Class representing store hub files. | [
"Class",
"representing",
"store",
"hub",
"files",
"."
] | [
"\"\"\"\n Class representing store hub files.\n \"\"\"",
"\"\"\" Get public url from item in workspace.\n \"\"\"",
"# for some reason, the response returns an url with surrounding quote marks"
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
ecb3d6f73859938d78658fa282749298aab75d9d | darthryking/VMFMergeTool | gui.py | [
"BSD-3-Clause"
] | Python | VMFCache | An expandable-size cache for VMFs. This lets us skip the load process
for VMFs that we've already loaded before, which is helpful for VMFs that
take a long time to parse.
| An expandable-size cache for VMFs. This lets us skip the load process
for VMFs that we've already loaded before, which is helpful for VMFs that
take a long time to parse. | [
"An",
"expandable",
"-",
"size",
"cache",
"for",
"VMFs",
".",
"This",
"lets",
"us",
"skip",
"the",
"load",
"process",
"for",
"VMFs",
"that",
"we",
"'",
"ve",
"already",
"loaded",
"before",
"which",
"is",
"helpful",
"for",
"VMFs",
"that",
"take",
"a",
"long",
"time",
"to",
"parse",
"."
] | class VMFCache:
""" An expandable-size cache for VMFs. This lets us skip the load process
for VMFs that we've already loaded before, which is helpful for VMFs that
take a long time to parse.
"""
def __init__(self):
self.maxSize = 1
self.data = {}
self.unusedPaths = set()
self.pendingUnusedPaths = set()
self._mutex = RLock()
def increase_max_size(self, maxSize):
''' Increases the max size of the cache to the given number.
If the requested max size is less than the current size, this does
nothing.
'''
with self._mutex:
if maxSize > self.maxSize:
self.set_max_size(maxSize)
def set_max_size(self, maxSize):
with self._mutex:
if maxSize < self.get_vmf_count():
raise ValueError("Can't clear enough unused entries!")
self.evict_unused()
self.maxSize = maxSize
assert len(self.data) <= self.maxSize
def add_vmf(self, vmf):
vmfPath = vmf.path
with self._mutex:
assert len(self.data) <= self.maxSize
if vmfPath in self.pendingUnusedPaths:
# This VMF has been preemptively marked as unused.
# Don't bother caching it.
self.pendingUnusedPaths.remove(vmfPath)
return
if len(self.data) >= self.maxSize:
if len(self.unusedPaths) > 0:
self.evict_unused(limit=1)
else:
raise ValueError("VMF cache limit reached!")
self.data[vmfPath] = vmf
assert len(self.data) <= self.maxSize
def mark_used(self, *vmfPaths):
with self._mutex:
for vmfPath in vmfPaths:
if vmfPath in self.unusedPaths:
self.unusedPaths.remove(vmfPath)
def mark_unused(self, *vmfPaths):
with self._mutex:
for vmfPath in vmfPaths:
if vmfPath in self.data:
self.unusedPaths.add(vmfPath)
else:
self.pendingUnusedPaths.add(vmfPath)
def evict_unused(self, limit=float('inf')):
with self._mutex:
for i, unusedPath in enumerate(set(self.unusedPaths)):
if i >= limit:
break
del self.data[unusedPath]
self.unusedPaths.remove(unusedPath)
print("Evicted", unusedPath)
assert len(self.data) <= self.maxSize
def has_vmf_path(self, path):
with self._mutex:
return path in self.data
def get_vmfs(self):
with self._mutex:
return [
vmf for vmf in self.data.values()
if vmf.path not in self.unusedPaths
]
def get_vmf_count(self):
with self._mutex:
return len(self.data) - len(self.unusedPaths) | [
"class",
"VMFCache",
":",
"def",
"__init__",
"(",
"self",
")",
":",
"self",
".",
"maxSize",
"=",
"1",
"self",
".",
"data",
"=",
"{",
"}",
"self",
".",
"unusedPaths",
"=",
"set",
"(",
")",
"self",
".",
"pendingUnusedPaths",
"=",
"set",
"(",
")",
"self",
".",
"_mutex",
"=",
"RLock",
"(",
")",
"def",
"increase_max_size",
"(",
"self",
",",
"maxSize",
")",
":",
"''' Increases the max size of the cache to the given number.\n If the requested max size is less than the current size, this does\n nothing.\n \n '''",
"with",
"self",
".",
"_mutex",
":",
"if",
"maxSize",
">",
"self",
".",
"maxSize",
":",
"self",
".",
"set_max_size",
"(",
"maxSize",
")",
"def",
"set_max_size",
"(",
"self",
",",
"maxSize",
")",
":",
"with",
"self",
".",
"_mutex",
":",
"if",
"maxSize",
"<",
"self",
".",
"get_vmf_count",
"(",
")",
":",
"raise",
"ValueError",
"(",
"\"Can't clear enough unused entries!\"",
")",
"self",
".",
"evict_unused",
"(",
")",
"self",
".",
"maxSize",
"=",
"maxSize",
"assert",
"len",
"(",
"self",
".",
"data",
")",
"<=",
"self",
".",
"maxSize",
"def",
"add_vmf",
"(",
"self",
",",
"vmf",
")",
":",
"vmfPath",
"=",
"vmf",
".",
"path",
"with",
"self",
".",
"_mutex",
":",
"assert",
"len",
"(",
"self",
".",
"data",
")",
"<=",
"self",
".",
"maxSize",
"if",
"vmfPath",
"in",
"self",
".",
"pendingUnusedPaths",
":",
"self",
".",
"pendingUnusedPaths",
".",
"remove",
"(",
"vmfPath",
")",
"return",
"if",
"len",
"(",
"self",
".",
"data",
")",
">=",
"self",
".",
"maxSize",
":",
"if",
"len",
"(",
"self",
".",
"unusedPaths",
")",
">",
"0",
":",
"self",
".",
"evict_unused",
"(",
"limit",
"=",
"1",
")",
"else",
":",
"raise",
"ValueError",
"(",
"\"VMF cache limit reached!\"",
")",
"self",
".",
"data",
"[",
"vmfPath",
"]",
"=",
"vmf",
"assert",
"len",
"(",
"self",
".",
"data",
")",
"<=",
"self",
".",
"maxSize",
"def",
"mark_used",
"(",
"self",
",",
"*",
"vmfPaths",
")",
":",
"with",
"self",
".",
"_mutex",
":",
"for",
"vmfPath",
"in",
"vmfPaths",
":",
"if",
"vmfPath",
"in",
"self",
".",
"unusedPaths",
":",
"self",
".",
"unusedPaths",
".",
"remove",
"(",
"vmfPath",
")",
"def",
"mark_unused",
"(",
"self",
",",
"*",
"vmfPaths",
")",
":",
"with",
"self",
".",
"_mutex",
":",
"for",
"vmfPath",
"in",
"vmfPaths",
":",
"if",
"vmfPath",
"in",
"self",
".",
"data",
":",
"self",
".",
"unusedPaths",
".",
"add",
"(",
"vmfPath",
")",
"else",
":",
"self",
".",
"pendingUnusedPaths",
".",
"add",
"(",
"vmfPath",
")",
"def",
"evict_unused",
"(",
"self",
",",
"limit",
"=",
"float",
"(",
"'inf'",
")",
")",
":",
"with",
"self",
".",
"_mutex",
":",
"for",
"i",
",",
"unusedPath",
"in",
"enumerate",
"(",
"set",
"(",
"self",
".",
"unusedPaths",
")",
")",
":",
"if",
"i",
">=",
"limit",
":",
"break",
"del",
"self",
".",
"data",
"[",
"unusedPath",
"]",
"self",
".",
"unusedPaths",
".",
"remove",
"(",
"unusedPath",
")",
"print",
"(",
"\"Evicted\"",
",",
"unusedPath",
")",
"assert",
"len",
"(",
"self",
".",
"data",
")",
"<=",
"self",
".",
"maxSize",
"def",
"has_vmf_path",
"(",
"self",
",",
"path",
")",
":",
"with",
"self",
".",
"_mutex",
":",
"return",
"path",
"in",
"self",
".",
"data",
"def",
"get_vmfs",
"(",
"self",
")",
":",
"with",
"self",
".",
"_mutex",
":",
"return",
"[",
"vmf",
"for",
"vmf",
"in",
"self",
".",
"data",
".",
"values",
"(",
")",
"if",
"vmf",
".",
"path",
"not",
"in",
"self",
".",
"unusedPaths",
"]",
"def",
"get_vmf_count",
"(",
"self",
")",
":",
"with",
"self",
".",
"_mutex",
":",
"return",
"len",
"(",
"self",
".",
"data",
")",
"-",
"len",
"(",
"self",
".",
"unusedPaths",
")"
] | An expandable-size cache for VMFs. | [
"An",
"expandable",
"-",
"size",
"cache",
"for",
"VMFs",
"."
] | [
"\"\"\" An expandable-size cache for VMFs. This lets us skip the load process\n for VMFs that we've already loaded before, which is helpful for VMFs that\n take a long time to parse.\n \n \"\"\"",
"''' Increases the max size of the cache to the given number.\n If the requested max size is less than the current size, this does\n nothing.\n \n '''",
"# This VMF has been preemptively marked as unused.",
"# Don't bother caching it."
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
ecc09f1629ed770a0be9fae5c249608721a3c6c2 | mithem/helix-cli | statehandler.py | [
"MIT"
] | Python | StateHandler | utilities commonly used when working with states | utilities commonly used when working with states | [
"utilities",
"commonly",
"used",
"when",
"working",
"with",
"states"
] | class StateHandler:
"""utilities commonly used when working with states"""
def getDateString(date):
"""returns iso-date-string of specified date"""
return str(f"{date.year}-{date.month}-{date.day}")
def getAppropriateState(title):
"""returns appropriate state depending of due_date and deadline"""
if ItemHandler.getProperty(title, "due_date") == StateHandler.getDateString(datetime.now()):
return "active"
elif ItemHandler.getProperty(title, "due_date") == None:
return "upcoming"
elif ItemHandler.getProperty(title, "deadline") == None:
return "upcoming"
elif ItemHandler.getProperty(title, "deadline") == StateHandler.getDateString(datetime.now()):
return "urgent" | [
"class",
"StateHandler",
":",
"def",
"getDateString",
"(",
"date",
")",
":",
"\"\"\"returns iso-date-string of specified date\"\"\"",
"return",
"str",
"(",
"f\"{date.year}-{date.month}-{date.day}\"",
")",
"def",
"getAppropriateState",
"(",
"title",
")",
":",
"\"\"\"returns appropriate state depending of due_date and deadline\"\"\"",
"if",
"ItemHandler",
".",
"getProperty",
"(",
"title",
",",
"\"due_date\"",
")",
"==",
"StateHandler",
".",
"getDateString",
"(",
"datetime",
".",
"now",
"(",
")",
")",
":",
"return",
"\"active\"",
"elif",
"ItemHandler",
".",
"getProperty",
"(",
"title",
",",
"\"due_date\"",
")",
"==",
"None",
":",
"return",
"\"upcoming\"",
"elif",
"ItemHandler",
".",
"getProperty",
"(",
"title",
",",
"\"deadline\"",
")",
"==",
"None",
":",
"return",
"\"upcoming\"",
"elif",
"ItemHandler",
".",
"getProperty",
"(",
"title",
",",
"\"deadline\"",
")",
"==",
"StateHandler",
".",
"getDateString",
"(",
"datetime",
".",
"now",
"(",
")",
")",
":",
"return",
"\"urgent\""
] | utilities commonly used when working with states | [
"utilities",
"commonly",
"used",
"when",
"working",
"with",
"states"
] | [
"\"\"\"utilities commonly used when working with states\"\"\"",
"\"\"\"returns iso-date-string of specified date\"\"\"",
"\"\"\"returns appropriate state depending of due_date and deadline\"\"\""
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
ecd45ebb5de12537c1bdbc107787ed72f93490f1 | lnros/real-estate-web-scraping | config.py | [
"MIT"
] | Python | DBConfig |
Holds the DB parameters for the web scraping.
| Holds the DB parameters for the web scraping. | [
"Holds",
"the",
"DB",
"parameters",
"for",
"the",
"web",
"scraping",
"."
] | class DBConfig:
"""
Holds the DB parameters for the web scraping.
"""
HOST = "localhost"
USER = "root"
PASSWORD = "password" # not real password, change after pulling this file
DATABASE = "brbeky1hybvf32t4ufxz"
INSERT_CITY_QUERY = "INSERT IGNORE INTO cities(city_name) values (%s)"
INSERT_LISTINGS_QUERY = "INSERT IGNORE INTO listings(listing_type) values (%s)"
INSERT_PROPERTY_TYPES_QUERY = "INSERT IGNORE INTO property_types(property_type) values (%s)"
FK_IDS_LIST = ['listing_id', 'property_type_id', 'city_id']
PRICE_COLUMN_IDX = 3
LATITUDE_COLUMN_IDX = -5
GET_LISTING_TYPE_ID_QUERY = "SELECT id FROM listings WHERE listing_type = %s"
GET_PROPERTY_TYPE_ID_QUERY = "SELECT id FROM property_types WHERE property_type = %s"
GET_CITY_ID_QUERY = "SELECT id FROM cities WHERE city_name = %s"
TUPLE_FIRST_ELEMENT_IDX = 0
LISTING_TYPE_IDX = 0
PROPERTY_TYPE_IDX = 1
CITY_IDX = 2
SEPARATOR = ","
TABLE_FEEDER_COLUMN_IDX = 3 | [
"class",
"DBConfig",
":",
"HOST",
"=",
"\"localhost\"",
"USER",
"=",
"\"root\"",
"PASSWORD",
"=",
"\"password\"",
"DATABASE",
"=",
"\"brbeky1hybvf32t4ufxz\"",
"INSERT_CITY_QUERY",
"=",
"\"INSERT IGNORE INTO cities(city_name) values (%s)\"",
"INSERT_LISTINGS_QUERY",
"=",
"\"INSERT IGNORE INTO listings(listing_type) values (%s)\"",
"INSERT_PROPERTY_TYPES_QUERY",
"=",
"\"INSERT IGNORE INTO property_types(property_type) values (%s)\"",
"FK_IDS_LIST",
"=",
"[",
"'listing_id'",
",",
"'property_type_id'",
",",
"'city_id'",
"]",
"PRICE_COLUMN_IDX",
"=",
"3",
"LATITUDE_COLUMN_IDX",
"=",
"-",
"5",
"GET_LISTING_TYPE_ID_QUERY",
"=",
"\"SELECT id FROM listings WHERE listing_type = %s\"",
"GET_PROPERTY_TYPE_ID_QUERY",
"=",
"\"SELECT id FROM property_types WHERE property_type = %s\"",
"GET_CITY_ID_QUERY",
"=",
"\"SELECT id FROM cities WHERE city_name = %s\"",
"TUPLE_FIRST_ELEMENT_IDX",
"=",
"0",
"LISTING_TYPE_IDX",
"=",
"0",
"PROPERTY_TYPE_IDX",
"=",
"1",
"CITY_IDX",
"=",
"2",
"SEPARATOR",
"=",
"\",\"",
"TABLE_FEEDER_COLUMN_IDX",
"=",
"3"
] | Holds the DB parameters for the web scraping. | [
"Holds",
"the",
"DB",
"parameters",
"for",
"the",
"web",
"scraping",
"."
] | [
"\"\"\"\n Holds the DB parameters for the web scraping.\n \"\"\"",
"# not real password, change after pulling this file"
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
ecd45ebb5de12537c1bdbc107787ed72f93490f1 | lnros/real-estate-web-scraping | config.py | [
"MIT"
] | Python | Configuration |
Holds the user parameters for the web scraping.
| Holds the user parameters for the web scraping. | [
"Holds",
"the",
"user",
"parameters",
"for",
"the",
"web",
"scraping",
"."
] | class Configuration:
"""
Holds the user parameters for the web scraping.
"""
# class attr
args = None
# PARAMETERS KWARGS KEYS
VERBOSE_KEY = 'verbose'
LIMIT_KEY = 'limit'
PRINT_KEY = 'to_print'
SAVE_KEY = 'save'
DB_KEY = 'to_database'
FETCH_KEY = 'fetch_info'
LISTING_TYPE_KEY = 'listing_type'
# CONSTANTS FOR SCRAPING
PRINTABLE = set(string.printable)
SILENCE_DRIVER_LOG = '0'
BROWSER_WIDTH = 1919
BROWSER_HEIGHT = 1079
PROPERTY_LISTING_TYPE = ('buy', 'rent', 'commercial', 'new_homes', 'all')
LISTING_MAP = {
'buy': ['buy'],
'rent': ['rent'],
'commercial': ['commercial'],
'new_homes': ['new homes'],
'all': ['buy', 'rent', 'commercial', 'new homes']
}
MAIN_URL = 'https://www.onmap.co.il/en'
URLS = {'buy': MAIN_URL + '/homes/buy',
'rent': MAIN_URL + '/homes/rent',
'commercial': MAIN_URL + '/commercial/rent',
'new homes': MAIN_URL + '/projects'}
COLUMNS_NOT_SELENIUM = ['Date', 'City_name', 'Street_name', 'House_number', 'Bathrooms', 'Rooms', 'Floor',
'Area[m^2]',
'Parking_spots_aboveground', 'Parking_spots_underground', 'Price[NIS]', 'Property_type']
SCROLL_PAUSE_TIME = 1
BETWEEN_URL_PAUSE = 3
SINGLE_ATR_ITEM = 1
TRIVIAL_NUMBER = 0
INVALID_FLOOR_TEXT_SIZE = 1
NOT_SELENIUM_PRINTING_HASH_CONSTANT = 20
NONE = 'none'
DICT_PROPERTY_ID = {'id': 'propertiesList'}
# INDICES FOR PARSING
NOT_SELENIUM_PARSING_FILE_IDX = 0
ELEM_TO_SCROLL_IDX = -1
PRICE_IDX = -1
CITY_IDX = -1
ADDRESS_IDX = -2
PROPERTY_TYPE_IDX = 1
NUM_OF_ROOMS_IDX = 0
FLOOR_IDX = 1
SIZE_IDX = 2
PARKING_SPACES_IDX = 3
FILENAME_IDX = -1
SIZE_TEXT_IDX = 0
NOT_SELENIUM_REGION_IDX = -1
URL_SPLIT_SEPARATOR = '/'
NOT_SELENIUM_SEPARATOR = '.'
SEPARATOR = ", "
PROPERTIES_LIST_IDX = 1
LEN_PROPER = 2
EMPTY = ""
DUMMY_REPLACER = 0
# XPATHS AND SELENIUM COMMANDS
SCROLL_COMMAND = "arguments[0].scrollIntoView();"
PROPERTIES_XPATH = "//div[@style='position: relative;']"
BOTTOM_PAGE_XPATH = "//div[@class='G3BoaHW05R4rguvqgn-Oo']"
# Handling strings
ENCODING = "ISO-8859-8"
COMMERCIAL_FILENAME = "commercial.csv"
NEW_HOMES_FILENAME = "new_homes.csv"
PROJECT = 'project'
COMMERCIAL = 'commercial'
# DF columns names
PRICE_COL = 'Price'
ROOM_COL = 'Rooms'
FLOOR_COL = 'Floor'
AREA_COL = 'Area'
CITY_COL = 'City'
PARKING_COL = 'Parking_spots'
PROP_TYPE_COL = 'Property_type'
LIST_TYPE_COL = 'listing_type'
@classmethod
def define_parser(cls):
"""
Creates the command line arguments
"""
arg_parser = argparse.ArgumentParser(
description="Scraping OnMap website | Checkout https://www.onmap.co.il/en/")
arg_parser.add_argument(
"property_listing_type",
choices=Configuration.PROPERTY_LISTING_TYPE,
help="choose which type of properties you would like to scrape",
type=str)
arg_parser.add_argument('--limit', '-l',
help="limit to n number of scrolls per page", metavar="n",
type=int,
required=False)
arg_parser.add_argument("--print", '-p', help="print the results to the screen", action="store_true")
arg_parser.add_argument("--save", '-s',
help="save the scraped information into a csv file in the same directory",
action="store_true")
arg_parser.add_argument("--database", '-d',
help="inserts new information found into the on_map database",
action="store_true")
arg_parser.add_argument("--fetch", '-f',
help="fetches more information for each property using Nominatim API",
action="store_true")
arg_parser.add_argument("--verbose", '-v', help="prints messages during the scraper execution",
action="store_true")
cls.args = arg_parser.parse_args() | [
"class",
"Configuration",
":",
"args",
"=",
"None",
"VERBOSE_KEY",
"=",
"'verbose'",
"LIMIT_KEY",
"=",
"'limit'",
"PRINT_KEY",
"=",
"'to_print'",
"SAVE_KEY",
"=",
"'save'",
"DB_KEY",
"=",
"'to_database'",
"FETCH_KEY",
"=",
"'fetch_info'",
"LISTING_TYPE_KEY",
"=",
"'listing_type'",
"PRINTABLE",
"=",
"set",
"(",
"string",
".",
"printable",
")",
"SILENCE_DRIVER_LOG",
"=",
"'0'",
"BROWSER_WIDTH",
"=",
"1919",
"BROWSER_HEIGHT",
"=",
"1079",
"PROPERTY_LISTING_TYPE",
"=",
"(",
"'buy'",
",",
"'rent'",
",",
"'commercial'",
",",
"'new_homes'",
",",
"'all'",
")",
"LISTING_MAP",
"=",
"{",
"'buy'",
":",
"[",
"'buy'",
"]",
",",
"'rent'",
":",
"[",
"'rent'",
"]",
",",
"'commercial'",
":",
"[",
"'commercial'",
"]",
",",
"'new_homes'",
":",
"[",
"'new homes'",
"]",
",",
"'all'",
":",
"[",
"'buy'",
",",
"'rent'",
",",
"'commercial'",
",",
"'new homes'",
"]",
"}",
"MAIN_URL",
"=",
"'https://www.onmap.co.il/en'",
"URLS",
"=",
"{",
"'buy'",
":",
"MAIN_URL",
"+",
"'/homes/buy'",
",",
"'rent'",
":",
"MAIN_URL",
"+",
"'/homes/rent'",
",",
"'commercial'",
":",
"MAIN_URL",
"+",
"'/commercial/rent'",
",",
"'new homes'",
":",
"MAIN_URL",
"+",
"'/projects'",
"}",
"COLUMNS_NOT_SELENIUM",
"=",
"[",
"'Date'",
",",
"'City_name'",
",",
"'Street_name'",
",",
"'House_number'",
",",
"'Bathrooms'",
",",
"'Rooms'",
",",
"'Floor'",
",",
"'Area[m^2]'",
",",
"'Parking_spots_aboveground'",
",",
"'Parking_spots_underground'",
",",
"'Price[NIS]'",
",",
"'Property_type'",
"]",
"SCROLL_PAUSE_TIME",
"=",
"1",
"BETWEEN_URL_PAUSE",
"=",
"3",
"SINGLE_ATR_ITEM",
"=",
"1",
"TRIVIAL_NUMBER",
"=",
"0",
"INVALID_FLOOR_TEXT_SIZE",
"=",
"1",
"NOT_SELENIUM_PRINTING_HASH_CONSTANT",
"=",
"20",
"NONE",
"=",
"'none'",
"DICT_PROPERTY_ID",
"=",
"{",
"'id'",
":",
"'propertiesList'",
"}",
"NOT_SELENIUM_PARSING_FILE_IDX",
"=",
"0",
"ELEM_TO_SCROLL_IDX",
"=",
"-",
"1",
"PRICE_IDX",
"=",
"-",
"1",
"CITY_IDX",
"=",
"-",
"1",
"ADDRESS_IDX",
"=",
"-",
"2",
"PROPERTY_TYPE_IDX",
"=",
"1",
"NUM_OF_ROOMS_IDX",
"=",
"0",
"FLOOR_IDX",
"=",
"1",
"SIZE_IDX",
"=",
"2",
"PARKING_SPACES_IDX",
"=",
"3",
"FILENAME_IDX",
"=",
"-",
"1",
"SIZE_TEXT_IDX",
"=",
"0",
"NOT_SELENIUM_REGION_IDX",
"=",
"-",
"1",
"URL_SPLIT_SEPARATOR",
"=",
"'/'",
"NOT_SELENIUM_SEPARATOR",
"=",
"'.'",
"SEPARATOR",
"=",
"\", \"",
"PROPERTIES_LIST_IDX",
"=",
"1",
"LEN_PROPER",
"=",
"2",
"EMPTY",
"=",
"\"\"",
"DUMMY_REPLACER",
"=",
"0",
"SCROLL_COMMAND",
"=",
"\"arguments[0].scrollIntoView();\"",
"PROPERTIES_XPATH",
"=",
"\"//div[@style='position: relative;']\"",
"BOTTOM_PAGE_XPATH",
"=",
"\"//div[@class='G3BoaHW05R4rguvqgn-Oo']\"",
"ENCODING",
"=",
"\"ISO-8859-8\"",
"COMMERCIAL_FILENAME",
"=",
"\"commercial.csv\"",
"NEW_HOMES_FILENAME",
"=",
"\"new_homes.csv\"",
"PROJECT",
"=",
"'project'",
"COMMERCIAL",
"=",
"'commercial'",
"PRICE_COL",
"=",
"'Price'",
"ROOM_COL",
"=",
"'Rooms'",
"FLOOR_COL",
"=",
"'Floor'",
"AREA_COL",
"=",
"'Area'",
"CITY_COL",
"=",
"'City'",
"PARKING_COL",
"=",
"'Parking_spots'",
"PROP_TYPE_COL",
"=",
"'Property_type'",
"LIST_TYPE_COL",
"=",
"'listing_type'",
"@",
"classmethod",
"def",
"define_parser",
"(",
"cls",
")",
":",
"\"\"\"\n Creates the command line arguments\n \"\"\"",
"arg_parser",
"=",
"argparse",
".",
"ArgumentParser",
"(",
"description",
"=",
"\"Scraping OnMap website | Checkout https://www.onmap.co.il/en/\"",
")",
"arg_parser",
".",
"add_argument",
"(",
"\"property_listing_type\"",
",",
"choices",
"=",
"Configuration",
".",
"PROPERTY_LISTING_TYPE",
",",
"help",
"=",
"\"choose which type of properties you would like to scrape\"",
",",
"type",
"=",
"str",
")",
"arg_parser",
".",
"add_argument",
"(",
"'--limit'",
",",
"'-l'",
",",
"help",
"=",
"\"limit to n number of scrolls per page\"",
",",
"metavar",
"=",
"\"n\"",
",",
"type",
"=",
"int",
",",
"required",
"=",
"False",
")",
"arg_parser",
".",
"add_argument",
"(",
"\"--print\"",
",",
"'-p'",
",",
"help",
"=",
"\"print the results to the screen\"",
",",
"action",
"=",
"\"store_true\"",
")",
"arg_parser",
".",
"add_argument",
"(",
"\"--save\"",
",",
"'-s'",
",",
"help",
"=",
"\"save the scraped information into a csv file in the same directory\"",
",",
"action",
"=",
"\"store_true\"",
")",
"arg_parser",
".",
"add_argument",
"(",
"\"--database\"",
",",
"'-d'",
",",
"help",
"=",
"\"inserts new information found into the on_map database\"",
",",
"action",
"=",
"\"store_true\"",
")",
"arg_parser",
".",
"add_argument",
"(",
"\"--fetch\"",
",",
"'-f'",
",",
"help",
"=",
"\"fetches more information for each property using Nominatim API\"",
",",
"action",
"=",
"\"store_true\"",
")",
"arg_parser",
".",
"add_argument",
"(",
"\"--verbose\"",
",",
"'-v'",
",",
"help",
"=",
"\"prints messages during the scraper execution\"",
",",
"action",
"=",
"\"store_true\"",
")",
"cls",
".",
"args",
"=",
"arg_parser",
".",
"parse_args",
"(",
")"
] | Holds the user parameters for the web scraping. | [
"Holds",
"the",
"user",
"parameters",
"for",
"the",
"web",
"scraping",
"."
] | [
"\"\"\"\n Holds the user parameters for the web scraping.\n \"\"\"",
"# class attr",
"# PARAMETERS KWARGS KEYS",
"# CONSTANTS FOR SCRAPING",
"# INDICES FOR PARSING",
"# XPATHS AND SELENIUM COMMANDS",
"# Handling strings",
"# DF columns names",
"\"\"\"\n Creates the command line arguments\n \"\"\""
] | [] | {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
} |
Dataset Summary
The Vault dataset is a comprehensive, large-scale, multilingual parallel dataset that features high-quality code-text pairs derived from The Stack, the largest permissively-licensed source code dataset.
We provide The Vault, which contains code snippets from 10 popular programming languages: Java, JavaScript, Python, Ruby, Rust, Golang, C#, C++, C, and PHP. This dataset provides multiple code-snippet levels, metadata, and 11 docstring styles for enhanced usability and versatility.
Supported Tasks
The Vault can be used for pretraining LLMs or for downstream code-text interaction tasks. A number of tasks related to code understanding and generation can be constructed from The Vault, such as code summarization, text-to-code generation, and code search.
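Each record pairs a code snippet with its docstring, so task-specific (input, target) examples can be derived directly from the rows. A minimal sketch of that mapping, assuming the field names listed under Data Fields below (the helper itself is illustrative, not part of any official tooling):

```python
def to_task_example(row: dict, task: str) -> dict:
    """Turn one The Vault record into an (input, target) pair for a downstream task."""
    if task == "code_summarization":   # code -> natural-language summary
        return {"input": row["code"], "target": row["short_docstring"]}
    if task == "text_to_code":         # docstring -> code generation
        return {"input": row["docstring"], "target": row["code"]}
    if task == "code_search":          # query/candidate pair for retrieval training
        return {"query": row["docstring"], "candidate": row["code"]}
    raise ValueError(f"Unknown task: {task}")
```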
Languages
The natural language text (docstring) is in English.
10 programming languages are supported in The Vault: Python, Java, JavaScript, PHP, C, C#, C++, Go, Ruby, and Rust.
Note: C and Go are not contained in this repo because these languages have no traditional class construct.
Dataset Structure
Data Instances
{
"hexsha": "78b961a6673ec1e12f8d95c33ef081f75561a87c",
"repo": "AIS-Bonn/sl-cutscenes",
"path": "sl_cutscenes/object_models.py",
"license": [
"MIT"
],
"language": "Python",
"identifier": "MeshLoader",
"original_docstring": "\n Class to load the meshes for the objects in a scene.\n ",
"docstring": "Class to load the meshes for the objects in a scene.",
"docstring_tokens": [
"Class",
"to",
"load",
"the",
"meshes",
"for",
"the",
"objects",
"in",
"a",
"scene",
"."
],
"code": "class MeshLoader:\n \"\"\"\n Class to load the meshes for the objects in a scene.\n \"\"\"\n\n def __init__(self):\n \"\"\"Module initializer\"\"\"\n self.base_dir = CONSTANTS.MESH_BASE_DIR\n self.text_dir = CONSTANTS.TEXT_BASE_DIR\n self.reset()\n\n def reset(self):\n self.loaded_meshes = []\n\n def get_meshes(self):\n \"\"\" \"\"\"\n extract_singular = lambda x: x[0] if len(x) == 1 else x\n return [extract_singular(item) for item in self.loaded_meshes]\n\n def load_meshes(self, obj_info: List[object_info.ObjectInfo], **kwargs):\n \"\"\"\n Loads the meshes whose information is given in parameter 'obj_info.\n Each call of this method APPENDS a list to the loaded_meshes attribute.\n :param obj_info: The object information of the meshes to be loaded.\n :param kwargs: additional mesh modifiers such as scale, specified with a leading 'mod_'\n \"\"\"\n paths = []\n for obj in obj_info:\n path = self.text_dir if obj.name.endswith(\"_floor\") or obj.name.endswith(\"_wall\") else self.base_dir\n paths.append((path / obj.mesh_fp).resolve())\n scales = [obj.scale for obj in obj_info]\n class_ids = [obj.class_id for obj in obj_info]\n mod_scales = kwargs.get(\"mod_scale\", [1.0] * len(scales))\n scales = [s * ms for (s, ms) in zip(scales, mod_scales)]\n flags = [mesh_flags(obj) for obj in obj_info]\n meshes = sl.Mesh.load_threaded(filenames=paths, flags=flags)\n\n # Setup class IDs\n for _, (mesh, scale, class_id) in enumerate(zip(meshes, scales, class_ids)):\n pt = torch.eye(4)\n pt[:3, :3] *= scale\n mesh.pretransform = pt\n mesh.class_index = class_id\n\n info_mesh_tuples = list(zip(obj_info, meshes))\n self.loaded_meshes.append(info_mesh_tuples)",
"code_tokens": [
"class",
"MeshLoader",
":",
"def",
"__init__",
"(",
"self",
")",
":",
"\"\"\"Module initializer\"\"\"",
"self",
".",
"base_dir",
"=",
"CONSTANTS",
".",
"MESH_BASE_DIR",
"self",
".",
"text_dir",
"=",
"CONSTANTS",
".",
"TEXT_BASE_DIR",
"self",
".",
"reset",
"(",
")",
"def",
"reset",
"(",
"self",
")",
":",
"self",
".",
"loaded_meshes",
"=",
"[",
"]",
"def",
"get_meshes",
"(",
"self",
")",
":",
"\"\"\" \"\"\"",
"extract_singular",
"=",
"lambda",
"x",
":",
"x",
"[",
"0",
"]",
"if",
"len",
"(",
"x",
")",
"==",
"1",
"else",
"x",
"return",
"[",
"extract_singular",
"(",
"item",
")",
"for",
"item",
"in",
"self",
".",
"loaded_meshes",
"]",
"def",
"load_meshes",
"(",
"self",
",",
"obj_info",
":",
"List",
"[",
"object_info",
".",
"ObjectInfo",
"]",
",",
"**",
"kwargs",
")",
":",
"\"\"\"\n Loads the meshes whose information is given in parameter 'obj_info.\n Each call of this method APPENDS a list to the loaded_meshes attribute.\n :param obj_info: The object information of the meshes to be loaded.\n :param kwargs: additional mesh modifiers such as scale, specified with a leading 'mod_'\n \"\"\"",
"paths",
"=",
"[",
"]",
"for",
"obj",
"in",
"obj_info",
":",
"path",
"=",
"self",
".",
"text_dir",
"if",
"obj",
".",
"name",
".",
"endswith",
"(",
"\"_floor\"",
")",
"or",
"obj",
".",
"name",
".",
"endswith",
"(",
"\"_wall\"",
")",
"else",
"self",
".",
"base_dir",
"paths",
".",
"append",
"(",
"(",
"path",
"/",
"obj",
".",
"mesh_fp",
")",
".",
"resolve",
"(",
")",
")",
"scales",
"=",
"[",
"obj",
".",
"scale",
"for",
"obj",
"in",
"obj_info",
"]",
"class_ids",
"=",
"[",
"obj",
".",
"class_id",
"for",
"obj",
"in",
"obj_info",
"]",
"mod_scales",
"=",
"kwargs",
".",
"get",
"(",
"\"mod_scale\"",
",",
"[",
"1.0",
"]",
"*",
"len",
"(",
"scales",
")",
")",
"scales",
"=",
"[",
"s",
"*",
"ms",
"for",
"(",
"s",
",",
"ms",
")",
"in",
"zip",
"(",
"scales",
",",
"mod_scales",
")",
"]",
"flags",
"=",
"[",
"mesh_flags",
"(",
"obj",
")",
"for",
"obj",
"in",
"obj_info",
"]",
"meshes",
"=",
"sl",
".",
"Mesh",
".",
"load_threaded",
"(",
"filenames",
"=",
"paths",
",",
"flags",
"=",
"flags",
")",
"for",
"_",
",",
"(",
"mesh",
",",
"scale",
",",
"class_id",
")",
"in",
"enumerate",
"(",
"zip",
"(",
"meshes",
",",
"scales",
",",
"class_ids",
")",
")",
":",
"pt",
"=",
"torch",
".",
"eye",
"(",
"4",
")",
"pt",
"[",
":",
"3",
",",
":",
"3",
"]",
"*=",
"scale",
"mesh",
".",
"pretransform",
"=",
"pt",
"mesh",
".",
"class_index",
"=",
"class_id",
"info_mesh_tuples",
"=",
"list",
"(",
"zip",
"(",
"obj_info",
",",
"meshes",
")",
")",
"self",
".",
"loaded_meshes",
".",
"append",
"(",
"info_mesh_tuples",
")"
],
"short_docstring": "Class to load the meshes for the objects in a scene.",
"short_docstring_tokens": [
"Class",
"to",
"load",
"the",
"meshes",
"for",
"the",
"objects",
"in",
"a",
"scene",
"."
],
"comment": [
"\"\"\"\n Class to load the meshes for the objects in a scene.\n \"\"\"",
"\"\"\"Module initializer\"\"\"",
"\"\"\" \"\"\"",
"\"\"\"\n Loads the meshes whose information is given in parameter 'obj_info.\n Each call of this method APPENDS a list to the loaded_meshes attribute.\n :param obj_info: The object information of the meshes to be loaded.\n :param kwargs: additional mesh modifiers such as scale, specified with a leading 'mod_'\n \"\"\"",
"# Setup class IDs"
],
"parameters": [],
"docstring_params": {
"returns": [],
"raises": [],
"params": [],
"outlier_params": [],
"others": []
}
}
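As a quick illustration of how the docstring fields in the example above relate to each other, the snippet below is a minimal sketch: the string is copied from "original_docstring", and the processing steps are an assumption for illustration only, not a description of the dataset's actual extraction pipeline.
from datasets import load_dataset  # not needed for this snippet; shown for context only

# Copied from the "original_docstring" field of the instance above.
original = "\n    Class to load the meshes for the objects in a scene.\n    "
docstring = " ".join(original.split())        # collapse surrounding whitespace (assumed cleaning step)
short_docstring = docstring.splitlines()[0]   # first line; identical here because the docstring is one line
assert docstring == short_docstring == "Class to load the meshes for the objects in a scene."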
Data Fields
Data fields for class level:
- hexsha (string): the unique git hash of file
- repo (string): the owner/repo
- path (string): the full path to the original file
- license (list): licenses in the repo
- language (string): the programming language
- identifier (string): the function or method name
- original_string (string): original version of function/class node
- original_docstring (string): the raw string before tokenization or parsing
- code (string): the part of the original that is code
- code_tokens (list): tokenized version of `code`
- short_docstring (string): a short, brief summarization (the first line of the docstring)
- short_docstring_tokens (list): tokenized version of `short_docstring`
- docstring (string): the top-level comment or docstring (the docstring without parameter, return, exception fields, etc.)
- docstring_tokens (list): tokenized version of docstring
- comment (list): list of comments (line) inside the function/class
- parameters (list): List of parameters and its type (type can be None)
- docstring_params (dict): Dictionary of the parsed information from docstring
See here for more details and examples.
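As a minimal sketch of how these fields can be accessed, the snippet below assumes the streaming interface of the Hugging Face datasets library shown in the Usage section further down; the printed fields are taken from the field list above.
from datasets import load_dataset

# Stream the class-level data so the full dataset is not downloaded up front.
data = load_dataset("Fsoft-AIC/the-vault-class", streaming=True)

sample = next(iter(data["train"]))
print(sample["identifier"])        # class name, e.g. "MeshLoader"
print(sample["language"])          # programming language of the sample
print(sample["short_docstring"])   # first line of the docstring
print(sample["docstring_params"])  # parsed docstring information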
Data Splits
In this repo, the class level data is not split; it is contained only in a train set.
Dataset Statistics
| Language | Number of samples |
|---|---|
| Python | 422,187 |
| Java | 4,872,485 |
| JavaScript | 291,479 |
| PHP | 1,173,916 |
| C# | 1,437,800 |
| C++ | 174,370 |
| Ruby | 353,859 |
| Rust | 93,311 |
| C | - |
| Go | - |
| TOTAL | 9,121,300 |
Usage
You can load The Vault dataset using the datasets library: pip install datasets
from datasets import load_dataset
# Load full class level dataset
dataset = load_dataset("Fsoft-AIC/the-vault-class")
# Load a specific language subset (e.g. Python)
dataset = load_dataset("Fsoft-AIC/the-vault-class", languages=['Python'])
# dataset streaming
data = load_dataset("Fsoft-AIC/the-vault-class", streaming=True)
for sample in iter(data['train']):
print(sample)
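If you only need to inspect a few records, a small sketch like the one below (assuming the same streaming interface as above) avoids iterating the whole split.
import itertools
from datasets import load_dataset

# Stream the Python subset and print the first three class names with their summaries.
data = load_dataset("Fsoft-AIC/the-vault-class", languages=['Python'], streaming=True)
for sample in itertools.islice(data['train'], 3):
    print(sample['identifier'], "-", sample['short_docstring'])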
A backup of the dataset can be downloaded from Azure blob storage. See Download The Vault from Azure blob storage.
Additional information
Licensing Information
MIT License
Citation Information
@article{manh2023vault,
title={The Vault: A Comprehensive Multilingual Dataset for Advancing Code Understanding and Generation},
author={Manh, Dung Nguyen and Hai, Nam Le and Dau, Anh TV and Nguyen, Anh Minh and Nghiem, Khanh and Guo, Jin and Bui, Nghi DQ},
journal={arXiv preprint arXiv:2305.06156},
year={2023}
}
Contributions
This dataset is developed by the FSOFT AI4Code team.