instruction (stringclasses, 1 value) | input (stringlengths, 437-30.9k) | output (stringlengths, 539-6.78k)
---|---|---|
Please summarize the input | Non-invasive handling of sleep apnea, snoring and emergency situations. A monitoring non-invasive device for handling of sleep apnea, snoring and emergency situations operates for breathing assistance by means of transdermal stimulation of muscle groups including the pectoralis majoris, the serratus anterior, and the abdominal muscles. A wrist mounted version may alarm drivers or others requiring focus or concentration when they fall asleep and may alert a medical center. The invention may have a pulse oximeter on a person's wrist/finger to monitor their breathing while asleep, and in the event of a serious snoring or sleep apnea episode, activate the breathing assistance pulses. What is claimed is:
| 1. A sleep apnea, snoring, emergency situations and breath assistance device configured for use by a person having a body, skin, a mouth, airways, first, second, third, and fourth pairs of abdominal muscles, and four chest muscles including first and second pectoral muscles and first and second serratus anterior muscles, the sleep apnea, snoring, emergency situations and breath assistance device comprising:
a control module having operative electrical connections to a plurality of dermal electrodes configured to be attached to such skin of such person, whereby the control module is in communication with the dermal electrodes, the control module configured so as to be worn on such person's body;
a first one of the plurality of dermal electrodes configured to be disposed on such skin of such person at one such chest muscle;
a second one of the plurality of dermal electrodes configured to be disposed on such skin of such person at one such abdominal muscle;
each of the dermal electrodes configured to deliver a plurality of pulse trains to one such respective muscle;
the control module having a stimulation module operative to send a first pulse train to such chest muscle and a second pulse train to such abdominal muscle;
the first pulse train operative to stimulate such chest muscle so as to cause a first contraction of such chest muscle;
the second pulse train operative to stimulate such abdominal muscle so as to cause a second contraction of such abdominal muscle;
whereby at least one breath is stimulated.
| 2. The sleep apnea, snoring, emergency situations and breath assistance device of claim 1, configured for use with a garment worn on such body by such person, wherein:
the control module, the dermal electrodes and the operative electrical connections are configured so as to be worn on such body concealed within such garment.
| 3. The sleep apnea, snoring, emergency situations and breath assistance device of claim 2, further comprising:
a third one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a second such abdominal muscle;
a fourth one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a third such abdominal muscle;
a fifth one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a fourth such abdominal muscle;
the control module further operative to send the second pulse train to such second, third and fourth abdominal muscles.
| 4. The sleep apnea, snoring, emergency situations and breath assistance device of claim 3, further comprising:
a sixth one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a second such chest muscle;
a seventh one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a third such chest muscle;
an eighth one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a fourth such chest muscle;
the control module further operative to send the first pulse train to such second, third and fourth chest muscles.
| 5. The sleep apnea, snoring, emergency situations and breath assistance device of claim 4, the pulse train further comprising:
a group of pulses consisting of a plurality of individual pulses increasing in amplitude with time, the group of pulses having a duration of 500 ms to 900 ms;
a second time out period of 2 to 3 seconds during which no pulses are sent;
repetitions of the group of pulses and the second time out period for a breath assist time period defined to last either until an autonomic breath occurs or for a period of time of no more than 3 seconds.
| 6. The sleep apnea, snoring, emergency situations and breath assistance device of claim 5, further comprising:
at least one pulse oximeter configured to be attached to such user;
the pulse oximeter sensor in operative communication with the control module;
the control module further comprising an analysis module operative to receive a data from the pulse oximeter sensor and analyze the data to determine if such person is exhibiting an autonomic breath and if such person is not exhibiting an autonomic breath for a period of 3 seconds, the control module further operative to send the pulse trains.
| 7. The sleep apnea, snoring, emergency situations and breath assistance device of claim 6, wherein the pulse oximeter sensor is further operative to alert such person by means of a signal when it sends such pulse trains.
| 8. The sleep apnea, snoring, emergency situations and breath assistance device of claim 5, further comprising:
at least one breath sensor in operative communication with the control module, the breath sensor configured to be disposed on such skin of such person;
the control module further comprising an analysis module operative to receive a data from the breath sensor and analyze the data to determine if such person is exhibiting an autonomic breath and if such person is not exhibiting an autonomic breath, the control module further operative to send the pulse trains.
| 9. The sleep apnea, snoring, emergency situations and breath assistance device of claim 5, further comprising:
at least one blood oxygen level sensor, the blood oxygen level sensor in operative communication with the control module, the blood oxygen level sensor configured to be disposed on such skin of such person;
the control module further comprising an analysis module operative to receive a data from the blood oxygen level sensor and analyze the data to determine if such person is exhibiting an oxygen level indicative of a normal breathing pattern and if such person is not, the control module further operative to send the pulse trains.
| 10. The sleep apnea, snoring, emergency situations and breath assistance device of claim 9, further comprising:
an RF communication module;
the control module having a non-volatile memory and a central processor unit, the analysis module stored in the non-volatile memory, the control module having a start button operative to activate the sleep apnea, snoring, emergency situations and breath assistance device to begin an operating cycle, using a first set of preset operating parameters also stored in the non-volatile memory;
a mobile device having an operative RF connection to the RF communication module of the control module and further having a touch screen operative to display a set of data collected by the device and enable control of the secretion clearance and cough assistance device;
the start button further operative to establish the operative RF connection to the mobile device;
the mobile device having a module operative to provide wireless control of the operation of the control module;
the mobile device operative to collect data, provide for wireless setup and wireless maintenance of the breath assistance device.
| 11. The sleep apnea, snoring, emergency situations and breath assistance device of claim 10, wherein the mobile device is operative to provide control of the control module by one mode selected from the group consisting of: manual control input to the mobile device and the control module, manual control input to the mobile device and from the mobile device to the control module, adaptive heuristic control by an artificial intelligence module loaded in the mobile device and the control module, adaptive heuristic control by an artificial intelligence module loaded in the mobile device and from the mobile device to the control module, remote control from a remote location via communication with the mobile device and from the mobile device to the control module, and combinations thereof.
| 12. The sleep apnea, snoring, emergency situations and breath assistance device of claim 10, wherein the control module is further operative to alert such person by means of a signal from such mobile device when it sends such pulse trains.
| 13. The sleep apnea, snoring, emergency situations and breath assistance device of claim 10, configured for use with a vehicle being driven by such person, such vehicle having autonomous driving capability, wherein the control module further comprises:
a communication protocol allowing the control module to control such vehicle;
the control module operative to assume control of such vehicle when it sends such pulse trains.
| 14. The sleep apnea, snoring, emergency situations and breath assistance device of claim 13, wherein the communication protocol further comprises one member selected from the group consisting of: V2X, Bluetooth, WiFi, and combinations thereof.
| 15. The sleep apnea, snoring, emergency situations and breath assistance device of claim 10, configured for use by such person in a job requiring attention and focus.
| 16. The sleep apnea, snoring, emergency situations and breath assistance device of claim 5, further comprising:
at least one blood pressure sensor, the blood pressure sensor in operative communication with the control module, the blood pressure sensor configured to be disposed on such skin of such person;
the control module further comprising an analysis module operative to receive a data from the blood pressure sensor and analyze the data to determine if such person is exhibiting normal autonomic breathing and if such person is not exhibiting normal autonomic breathing, the control module further operative to send the pulse trains.
| 17. The sleep apnea, snoring, emergency situations and breath assistance device of claim 5, further comprising:
at least one heart rate sensor, the heart rate sensor in operative communication with the control module, the heart rate sensor configured to be disposed on such skin of such person;
the control module further comprising an analysis module operative to receive a data from the heart rate sensor and analyze the data to determine if such person is exhibiting normal autonomic breathing and if such person is not exhibiting normal autonomic breathing, the control module further operative to send the pulse trains. | The device has a main portion having a shape dimensioned and configured to be worn on such arm. A control module (1514) includes a CPU within the device main portion, and has operative electrical connections to a first electrode. The first electrode is in contact with arm. A control module comprises an analysis module operative to receive data from the sensor and analyze the data to determine if such a person (1500) exhibits autonomic breath and if such person is not exhibiting such autonomic breath. The control module is operative to carry out task related to sending of a first pulse train to the first electrode, making an alert noise, alert vibration, and communicating with a vehicle and a first preferred remote terminal through a RF communication module (1516). An INDEPENDENT CLAIM is included for a method of breath assistance for use by a person. Sleep apnea, snoring, emergency situations and breath assistance device for use by person in autonomic breath. The control module is operative to alert person by audible/vibration/transmitted signal when it sends such pulse trains. The device is efficient to have the stimulation happen concurrently with the breathing or perhaps even after, or there are multiple rounds of stimulation for each breath, and so on. The device analyzes the monitored data, and examines the stimulation history, and then actually optimizes the parameters of the stimulation, thus providing a unique and optimized stimulation from moment to moment or from breath to breath. The communication protocol is selected from group consisting of Wireless Fidelity (Wi-Fi) , Bluetooth standards. The drawing shows a front view of the sleep apnea, snoring, emergency situations and breath assistance device. 1500Person1514Control module1516RF communication module1518Control device1520Sensor |
Please summarize the input | SYSTEMS AND METHODS FOR IMPROVED OPERATION OF A WORKING VEHICLE. Various apparatus and procedures for improved operation of a working vehicle are provided. One embodiment provides for vehicle-to-vehicle communications using cellular modems to provide information from one vehicle to another vehicle that has lost internet connectivity. Another embodiment provides a method for improving safety of a work area where an autonomous or remotely controlled vehicle is operating by scanning for unknown Bluetooth modules in the vicinity of a working vehicle. Another embodiment provides for intercepting and modifying signals from vehicle controls and passing the modified signals to a control unit of the vehicle. | 1. A system for performing a work operation in a work area comprising:
a plurality of vehicles wherein each vehicle is equipped with a GNSS unit and a modem, and wherein the modem of each vehicle is configured to receive location corrections from an RTK network;
a processor connected to each vehicle, wherein each processor is configured to receive location information from its respective GNSS unit and location corrections from its respective modem;
wherein each processor is further configured to run mission plan software for controlling operation of its respective vehicle; and
wherein each processor is further configured to detect a loss of connection to the RTK network, connect to a local wireless network, and query any other vehicle of the plurality of vehicles for location corrections.
| 2. The system of claim 1 wherein real-time collision avoidance information is communicated in addition to the location corrections.
| 3. The system of claim 1 wherein dynamic job optimization information is communicated in addition to the location corrections.
| 4. The system of claim 1 wherein the plurality of vehicles comprise autonomous vehicles.
| 5. A system for improving safety in a work area comprising:
one or more vehicles wherein each vehicle is equipped with a Bluetooth module configured to send and receive signals from other Bluetooth modules;
a processor connected to each vehicle, wherein each processor is configured to communicate with its respective Bluetooth module; and
wherein each processor is further configured to shut down its respective vehicle if a signal transmitted by an unknown Bluetooth module is detected by the vehicle's Bluetooth module.
| 6. The method of claim 5 wherein the one or more vehicles is autonomous.
| 7. The method of claim 5 wherein the one or more vehicles is remotely controlled.
| 8. A method for improving safety in a work area comprising:
operating one or more vehicles in the work area, wherein each vehicle is equipped with a vehicle Bluetooth module configured to send and receive signals from other Bluetooth modules, and each vehicle is equipped with a processor configured to communicate with its respective vehicle Bluetooth module;
scanning for Bluetooth signals transmitted by one or more other Bluetooth modules; and
shutting down the one or more vehicles if its associated vehicle Bluetooth module receives a signal transmitted by an unknown Bluetooth module.
| 9. The method of claim 5 wherein the plurality of vehicles is autonomous.
| 10. The method of claim 5 wherein the plurality of vehicles is remotely controlled.
| 11. A method for autonomously controlling a vehicle comprising:
providing an interceptor configured to intercept one or more messages communicated by one or more armrest controls of the vehicle to an engine control unit of the vehicle;
inserting autonomous control instructions into the one or more intercepted messages to create a modified message; and
communicating the modified message to the engine control unit of the vehicle.
| 12. The method of claim 11 wherein the one or more messages communicated by one or more armrest controls of the vehicle to the engine control unit of the vehicle and the modified message are communicated on a CAN bus of the vehicle. | The system has multiple vehicles where each vehicle (10) is connected with a global navigation satellite system (GNSS) unit (40) and a modem (710), where the modem of each vehicle is configured to receive location corrections from an RTK network. A processor is connected to each vehicle, where each processor is configured to receive location information from the respective GNSS unit and location corrections from its respective modem. Each processor is configured to run mission plan software for controlling operation of its respective vehicle. Each processor is configured to detect a loss of connection to the RTK network, connect to a local wireless network, and query any other vehicle of the plurality of vehicles for location corrections. INDEPENDENT CLAIMS are included for: (1) a system for improving safety in a work area; (2) the method for improving safety in a work area; (3) a method for autonomously controlling a vehicle. System for performing work operation in work area performed using a manned vehicle or by an autonomous or remotely controlled vehicle such as agricultural vehicle, a mower. The operator of the vehicle makes judgment calls about selecting a safe evacuation location and steering the vehicle quickly toward the evacuation location while avoiding injury or damage to the vehicle or to objects or people in the vehicle path. The speed of vehicle is reduced to avoid damage to vehicle, when readings captured by GNSS unit indicate that the vehicle is approaching or within the slow zone. The drawing shows a schematic view of system for performing work operation in work area, performed using a manned vehicle.10Vehicle 20Control implement 30Computer 35Microprocessor 40GNSS unit 700Base station 710Modem
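Claim 1 of this record describes a fallback in which a vehicle that loses its RTK network connection joins a local wireless network and asks the other vehicles in the fleet for location corrections. A minimal sketch of that decision path follows; the `RtkCorrection` fields and the modem/peer query helpers are assumptions made for illustration, not structures taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class RtkCorrection:
    reference_time: float
    offsets: tuple            # assumed (dx, dy, dz) correction in metres

class CorrectionSource:
    """Per-vehicle processor logic sketched from claim 1."""

    def __init__(self, modem, peers: Sequence):
        self.modem = modem    # hypothetical cellular modem carrying the RTK feed
        self.peers = peers    # other fleet vehicles reachable over the local network

    def get_corrections(self) -> Optional[RtkCorrection]:
        if self.modem.rtk_connected():            # assumed modem API
            return self.modem.read_rtk_correction()
        # Loss of RTK connection detected: query any other vehicle (claim 1).
        self._join_local_wireless_network()
        for peer in self.peers:
            correction = peer.last_correction()   # assumed peer query over the local network
            if correction is not None:
                return correction
        return None

    def _join_local_wireless_network(self) -> None:
        """Placeholder for connecting to the local wireless network."""
        pass
```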
Please summarize the input | Mobile payment system for traffic prioritization in self-driving vehicles. A self-driving or autonomous vehicle transmits a vehicle-to-vehicle offer message from a user of a vehicle-connected mobile communication device riding in the self-driving vehicle to a second user of a second mobile communication device riding in a second vehicle to pay for a traffic prioritization relative to the second vehicle. The first mobile communication device receives a reply message and sends a payment to the second mobile communication device or an account associated with the second mobile communication device to obtain the traffic prioritization relative to the other vehicle. For example, the traffic prioritization may enable one vehicle to pass the other vehicle, to take precedence at an intersection or to be given priority to take a parking place or any other traffic-related advantage. The invention claimed is:
| 1. A non-transitory computer-readable medium storing computer-readable instructions in code which when executed by a processor of a mobile communication device cause the mobile communication device to:
communicatively connect to a mobile communication interface of a self-driving vehicle in which the mobile communication device is located;
receive user input defining an offer to pay for a traffic prioritization that prioritizes the self-driving vehicle relative to a second vehicle;
automatically generate an offer message in response to the user input;
automatically transmit the offer message to the mobile communication interface of the self-driving vehicle for communicating the offer message to a second mobile communication device in the second vehicle;
automatically receive a reply message from the second mobile communication device;
automatically determine if the reply message constitutes an acceptance or rejection of the offer; and
in response to determining that the reply message indicates the acceptance of the offer, send a payment to the second mobile communication device or to an account associated with the second mobile communication device to pay for the traffic prioritization.
| 2. The non-transitory computer-readable medium of claim 1 further comprising code that causes the mobile communication device to receive a confirmation message to confirm receipt of the payment.
| 3. The non-transitory computer-readable medium of claim 2 further comprising code that causes the mobile communication device to receive an acknowledgement message that the second vehicle will maneuver as soon as traffic regulations and traffic conditions permit to grant priority to the self-driving vehicle.
| 4. The non-transitory computer-readable medium of claim 1 further comprising code that causes the mobile communication device to output an alert that an estimated time of arrival at a destination will be later than originally predicted and presenting a user interface element to pay to prioritize the self-driving vehicle in traffic.
| 5. The non-transitory computer-readable medium of claim 1 further comprising code that causes the mobile communication device to receive a third-party request to expedite travel, the third-party request including a third-party payment to prioritize the self-driving vehicle in traffic, wherein the code is configured to automatically generate and transmit a third-party offer message using the third-party payment to the second mobile communication device.
| 6. A non-transitory computer-readable medium storing computer-readable instructions in code which when executed by a processor of a mobile communication device cause the mobile communication device to:
display on a user interface of the mobile communication device a fee-for-transport interface of a fee-for-transport application executing on the mobile communication device to enable a user to summon a self-driving vehicle to transport the user from a starting point to a destination for a fee;
receive user input from the user to define the destination, wherein the starting point is either a current location of the mobile communication device or a user-specified pickup location;
display pricing options based on a plurality of different levels of traffic prioritization for transport from the starting point to the destination;
receive a user-selected traffic prioritization; and
communicate a pickup request to the self-driving vehicle, the pickup request including the user-selected traffic prioritization to enable the self-driving vehicle to automatically offer one or more payments to one or more other vehicles to obtain the user-selected traffic prioritization along the route to the destination.
| 7. The non-transitory computer-readable medium of claim 6 wherein the code causes the mobile communication device to display a trip report upon arrival at the destination that indicates that the self-driving vehicle has determined that a portion of the fee allocated for prioritization payments has been unused, and the portion of the fee that was unused has been refunded to an account associated with a user of the mobile communication device.
| 8. The non-transitory computer-readable medium of claim 6 comprising code that causes the mobile communication device to display on the user interface of the mobile communication device an amount payable to arrive at the destination at a user-specified time, to present a user interface element to pay the amount, and to communicate this amount and the user-specified time to the self-driving vehicle.
| 9. The non-transitory computer-readable medium of claim 6 comprising code that causes the mobile communication device to receive real-time traffic data, to detect a traffic jam based on the real-time traffic data by determining that the self-driving vehicle is moving below a speed limit, and to send a plurality of offer messages to a plurality of vehicles to pay for prioritization.
| 10. The non-transitory computer-readable medium of claim 9 wherein the offer messages are conditional offers that are conditional on acceptance by all of the plurality of vehicles.
| 11. The non-transitory computer-readable medium of claim 6 wherein the code to display the pricing options includes code to display travel times for the pricing options.
| 12. The non-transitory computer-readable medium of claim 6 comprising code to cause the mobile communication device to use an event stored in a calendar application on the mobile communication device to determine the travel time to the event, and then automatically recommend a prioritization level to arrive at the event on time.
| 13. The non-transitory computer-readable medium of claim 6 wherein the pricing options are based on historical prioritization data that include the probabilities of offers being accepted at various price points.
| 14. The non-transitory computer-readable medium of claim 6 comprising code that causes the mobile communication device to receive a third-party request to expedite travel, the third-party request including a third-party payment to prioritize the self-driving vehicle in traffic.
| 15. A non-transitory computer-readable medium storing computer-readable instructions in code which when executed by a processor of a mobile communication device cause the mobile communication device to:
generate an emergency request for an emergency, the emergency request requesting that a self-driving vehicle be prioritized in traffic due to the emergency;
transmit the emergency request to a governmental authority emergency server to request emergency prioritization; and
receive an emergency prioritization authorization from the governmental authority emergency server, the emergency prioritization authorization comprising a first cryptographic token to be broadcast by the self-driving vehicle to other vehicles to obtain priority in traffic and a second cryptographic token that is recognizable by law enforcement entities permitting the self-driving vehicle to exceed a speed limit due to the emergency.
| 16. The non-transitory computer-readable medium of claim 15 wherein the emergency request is generated in response to detecting a 911 call being made by the mobile communication device.
| 17. The non-transitory computer-readable medium of claim 15 wherein the emergency request is generated in response to a biometric sensor detecting the emergency, the biometric sensor being in the mobile communication device or in communication with the mobile communication device.
| 18. The non-transitory computer-readable medium of claim 15 comprising code that causes the mobile communication device to:
determine an emergency destination to replace a destination originally specified by the user; and
re-route the self-driving vehicle to the emergency destination.
| 19. The non-transitory computer-readable medium of claim 18 comprising code that causes the mobile communication device to:
constrain the cryptographic token to be valid only for a new route to the emergency destination.
| 20. The non-transitory computer-readable medium of claim 16 comprising code that causes the mobile communication device to:
determine an emergency destination to replace a destination originally specified by the user; and
re-route the self-driving vehicle to the emergency destination. | The medium has set of instructions for communicatively connecting to a first mobile communication interface (1000) of a self-driving vehicle (10) in which the first mobile communication device is located. User input defining offer to pay for traffic prioritization that prioritizes the self-driving vehicle relative to a primary vehicle is received. Offer message is automatically generated in response to the user input. The offer message is automatically transmitted to the first mobile communication interface of the self-driving vehicle for communicating the offer message to a second mobile communication device (1100) in the primary vehicle. Reply message is automatically received from the second mobile communication device. Judgment is made to check whether the reply message constitutes acceptance or rejection of the offer. Payment is transmitted to the second mobile communication device or to an account associated with the second mobile communication device to pay for the traffic prioritization in response to determining that the reply message indicates the acceptance of the offer. Non-transitory computer readable storage medium for realizing traffic prioritization in a self-driving vehicle i.e. car (from drawings) by a mobile payment system through a mobile communication device e.g. smartphone, cell phone, tablet, smartwatch, wearable smart device and laptop. The medium enables mutually sensing self-driving vehicles in a preset area of a road by utilizing various sensors for collision avoidance and communication through vehicle-to-vehicle messaging protocols. The drawing shows a schematic diagram of a mobile payment system.10Self-driving vehicle 11Vehicle-to-vehicle messages 1000First mobile communication interface 1100Second mobile communication interface 1101First user 1101aSecond user 1105Processor 1105aCPU 1110Mobile device memory 1115Mobile device display screen 1120Mobile device global navigation satellite system chip 1130Cellular transceiver 1140Mobile device data interface 1150User interface element 1200First vehicle-to-vehicle data transceiver 1200aSecond vehicle-to-vehicle data transceiver |
Please summarize the input | Vehicle-to-vehicle payment system for traffic prioritization in self-driving vehicles. A self-driving or autonomous vehicle has a traffic-prioritization processor to send or receive a payment to or from a central server to obtain a traffic prioritization for a route or to accept a traffic de-prioritization for the route. The central server receives and distributes payments to other vehicles traveling the route. The vehicle communicates with the central server to receive a plurality of levels of prioritization which range from a highest prioritization to a lowest prioritization, and the costs or payouts associated with each of the levels. The invention claimed is:
| 1. A self-driving vehicle comprising:
a vehicle chassis;
a motor supported by the chassis for providing propulsive power for the vehicle;
a braking system;
a steering system;
a plurality of sensors;
a self-driving processor configured to receive signals from the sensors and to generate steering, acceleration and braking control signals for controlling the steering system, the motor and the braking system of the vehicle;
a Global Navigation Satellite System (GNSS) receiver for receiving satellite signals and for determining a current location of the self-driving vehicle;
a radiofrequency data transceiver; and
a traffic-prioritization processor configured to cooperate with the radiofrequency data transceiver to:
receive from a central server a price to obtain a traffic prioritization for a route or to accept a traffic de-prioritization for the route, wherein the central server determines the price based on offers and requests to be prioritized or deprioritized from other vehicles traveling the route and wherein the central server receives payments from prioritized vehicles traveling the route and distributes payments to de-prioritized vehicles traveling the route; and
send or receive a payment to or from the central server to obtain the traffic prioritization for the route or to accept the traffic de-prioritization for the route.
| 2. The self-driving vehicle of claim 1 wherein the traffic-prioritization processor is configured to cooperate with the radiofrequency data transceiver to receive, from the central server a plurality of levels of prioritization which range from a highest prioritization to a lowest prioritization, and the costs or payouts associated with each of the levels.
| 3. The self-driving vehicle of claim 2 wherein the traffic-prioritization processor is configured to cooperate with the radiofrequency data transceiver to receive, from the central server, travel times for the levels of prioritization.
| 4. The self-driving vehicle of claim 3 comprising a user interface to display the costs or payouts for the levels of prioritization and the travel times for each of the levels of prioritization to enable a user to select the level of prioritization for the route.
| 5. The self-driving vehicle of claim 1 wherein the user interface provides an alert indicating that an estimated time of arrival at a destination will be later than originally predicted and providing a user interface element to enable a user to pay to expedite travel to the destination.
| 6. The self-driving vehicle of claim 4 wherein the user interface displays the cost to pay to obtain the traffic prioritization to the destination.
| 7. A self-driving vehicle comprising:
a vehicle chassis;
a motor supported by the chassis for providing propulsive power for the vehicle;
a braking system;
a steering system;
a plurality of sensors;
a self-driving processor configured to receive signals from the sensors and to generate steering, acceleration and braking control signals for controlling the steering system, the motor and the braking system of the vehicle;
a Global Navigation Satellite System (GNSS) receiver for receiving satellite signals and for determining a current location of the self-driving vehicle;
a radiofrequency data transceiver; and
a traffic-prioritization processor that cooperates with the radiofrequency data transceiver to:
receive, from a central server, pricing for different levels of traffic prioritization for a route, the pricing including a cost to obtain a higher traffic prioritization for the route and a payout to accept a lower traffic prioritization for the route.
| 8. The self-driving vehicle of claim 7 wherein the traffic-prioritization processor is configured to cooperate with the radiofrequency data transceiver to send to the central server a payment equal to the cost of obtaining the higher traffic prioritization for the route.
| 9. The self-driving vehicle of claim 7 wherein the traffic-prioritization processor is configured to cooperate with the radiofrequency data transceiver to receive from the central server a payment equal to the payout for accepting the lower traffic prioritization for the route.
| 10. The self-driving vehicle of claim 7 further comprising a user interface presenting costs and payouts for three or more different levels of traffic prioritization.
| 11. The self-driving vehicle of claim 10 wherein the user interface also presents costs and payouts based on times of day.
| 12. The self-driving vehicle of claim 10 wherein the user interface also presents costs and payouts based on segments of the route.
| 13. The self-driving vehicle of claim 10 wherein the user interface also presents travel times for the different levels of traffic prioritization.
| 14. An autonomous vehicle comprising:
a self-driving processor configured to receive signals from sensors to generate steering, acceleration and braking control signals for controlling a steering system, a motor and a braking system of the vehicle;
a Global Navigation Satellite System (GNSS) receiver for receiving satellite signals and for determining a current location of the vehicle;
a radiofrequency data transceiver; and
a traffic-prioritization processor cooperating with the radiofrequency data transceiver to:
communicate with a central server to receive pricing for a traffic prioritization or de-prioritization for a route; and
send or receive a payment to or from the central server for the traffic prioritization or de-prioritization for the route.
| 15. The autonomous vehicle of claim 14 wherein the pricing includes costs and payouts for different segments of the route.
| 16. The autonomous vehicle of claim 15 wherein the costs and payouts for the different segments depend on a time of day.
| 17. The autonomous vehicle of claim 14 comprising a user interface to present the costs and payouts to enable selection of a level of prioritization.
| 18. The autonomous vehicle of claim 17 wherein the user interface indicates whether the costs and payouts are above normal market prices for that particular time and place.
| 19. The autonomous vehicle of claim 14 wherein the pricing includes a bid and an ask for each segment of the route and for each level of prioritization, the bid defining a price being offered for the prioritization and the ask defining a price that is being asked to accept the prioritization.
| 20. The autonomous vehicle of claim 14 wherein the self-driving processor and the traffic-prioritization processor are integrated in a vehicle computing device. | The vehicle (10) has a vehicle chassis (12) for supporting a motor for providing propulsive power for the vehicle. A self-driving processor (100) receives signals from sensors and for generating steering, acceleration and braking control signals. A Global Navigation Satellite System (GNSS) receiver (260) receives satellite signals and determines a current location of the self-driving vehicle. A traffic-prioritization processor (200) cooperates with a radio frequency data transceiver (220) for sending or receiving a payment to or from a central server to obtain traffic prioritization for a route or to accept traffic de-prioritization for the route. Autonomous or self-driving vehicles such as car, van, minivan, sports utility vehicle (SUV), crossover-type vehicle, bus, minibus, truck, tractor-trailer, semi-trailer, construction vehicle, work vehicle, tracked vehicle, semi-tracked vehicle, offroad vehicle, electric cart and a dune buggy for utilizing sensors such as RADAR, LIDAR and/or cameras to provide signals to a processor or controller that generates and outputs steering, acceleration and braking signals to the vehicle. The vehicle allows self-driving vehicles in a given area of a road to mutually sense presence of each other using various sensors for collision avoidance through vehicle-to-vehicle messaging protocols. The vehicle can automatically perform an adjustment to own routing on the benefit of the prioritization, e.g., to pass the second vehicle, upon transfer of the payment. The drawing shows a side view of an autonomous or self-driving vehicle.10Vehicle 12Vehicle chassis 100Self-driving processor 200Traffic-prioritization processor 220Radio frequency data transceiver 260GNSS receiver |
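Claims 1-4 and 14-17 of this record describe a central server that quotes costs, payouts and travel times for several prioritization levels and settles payments in both directions. The sketch below shows one way a traffic-prioritization processor might consume such a quote; the field names and the `server` methods are assumptions, not an interface defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class PriorityLevel:
    name: str              # e.g. "highest", "normal", "lowest"
    price_cents: int       # positive = cost to be prioritized, negative = payout offered
    travel_time_s: int     # estimated travel time at this level (claim 3)

def fetch_levels(server, route_id: str) -> list:
    """Ask the central server for per-level pricing and travel times for a route."""
    raw = server.get_pricing(route_id)              # assumed server call
    return [PriorityLevel(**entry) for entry in raw]

def settle(server, route_id: str, chosen: PriorityLevel, account_id: str) -> None:
    """Pay to be prioritized, or collect the payout for accepting de-prioritization."""
    if chosen.price_cents >= 0:
        server.send_payment(route_id, account_id, chosen.price_cents)
    else:
        server.request_payout(route_id, account_id, -chosen.price_cents)
```

A user interface layer would then present the returned levels with their costs, payouts and travel times, as claims 4 and 17 recite, before calling `settle` with the selection.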
Please summarize the input | Vehicle-to-vehicle payment system for traffic prioritization in self-driving vehicles. A self-driving or autonomous vehicle comprises a processor to transmit an offer message to another vehicle and to receive a reply message from the other vehicle, and to transfer a payment to the other vehicle to obtain a traffic prioritization relative to the other vehicle. For example, the traffic prioritization may enable one vehicle to pass the other vehicle, to take precedence at an intersection or to be given priority to take a parking place or any other traffic-related advantage. The invention claimed is:
| 1. A self-driving vehicle comprising:
a vehicle chassis;
a motor supported by the chassis for providing propulsive power for the vehicle;
a braking system;
a steering system;
a plurality of sensors;
a processor configured to receive signals from the sensors and to generate steering, acceleration and braking control signals for controlling the steering system, the motor and the braking system of the vehicle;
a Global Navigation Satellite System (GNSS) receiver for receiving satellite signals and for determining a current location of the self-driving vehicle;
a radiofrequency data transceiver; and
wherein the processor is configured to:
transmit an offer message to a second vehicle;
receive a reply message from the second vehicle; and
transfer a payment to the second vehicle to obtain a traffic prioritization relative to the second vehicle.
| 2. The self-driving vehicle of claim 1 wherein the processor is configured to receive a counteroffer from the second vehicle and is configured to accept or reject the counteroffer.
| 3. The self-driving vehicle of claim 1 wherein the processor cooperates with the radiofrequency data transceiver to communicate with a first payment server to transfer payment to a second payment server associated with the second vehicle.
| 4. The self-driving vehicle of claim 3 wherein the second payment server requests that the first payment server verify that funds are available, wherein the first payment server confirms to the second payment server that the funds are available, and wherein the second payment server confirms to the second vehicle that the funds are available.
| 5. The self-driving vehicle of claim 4 wherein the processor requests that the first payment server transfer the funds in response to receiving an acknowledgement from the second vehicle that the availability of the funds has been verified.
| 6. The self-driving vehicle of claim 5 wherein the processor receives a confirmation from the second vehicle that the second vehicle has initiated a manoeuver to reprioritize the self-driving vehicle in traffic relative to the second vehicle.
| 7. The self-driving vehicle of claim 1 wherein the processor cooperates with the radiofrequency data transceiver to communicate two parallel offer messages to the second vehicle and to a third vehicle.
| 8. The self-driving vehicle of claim 7 wherein each of the offer messages contains bits in a data field indicating that the offer is conditional on which of the second and third vehicles is first to reply.
| 9. The self-driving vehicle of claim 1 wherein the processor cooperates with the radiofrequency data transceiver to send two conditional offer messages to the second vehicle and to a third vehicle ahead of the second vehicle.
| 10. The self-driving vehicle of claim 9 wherein the conditional offer messages each contains bits in a data field indicating that the offer is conditional on both the second and third vehicles accepting.
| 11. The self-driving vehicle of claim 1 further comprising a user interface presenting pricing and timing data for two routes to enable a user of the self-driving vehicle to select one of the two routes based on both pricing and timing.
| 12. The self-driving vehicle of claim 1 further comprising a user interface presenting costs and payouts for different traffic prioritizations.
| 13. The self-driving vehicle of claim 1 further comprising a user interface presenting bid-ask pricing for different levels of traffic prioritization for different road segments, wherein bid prices are prices being offered by the self-driving vehicle to the second vehicle for the traffic prioritization and ask prices are prices the second vehicle is asking from the self-driving vehicle to grant the traffic prioritization.
| 14. The self-driving vehicle of claim 1 wherein the processor is configured to receive user-configurable multipliers for setting prices for various types of traffic manoeuvers.
| 15. The self-driving vehicle of claim 1 wherein the traffic prioritization is precedence for a parking space.
| 16. The self-driving vehicle of claim 1 wherein the traffic prioritization is precedence at an intersection.
| 17. The self-driving vehicle of claim 1 wherein the vehicle is a truck and wherein the traffic prioritization is precedence at a loading dock of a warehouse or store.
| 18. The self-driving vehicle of claim 1 wherein the processor is configured to grant precedence to an emergency vehicle upon wirelessly receiving a special code.
| 19. The self-driving vehicle of claim 1 wherein the processor automatically generates the offer message based on predetermined user settings representing priority levels set by a user wherein the priority levels are set based on time and location.
| 20. The self-driving vehicle of claim 1 wherein the payment comprises a transfer to the second vehicle of redeemable points that are stored in a database and are redeemable for a subsequent traffic prioritization in favor of the second vehicle. | The self-driving vehicle has a vehicle chassis, a motor supported by the chassis for providing propulsive power for the vehicle, a braking system, a steering system and several sensors. A processor is configured to receive signals from the sensors and to generate steering, acceleration and braking control signals for controlling the steering system, the motor and the braking system of the vehicle. A global navigation satellite system (GNSS) receiver configured for receiving satellite signals and for determining a current location of the self-driving vehicle and a radiofrequency data transceiver. The processor is configured to transmit an offer message to a second vehicle. A reply message is received from the second vehicle. A payment is transferred to the second vehicle to obtain a traffic prioritization relative to the second vehicle. The self-driving vehicles use sensors such as radio detection and ranging, light detection and ranging or cameras to provide signals to the processor or controller that generates and outputs steering, acceleration and braking signals to the vehicle. Uses included but are not limited to encompass any vehicle such as a car, van, minivan, sports utility vehicle, crossover-type vehicle, bus, minibus, truck, tractor-trailer, semi-trailer, construction vehicle, work vehicle, tracked vehicle, semi-tracked vehicle, offroad vehicle, electric cart, dune buggy. The receiving vehicle have a rule defining a monetary threshold to automatically accept an offer from a requesting vehicle. The emergency vehicle makes the request without offering any payment because the vehicle is an emergency vehicle in a first paradigm. The drawing shows a schematic view of the system for V2V payments for traffic reprioritization.10Autonomous vehicle 222Base station transceiver 250Internet 300First payment server 302Payment processing server |
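Claims 7-10 of this record cover parallel offers to two vehicles that are conditional either on whichever vehicle replies first or on both vehicles accepting. The sketch below shows one plausible way to encode and evaluate that condition flag; the message fields and the transceiver API are illustrative assumptions rather than the patent's own format.

```python
FIRST_TO_REPLY = "first_to_reply"     # claim 8: offer binds on whichever vehicle answers first
ALL_MUST_ACCEPT = "all_must_accept"   # claim 10: offer binds only if every vehicle accepts

def send_conditional_offers(transceiver, vehicle_ids, amount_cents: int, condition: str) -> list:
    """Send the same conditional offer to each target vehicle (claims 7 and 9)."""
    offers = []
    for vid in vehicle_ids:
        msg = {
            "type": "offer",
            "to": vid,
            "amount_cents": amount_cents,
            "condition": condition,      # the 'bits in a data field' of claims 8/10
        }
        transceiver.send(vid, msg)       # assumed V2V transceiver API
        offers.append(msg)
    return offers

def offers_binding(replies: list, n_targets: int, condition: str) -> bool:
    """Decide whether the conditional offers have become binding."""
    acceptances = [r for r in replies if r.get("type") == "accept"]
    if condition == ALL_MUST_ACCEPT:
        return len(acceptances) == n_targets
    return len(acceptances) >= 1         # first acceptance wins under FIRST_TO_REPLY
```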
Please summarize the input | Mobile payment system for traffic prioritization in self-driving vehicles. A self-driving or autonomous vehicle transmits a vehicle-to-vehicle offer message from a user of a vehicle-connected mobile communication device riding in the self-driving vehicle to a second user of a second mobile communication device riding in a second vehicle to pay for a traffic prioritization relative to the second vehicle. The first mobile communication device receives a reply message and sends a payment to the second mobile communication device or an account associated with the second mobile communication device to obtain the traffic prioritization relative to the other vehicle. For example, the traffic prioritization may enable one vehicle to pass the other vehicle, to take precedence at an intersection or to be given priority to take a parking place or any other traffic-related advantage. The invention claimed is:
| 1. A self-driving vehicle comprising:
a vehicle chassis;
a motor supported by the chassis for providing propulsive power for the vehicle;
a braking system;
a steering system;
a plurality of sensors;
a processor configured to receive signals from the sensors and to generate steering, acceleration and braking control signals for controlling the steering system, the motor and the braking system of the vehicle;
a Global Navigation Satellite System (GNSS) receiver for receiving satellite signals and for determining a current location of the self-driving vehicle;
a mobile communication interface communicatively connected to a first mobile communication device of a first user riding in the self-driving vehicle, the mobile communication interface receiving from the first mobile communication device an offer message from the first user to pay for a traffic prioritization relative to a second self-driving vehicle;
a vehicle-to-vehicle data transceiver communicatively connected to the mobile communication interface to transmit the offer message to the second self-driving vehicle to be relayed via a second mobile communication interface to a second mobile communication device of a second user riding in the second self-driving vehicle; and
wherein the mobile communication interface, via the vehicle-to-vehicle data transceiver, receives a reply message from the second mobile communication device and transmits the reply message to the first mobile communication device to cause the first mobile communication device to make a payment from a first account of the first user to a second account of the second user to obtain the traffic prioritization relative to the second self-driving vehicle; and
wherein the mobile communication interface receives a payment message from the first mobile communication device and transmits, via the vehicle-to-vehicle data transceiver, the payment message to the second mobile communication device to confirm that the payment has been made.
| 2. The self-driving vehicle of claim 1 wherein the mobile communication interface receives a counteroffer from the second mobile communication device and relays the counteroffer to the first mobile communication device to accept or reject the counteroffer, wherein the first mobile communication device is configured to either present the counteroffer and receive user input to accept or reject the counteroffer or automatically accept or reject the counteroffer based on a user setting.
| 3. The self-driving vehicle of claim 1 comprising a fee-for-transport processor that computes a fee to transport the first user from a starting point along a route to a destination, wherein the fee is determined based on distance or travel time and is further based on a user-specified traffic prioritization received from the first mobile communication device.
| 4. The self-driving vehicle of claim 3 wherein the fee-for-transport processor communicates to the first mobile communication device a plurality of pricing options for the route based on different levels of traffic prioritization.
| 5. The self-driving vehicle of claim 4 wherein the fee-for-transport processor computes the travel times for the route for each of the different levels of traffic prioritization, wherein the travel times are computed using real-time traffic data for the route and historical prioritization data for the route for the time of day, the historical prioritization data indicative of probabilities of traffic prioritization requests being accepted for the route at the time of day.
| 6. The self-driving vehicle of claim 5 wherein the fee-for-transport processor receives a user selection of one of the different levels of traffic prioritization from the first mobile communication device, the fee-for-transport processor then automatically offering payments to other vehicles along the route to obtain traffic prioritizations and, when offers are accepted, automatically disbursing payments to the other vehicles.
| 7. The self-driving vehicle of claim 1 wherein the mobile communication interface is a Bluetooth? interface and the vehicle-to-vehicle data transceiver is a dedicated vehicle-to-vehicle short-range communications (DSRC) transceiver operating in a 5.7-5.9 GHz band.
| 8. A non-transitory computer-readable medium storing computer-readable instructions in code which when executed by a processor of a mobile communication device cause the mobile communication device to:
communicatively connect to a mobile communication interface of a self-driving vehicle in which the mobile communication device is located;
receive user input defining an offer to pay for a traffic prioritization that prioritizes the self-driving vehicle relative to a second vehicle;
automatically generate an offer message in response to the user input, the offer message being a datagram in a predetermined data format;
automatically transmit the offer message to the mobile communication interface of the self-driving vehicle for communicating the offer message to a second mobile communication device in the second vehicle, the second mobile communication device being configured to automatically read the datagram;
automatically receive a reply message from the second mobile communication device;
automatically determine if the reply message constitutes an acceptance or rejection of the offer; and
in response to determining that the reply message indicates the acceptance of the offer, send a payment to the second mobile communication device or to an account associated with the second mobile communication device to pay for the traffic prioritization.
| 9. The non-transitory computer-readable medium of claim 8 further comprising code that causes the mobile communication device to receive a confirmation message to confirm receipt of the payment.
| 10. The non-transitory computer-readable medium of claim 9 further comprising code that causes the mobile communication device to receive an acknowledgement message that the second vehicle will maneuver as soon as traffic regulations and traffic conditions permit to grant priority to the self-driving vehicle.
| 11. The non-transitory computer-readable medium of claim 8 further comprising code that causes the mobile communication device to output an alert that an estimated time of arrival at a destination will be later than originally predicted and presenting a user interface element to pay to prioritize the self-driving vehicle in traffic.
| 12. The non-transitory computer-readable medium of claim 8 further comprising code that causes the mobile communication device to receive a third-party request to expedite travel, the third-party request including a third-party payment to prioritize the self-driving vehicle in traffic, wherein the code is configured to automatically generate and transmit a third-party offer message using the third-party payment to the second mobile communication device.
| 13. A non-transitory computer-readable medium storing computer-readable instructions in code which when executed by a processor of a mobile communication device cause the mobile communication device to:
display on a user interface of the mobile communication device a fee-for-transport interface of a fee-for-transport application executing on the mobile communication device to enable a user to summon a self-driving vehicle also executing the fee-for-transport application to transport the user from a starting point to a destination for a fee;
receive user input from the user to define the destination, wherein the starting point is either a current location of the mobile communication device or a user-specified pickup location;
display pricing options based on a plurality of different levels of traffic prioritization for transport to the destination, wherein the pricing options are also based on either distance or travel time from the starting point to the destination;
receive a user-selected traffic prioritization; and
communicate a pickup request to the self-driving vehicle, the pickup request including the user-selected traffic prioritization to enable the self-driving vehicle to automatically offer one or more payments to one or more other vehicles to obtain the user-selected traffic prioritization along the route to the destination.
| 14. The non-transitory computer-readable medium of claim 13 wherein the code causes the mobile communication device to display a trip report upon arrival at the destination that indicates that the self-driving vehicle has determined that a portion of the fee allocated for prioritization payments has been unused, and the portion of the fee that was unused has been refunded to an account associated with a user of the first mobile communication device.
| 15. The non-transitory computer-readable medium of claim 13 comprising code that causes the mobile communication device to display on the user interface of the mobile communication device an amount payable to arrive at the destination at a user-specified time, to present a user interface element to pay the amount, and to communicate this amount and the user-specified time to the self-driving vehicle.
| 16. The non-transitory computer-readable medium of claim 12 comprising code that causes the mobile communication device to receive real-time traffic data, to detect a traffic jam based on the real-time traffic data by determining that the self-driving vehicle is moving at an average speed less than 20% of a speed limit, and to send a plurality of offer messages to a plurality of vehicles to pay for prioritization.
| 17. The non-transitory computer-readable medium of claim 16 wherein the offer messages are conditional offers that are conditional on acceptance by all of the plurality of vehicles.
| 18. The non-transitory computer-readable medium of claim 13 comprising code that causes the mobile communication device to:
generate an emergency request in response to detecting a 911 call being made by the mobile communication device to signify an emergency, the emergency request requesting that the self-driving vehicle be prioritized in traffic due to the emergency;
transmit the emergency request to a governmental authority emergency server to request emergency prioritization; and
receive an emergency prioritization authorization from the governmental authority emergency server, the emergency prioritization authorization comprising a first cryptographic token to be broadcast by the self-driving vehicle to other vehicles to obtain priority in traffic and a second cryptographic token that is recognizable by law enforcement entities permitting the self-driving vehicle to exceed a speed limit due to the emergency.
| 19. The non-transitory computer-readable medium of claim 18 comprising code that causes the mobile communication device to:
determine an emergency destination to replace the destination originally specified by the user;
re-route the self-driving vehicle to the emergency destination; and
constrain the cryptographic token to be valid only for a new route to the emergency destination.
| 20. The non-transitory computer-readable medium of claim 18 comprising code that causes the mobile communication device to:
detect an emergency using a sensor in, or communicatively connected to, the mobile communication device;
generate an emergency request requesting that the self-driving vehicle be prioritized in traffic in response to detecting the emergency;
transmit the emergency request to a governmental authority emergency server to request emergency prioritization; and
receive an emergency prioritization authorization from the governmental authority emergency server, the emergency prioritization authorization comprising one or both of: a first cryptographic token to be broadcast by the self-driving vehicle to other vehicles to obtain priority in traffic and a second cryptographic token recognizable by law enforcement entities permitting the self-driving vehicle to exceed a speed limit due to the emergency. | The vehicle (10) has a mobile communication interface for receiving a reply message from a second mobile communication device through a vehicle-to-vehicle data transceiver, and transmitting the reply message to a first mobile communication device to cause the first mobile communication device to make a payment from a first account of a first user to a second account of a second user to obtain traffic prioritization relative to a second self-driving vehicle. The mobile communication interface receives a payment message from the first mobile communication device and transmits the payment message to the second mobile communication device to confirm that the payment is made through the vehicle-to-vehicle data transceiver. An INDEPENDENT CLAIM is included for a non-transitory computer-readable medium storing computer-readable instructions for operating a self-driving vehicle. Self-driving vehicle, i.e., a car. The vehicles in a given area of a road mutually sense each other's presence using various sensors for collision avoidance and can communicate through vehicle-to-vehicle messaging protocols with each other to avoid collisions. The drawing shows a schematic diagram of a system for V2V payments for traffic reprioritization. 10, 10a: Self-driving vehicles; 11: Exchange of V2V messages; 222: Base station transceiver; 250: Internet; 300, 302: Payment processing servers
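As a concrete illustration of the jam-detection and offer logic recited in claims 16 and 17 above, the following short Python sketch shows one way such a rule could be expressed. It is only an assumption-laden toy: the 20% speed threshold comes from claim 16, but the message fields, function names, and the even budget split are invented here for illustration and are not the patented implementation.

    # Toy sketch (not the patented implementation): claim 16's jam test and
    # claim 17's all-or-nothing conditional offers, with invented field names.
    from dataclasses import dataclass

    @dataclass
    class OfferMessage:
        target_vehicle_id: str
        payment_amount: float
        conditional_on_all: bool  # claim 17: offer only valid if every vehicle accepts

    def is_traffic_jam(avg_speed_kmh: float, speed_limit_kmh: float, threshold: float = 0.2) -> bool:
        # Claim 16: a jam is declared when average speed falls below 20% of the limit.
        return avg_speed_kmh < threshold * speed_limit_kmh

    def build_offers(vehicle_ids, budget: float, conditional: bool = True):
        # Split the prioritization budget evenly across the vehicles to be paid.
        share = budget / len(vehicle_ids)
        return [OfferMessage(v, share, conditional) for v in vehicle_ids]

    if is_traffic_jam(avg_speed_kmh=12.0, speed_limit_kmh=80.0):
        for offer in build_offers(["veh-A", "veh-B", "veh-C"], budget=6.0):
            print(offer)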
Please summarize the input | SYSTEMS AND METHODS FOR AN AUTONOMOUS CONVOY WITH LEADER VEHICLE. A module for a leader vehicle of a convoy can have a suite of sensors, a communication system, and a controller. The sensor suite can have at least one feature sensor that detects features and/or terrain in an environment and at least one location sensor that determines a location of the leader vehicle. Via the sensor suite, the controller can detect features as the leader vehicle travels along a route through the environment as well as the route of the leader vehicle. The controller can build a map for at least part of the environment with the detected route therethrough. Data indicative of the map and the detected route can then be transmitted to one or more follower vehicles. In some embodiments, the leader vehicle is manually driven while the follower vehicles operate autonomously. | The system has a convoy leader module (200) that is used for a leader vehicle of a convoy, and comprises a first suite of sensors (202). The first suite comprises at least one feature sensor operable to detect features or terrain in an environment to be traversed by the leader vehicle and at least one location sensor operable to determine a location of the leader vehicle. A first communication system (204) is operable to transmit one or more signals between the leader vehicle and one or more follower vehicles in the convoy. The route of the leader vehicle is detected through the environment via the at least one location sensor. A map for at least a portion of the environment with the detected route is built based at least in part on the detected one or more features and the detected route. The first data indicative of the map and the detected route are transmitted to the one or more follower vehicles in the convoy via the first communication system. An INDEPENDENT CLAIM is included for a convoy. Convoy system for autonomous vehicles with a leader vehicle, e.g. a manned leader vehicle. The method allows the leader vehicle and the autonomous follower vehicles in the convoy to share a common map, thus improving the efficiency of the convoy. The method enables the convoy leader module to be mounted on and/or integrated with a leader vehicle, so that the leader module can use the detected features and route to construct a map, which can be shared with the follower vehicles. The follower vehicles can have their own sensors that detect the features within the environment and can use detected features to improve the route following. The shared map can include information regarding an environmental aspect such as a slip condition, roadway features, or an area susceptible to dust generation, and the follower vehicle can implement remedial measures at or in advance of a location of that environmental aspect. The drawing shows a simplified schematic diagram of the manned vehicle with convoy leader module. 200: Convoy leader module; 202: Sensor suite; 204: Communication system; 206: Control system; 208: Data storage system
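To make the leader-module data flow above more concrete, here is a minimal Python sketch of a leader building a shared map from its location and feature sensors and serializing it for transmission to followers. The class, field names, and JSON payload are illustrative assumptions only; the patent does not specify this structure.

    import json

    class ConvoyLeaderMap:
        """Toy map assembled from the leader's location and feature sensors."""
        def __init__(self):
            self.route = []      # ordered (x, y) waypoints from the location sensor
            self.features = []   # detected environmental aspects, e.g. slip or dust areas

        def add_waypoint(self, x, y):
            self.route.append([x, y])

        def add_feature(self, feature_type, x, y):
            self.features.append({"type": feature_type, "x": x, "y": y})

        def to_message(self):
            # Data indicative of the map and detected route, ready to send to followers.
            return json.dumps({"route": self.route, "features": self.features})

    leader_map = ConvoyLeaderMap()
    leader_map.add_waypoint(0.0, 0.0)
    leader_map.add_waypoint(10.0, 2.5)
    leader_map.add_feature("dust_area", 8.0, 2.0)
    print(leader_map.to_message())  # payload a follower vehicle could use to follow the route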
Please summarize the input | Vehicle-to-vehicle sensor verification using sensor fusion network. Vehicle-to-vehicle sensor verification using a sensor fusion network is provided. The invention claims a system and method for sensor verification using a sensor fusion network. The sensor fusion network may include a plurality of sensors associated with one or more vehicles having autonomous or partially autonomous driving functions. | 1. A vehicle sensor verification system, comprising: a coordination processor; the coordination processor is operable to perform data communication with a plurality of vehicles; a sensor fusion network; the sensor fusion network comprises a plurality of sensors; each sensor is in data communication with the coordination processor; the sensor fusion network comprises at least a first sensor operable to generate first data and a second sensor operable to generate second data; wherein the first sensor is associated with a first vehicle of the plurality of vehicles; the first data indicates a first detection state of an object; and the second data indicates a second detection state of the object.
| 2. The system according to claim 1, wherein the second sensor is associated with a second vehicle of the plurality of vehicles.
| 3. The system according to claim 1, wherein the plurality of sensors further comprises a third sensor operable to generate third data indicative of a detected state of the object, and wherein the coordination processor is operable to generate coordination data based on the first data, the second data, and the third data, the coordination data comprising a weighted detection state of the object.
| 4. The system according to claim 1, wherein the coordination data is generated from the first data, the second data, and the third data using a majority voting algorithm.
| 5. The system according to claim 3, wherein the second sensor is associated with the first vehicle and the third sensor is associated with the second vehicle.
| 6. The system according to claim 1, wherein the coordinating processor comprises a neural network operable to identify whether a potential trajectory of the first vehicle is free of obstacles.
| 7. The system according to claim 1, wherein the coordination processor is operable to dynamically define the sensor fusion network as the subset of the plurality of sensors according to the proximity of each of the plurality of sensors to the first vehicle.
| 8. The system according to claim 1, wherein the coordination processor is operable to detect a potential fault condition in the first sensor.
| 9. The system according to claim 1, wherein each sensor of the plurality of sensors in the sensor fusion network has a specified accuracy; and the coordination processor is operable to dynamically define the sensor fusion network as a subset of sensors based on the specified accuracy of each sensor.
| 10. The system according to claim 9, wherein the coordination processor is operable to select a sensor included in the sensor fusion network based on a minimum specified accuracy.
| 11. The system according to claim 1, wherein the first sensor comprises a sensor type selected from a group of sensor types; the group of sensor types comprises a radar sensor, a laser radar sensor, a proximity sensor, a camera sensor, an infrared sensor and an ultraviolet sensor; an ultrasonic sensor or a sound wave sensor.
| 12. The system according to claim 11, wherein the second sensor comprises a sensor type different from the first sensor.
| 13. The system according to claim 1, wherein the first data further comprises a first confidence value, and the second data further comprises a second confidence value.
| 14. A method for object verification using a sensor fusion network, wherein the sensor fusion network comprises a plurality of sensors and at least a first sensor is associated with a first vehicle, the method comprising: generating first object data based on a gap measurement of the first sensor, the first object data comprising a first object state and a first confidence value of the first object state; generating second object data based on a gap measurement of a second sensor of the plurality of sensors, the second object data comprising a second object state and a second confidence value of the second object state; and generating coordination verification data, the coordination verification data indicating a coordinated object state generated using the first object data and the second object data.
| 15. The method according to claim 14, wherein the second sensor is associated with the second vehicle.
| 16. The method according to claim 15, further comprising generating a third obstacle data based on a gap measurement of a third sensor to a potential trajectory, the third obstacle data including a third confidence value of a third object state and a third object state; and using the first object data, the second object data and the third object data to generate the coordination verification data, wherein the third sensor is associated with the third vehicle.
| 17. The method according to claim 14, wherein the coordination verification data is generated in response to a majority vote algorithm, the majority vote algorithm utilizing at least a first obstacle data, a second obstacle data and a third obstacle data as input.
| 18. The method according to claim 17, wherein the majority vote algorithm uses weighting factors to generate track feasibility data, the weighting factors being based on a first confidence value, a second confidence value, and a third confidence value.
| 19. The method according to claim 14, wherein the coordinated verification data further indicates a coordinated confidence value associated with the coordinated object state. | The system has a coordination processor for performing data communication with multiple vehicles. A sensor fusion network is provided with multiple sensors. Each sensor is in data communication with the coordination processor. The sensor fusion network is provided with two sensors operable to generate two sets of data. A neural network is operable to identify whether a potential trajectory of the vehicles is free of obstacles. An INDEPENDENT CLAIM is included for a method for performing object verification by using a sensor fusion network. Vehicle-to-vehicle sensor verification system. The system realizes autonomous or partially autonomous driving functions of the vehicles. The drawing shows a top view of a vehicle-to-vehicle sensor verification system.
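For readers who want to see the confidence-weighted voting idea from claims 3, 4, 17 and 18 in executable form, here is a minimal Python sketch. It assumes a simple normalized-weight scheme; the actual weighting, the neural-network trajectory check, and any fault handling in the claimed system are not reproduced here.

    from collections import defaultdict

    def weighted_majority_vote(observations):
        """observations: list of (detection_state, confidence) pairs from different sensors,
        possibly mounted on different vehicles. Returns (coordinated_state, coordinated_confidence)."""
        scores = defaultdict(float)
        total = 0.0
        for state, confidence in observations:
            scores[state] += confidence
            total += confidence
        best_state = max(scores, key=scores.get)
        return best_state, scores[best_state] / total  # normalized weight stands in for confidence

    # Three sensors report a detection state for the same object with confidence values.
    obs = [("obstacle", 0.9), ("obstacle", 0.6), ("clear", 0.4)]
    state, conf = weighted_majority_vote(obs)
    print(state, round(conf, 2))  # -> obstacle 0.79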
Please summarize the input | Method, device, and computer program for controlling stop of autonomous vehicle using speed profile. Provided are a method, a device, and a computer program for controlling stop of an autonomous vehicle using a speed profile. The method of controlling, by a computing device, stop of an autonomous vehicle using a speed profile includes obtaining surrounding information of an autonomous vehicle, determining candidate routes for controlling stop of the autonomous vehicle on the basis of the surrounding information, calculating scores for candidate driving plans for the autonomous vehicle to travel the determined candidate routes according to a preset speed profile, and finalizing a driving plan for the autonomous vehicle on the basis of the calculated scores. What is claimed is:
| 1. A method of controlling, by a computing device, stop of an autonomous vehicle using a speed profile, the method comprising:
obtaining surrounding information of an autonomous vehicle;
determining candidate routes for controlling stop of the autonomous vehicle on the basis of the surrounding information;
calculating scores for candidate driving plans for the autonomous vehicle to travel the determined candidate routes according to a preset speed profile; and
finalizing a driving plan for the autonomous vehicle on the basis of the calculated scores.
| 2. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and then stop at the determined candidate stop location according to a first speed profile by applying the first speed profile to the autonomous vehicle,
wherein, when a current speed of the autonomous vehicle is v0, a current acceleration is a0, and a distance to the determined candidate stop location is starget, the first speed profile increases or reduces a speed of the autonomous vehicle from v0 to a preset target speed of vtarget using the current acceleration of a0 and a preset sectional acceleration profile, maintains the speed of the autonomous vehicle at vtarget for a certain period from a time point at which the speed of the autonomous vehicle becomes vtarget, and reduces the speed of the autonomous vehicle from vtarget to zero using the preset sectional acceleration profile and stops the autonomous vehicle at the determined candidate stop location after the certain period, and
wherein the certain period is set such that a distance traveled by the autonomous vehicle according to the first speed profile becomes starget.
| 3. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and then stop at the determined candidate stop location according to a second speed profile by applying the second speed profile to the autonomous vehicle,
wherein, when a current speed of the autonomous vehicle is v0, a current acceleration is a0, and a distance to the determined candidate stop location is starget, the second speed profile increases or reduces a speed of the autonomous vehicle from v0 to a preset target speed of vtarget using the current acceleration of a0 and a preset sectional acceleration profile, maintains the speed of the autonomous vehicle at vtarget for a first period from a time point at which the speed of the autonomous vehicle becomes vtarget, reduces the speed of the autonomous vehicle from vtarget to vtail using the preset sectional acceleration profile after the first period, maintains the speed of the autonomous vehicle at vtail for a second period from a time at which the speed of the autonomous vehicle becomes vtail, and reduces the speed of the autonomous vehicle from vtail to zero using the preset sectional acceleration profile and stops the autonomous vehicle at the determined candidate stop location after the second period,
wherein the first period is set such that a distance traveled by the autonomous vehicle according to the second speed profile becomes a difference between starget and stail, and
the second period is set such that a distance traveled by the autonomous vehicle according to the second speed profile becomes stail.
| 4. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and then stop at the determined candidate stop location according to a third speed profile by applying the third speed profile to the autonomous vehicle,
wherein, when a current speed of the autonomous vehicle is v0, a current acceleration is a0, a distance to the determined candidate stop location is starget, the third speed profile reduces a speed of the autonomous vehicle from v0 to zero using the current acceleration of a0, a target acceleration of adecel of the autonomous vehicle, and a preset sectional acceleration profile and stops the autonomous vehicle at the determined candidate stop location, and
wherein adecel is set to a value such that a distance traveled by the autonomous vehicle according to the third speed profile becomes starget.
| 5. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and then stop at the determined candidate stop location according to a fourth speed profile by applying the fourth speed profile to the autonomous vehicle,
wherein, when a current speed of the autonomous vehicle is v0, a current acceleration is a0, and a distance to the determined candidate stop location is starget, the fourth speed profile reduces a speed of the autonomous vehicle from v0 to vtail using the current acceleration of a0, a target acceleration of adecel of the autonomous vehicle, and a preset sectional acceleration profile, maintains the speed of the autonomous vehicle at vtail for a certain period from a time point at which the speed of the autonomous vehicle becomes vtail, and reduces the speed of the autonomous vehicle from vtail to zero using the preset sectional acceleration profile and stops the autonomous vehicle at the determined candidate stop location after the certain period, and
wherein the certain period is set such that a distance traveled by the autonomous vehicle from the time point at which the speed of the autonomous vehicle becomes vtail becomes a difference value stail between starget and a distance stravel,ramp traveled by the autonomous vehicle until the speed of the autonomous vehicle reaches vtail.
| 6. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route according to a fifth speed profile by applying the fifth speed profile to the autonomous vehicle,
wherein, when a current speed of the autonomous vehicle is v0 and a current acceleration is a0, the fifth speed profile increases or reduces a speed of the autonomous vehicle from v0 to vtarget using the current acceleration of a0 and a preset sectional acceleration profile and causes the autonomous vehicle to travel while maintaining the speed of the autonomous vehicle at vtarget.
| 7. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route according to a sixth speed profile by applying the sixth speed profile to the autonomous vehicle,
wherein, when a current speed of the autonomous vehicle is v0, a current acceleration is a0, and a distance to a location at which a preset target speed of vtarget of the autonomous vehicle will be achieved is starget, the sixth speed profile increases or reduces a speed of the autonomous vehicle from v0 to vtarget using the current acceleration of a0, a target acceleration of aadjust of the autonomous vehicle, and a preset sectional acceleration profile and causes the autonomous vehicle to travel while maintaining the speed of the autonomous vehicle at vtarget, and
aadjust is set such that a distance traveled by the autonomous vehicle until the speed of the autonomous vehicle reaches vtarget becomes starget.
| 8. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and stop according to a seventh speed profile by applying the seventh speed profile to the autonomous vehicle,
wherein, when a current speed of the autonomous vehicle is v0 and a current acceleration is a0, the seventh speed profile reduces a speed of the autonomous vehicle from v0 to zero and stops the autonomous vehicle using the current acceleration of a0, a target acceleration of atarget of the autonomous vehicle, and a preset sectional acceleration profile.
| 9. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and stop according to an eighth speed profile by applying the eighth speed profile to the autonomous vehicle,
wherein, when a current speed of the autonomous vehicle is v0, the eighth speed profile reduces a speed of the autonomous vehicle from v0 to zero and stops the autonomous vehicle using a preset acceleration of aemergency, and
aemergency is a value preset without considering a current acceleration of a0 of the autonomous vehicle and a preset sectional acceleration profile.
| 10. The method of claim 1, wherein the calculating of the scores comprises determining whether the determined candidate stop locations correspond to a preset no-stopping zone and correcting the scores calculated for the determined candidate stop locations according to a result of determining whether the determined candidate stop locations correspond to the preset no-stopping zone.
| 11. The method of claim 1, wherein the calculating of the scores comprises calculating the scores for the candidate driving plans for traveling the determined candidate routes using a processor included in the computing device,
when there are a plurality of candidate driving plans for which scores will be calculated because there are the plurality of determined candidate routes or the plurality of candidate stop locations are determined on the determined candidate routes, the calculating of the scores comprises calculating the scores for the plurality of candidate driving plans for traveling the determined candidate routes using a plurality of different processors included in the computing device, the plurality of candidate driving plans including continuously traveling the plurality of candidate routes without stopping or traveling the plurality of candidate routes and then stopping at any one of the plurality of candidate stop locations determined on the plurality of candidate routes, and
the finalizing of the driving plan comprises collecting the scores calculated by the plurality of different processors and finalizing the candidate driving plan having the highest score as the driving plan for the autonomous vehicle.
| 12. The method of claim 1, further comprising:
receiving a target stop location for the autonomous vehicle from a user;
transmitting information on the received target stop location to a server and receiving a control command, which is determined according to scores calculated for the target stop location and a driving plan including a driving method to the target stop location on the basis of the preset speed profile, from the server; and
controlling the autonomous vehicle to stop at the target stop location according to the control command.
| 13. The method of claim 1, further comprising providing guide information of the finalized stop location,
wherein the providing of the guide information comprises providing information on the finalized route, the finalized stop location, and the finalized driving plan through a display provided in the autonomous vehicle, providing the information on the finalized route, the finalized stop location, and the finalized driving plan to another vehicle adjacent to the autonomous vehicle through vehicle-to-vehicle communication, or displaying the finalized stop location on a road on which the autonomous vehicle is traveling through a location display module provided in the autonomous vehicle.
| 14. A device for controlling stop of an autonomous vehicle using a speed profile, the device comprising:
a processor;
a network interface;
a memory; and
a computer program which is loaded into the memory and executed by the processor,
wherein the computer program comprises;
an instruction of obtaining surrounding information of an autonomous vehicle;
an instruction of determining candidate routes for controlling stop of the autonomous vehicle on the basis of the surrounding information;
an instruction of calculating scores for candidate driving plans for the autonomous vehicle to travel the determined candidate routes according to a preset speed profile;
an instruction of finalizing a driving plan for the autonomous vehicle on the basis of the calculated scores; and
an instruction of determining candidate stop locations on the determined candidate routes,
wherein the determining of the candidate stop locations comprises determining, as a candidate stop location, at least one of a location which is spaced a certain distance from a stop line on the determined candidate route, a location which is spaced a certain distance from a location at which an object present on the determined candidate route has stopped or is predicted to stop, and a location input by a driver or a passenger of the autonomous vehicle,
the calculating of the scores comprises calculating scores for the determined candidate stop locations and the candidate driving plans including driving methods to the determined candidate stop locations, and
the finalizing of the driving plan comprises finalizing a route, a stop location, and a driving plan including a driving method to the stop location for the autonomous vehicle on the basis of the calculated score.
| 15. A non-transitory computer-readable recording medium storing a computer program and configured to be coupled to computer hardware, the program including instructions to execute operations of:
obtaining surrounding information of an autonomous vehicle;
determining candidate routes for controlling stop of the autonomous vehicle on the basis of the surrounding information;
calculating scores for candidate driving plans for the autonomous vehicle to travel the determined candidate routes according to a preset speed profile;
finalizing a driving plan for the autonomous vehicle on the basis of the calculated scores; and
determining candidate stop locations on the determined candidate routes,
wherein the determining of the candidate stop locations comprises determining, as a candidate stop location, at least one of a location which is spaced a certain distance from a stop line on the determined candidate route, a location which is spaced a certain distance from a location at which an object present on the determined candidate route has stopped or is predicted to stop, and a location input by a driver or a passenger of the autonomous vehicle,
the calculating of the scores comprises calculating scores for the determined candidate stop locations and the candidate driving plans including driving methods to the determined candidate stop locations, and
the finalizing of the driving plan comprises finalizing a route, a stop location, and a driving plan including a driving method to the stop location for the autonomous vehicle on the basis of the calculated score. | The method involves obtaining surrounding information of an autonomous vehicle (10). Candidate routes (31) are determined for controlling stop of the autonomous vehicle on the basis of the surrounding information. Scores are calculated for candidate driving plans for the vehicle (21) to travel the determined candidate routes according to a preset speed profile. A driving plan is finalized for the vehicle based on the calculated scores. Candidate stop locations (41, 42) are determined on the candidate routes. A route, a stop location, and a driving plan including a driving method to the stop location are finalized based on a calculated score by a computing device, e.g. a personal computer. INDEPENDENT CLAIMS are included for the following: a device for controlling stop of an autonomous vehicle using a speed profile; and a computer program. Method for controlling stop of an autonomous vehicle using a speed profile. The method enables preventing the autonomous vehicle from stopping at an inappropriate location, e.g. on a crosswalk, in a no-stopping or parking zone, at a crossroad, or close to a fire hydrant. The drawing shows a diagram exemplifying candidate routes and candidate stop locations. 10: Autonomous vehicle; 22: Vehicle; 31: First candidate route; 41: First candidate stop location; 42: Second candidate stop location
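As a worked example of the kind of stop-profile arithmetic these claims describe, the sketch below solves the simplest case: a constant deceleration that brings the vehicle from its current speed v0 to rest exactly at a stop target starget. It deliberately ignores the current acceleration a0 and the preset sectional acceleration profile used in the claims, so it is an assumption-laden simplification of the third speed profile, not the claimed method.

    def constant_deceleration_to_stop(v0: float, s_target: float) -> float:
        """Deceleration magnitude (m/s^2) that brings speed v0 (m/s) to zero over s_target (m),
        from the kinematic relation v0**2 = 2 * a * s_target."""
        if s_target <= 0:
            raise ValueError("s_target must be positive")
        return v0 ** 2 / (2.0 * s_target)

    def stopping_time(v0: float, a_decel: float) -> float:
        # Time to reach zero speed under constant deceleration.
        return v0 / a_decel

    v0, s_target = 15.0, 45.0  # 54 km/h with the candidate stop location 45 m ahead
    a = constant_deceleration_to_stop(v0, s_target)
    print(round(a, 2), "m/s^2 over", round(stopping_time(v0, a), 1), "s")  # 2.5 m/s^2 over 6.0 s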
Please summarize the input | METHOD, APPARATUS AND COMPUTER PROGRAM FOR GENERATING SURROUNDING ENVIRONMENT INFORMATION FOR AUTOMATIC DRIVING CONTROL OF VEHICLE. Provided are a method, device, and computer program for generating surrounding environment information for autonomous driving control of a vehicle. According to various embodiments of the present disclosure, a method for generating surrounding environment information for autonomous driving control of a vehicle is a method performed by a computing device, comprising the steps of collecting first sensor data about the surrounding environment of a first vehicle; generating environmental information about the first vehicle by using first sensor data obtained from the first sensor; and correcting the generated surrounding environment information by using. | 1. A method performed by a computing device, comprising: collecting first sensor data relating to a surrounding environment of a first vehicle;
generating surrounding environment information about the first vehicle by using the collected first sensor data; and correcting the generated surrounding environment information using second sensor data related to a surrounding environment of the second vehicle collected from a second vehicle located adjacent to the first vehicle, wherein the correcting of the generated surrounding environment information comprises: setting a reference object using the collected first sensor data and the collected second sensor data;
calculating an error for the set reference object by comparing information on the set reference object included in the collected first sensor data with information on the set reference object included in the collected second sensor data; and correcting the collected second sensor data using the calculated error, and correcting the generated surrounding environment information using the corrected second sensor data. A method for generating surrounding environment information.
| 2. The method of claim 1, wherein the correcting of the generated ambient environment information using the corrected second sensor data comprises: dividing the generated ambient environment information into a shaded area including a location where the first sensor data is not collected and a non-shaded area including a location where the first sensor data is collected; and converting the shaded area into the non-shaded area by correcting the shaded area using the corrected second sensor data.
| 3. The method of claim 2, wherein the converting of the shaded area into the non-shaded area comprises: correcting surrounding environment information of the second vehicle generated according to the collected second sensor data using the calculated error; and correcting the shaded area using the corrected surrounding environment information of the second vehicle.
| 4. delete
| 5. The method of claim 1, wherein the calculating of the error with respect to the set reference object comprises time information included in the collected first sensor data and the collected second sensor data using a time protocol. synchronizing the received time information; and comparing the information on the set reference object included in the first sensor data with which the time information is synchronized with the information on the set reference object included in the second sensor data with which the time information is synchronized to determine the set reference object. A method for generating surrounding environment information for autonomous driving control of a vehicle, comprising calculating an error for
| 6. The method of claim 1, wherein the calculating of the error with respect to the set reference object comprises, when there is a history of occurrence of an event for the second vehicle, the event from the second vehicle based on a history of occurrence of the event. collecting information about a first point in time when
calculating a time error between the collected first sensor data and the collected second sensor data by comparing the first time point with a second time point when the first vehicle detects an event generated from the second vehicle; and correcting time information included in the collected first sensor data and time information included in the collected second sensor data by using the calculated time error, and correcting the time information included in the corrected first sensor data. Comparing information on the reference object with information on the set reference object included in the calibrated second sensor data to calculate an error for the set reference object, How to generate information.
| 7. The method of claim 1, wherein the calculating of the error for the set reference object comprises, when two or more reference objects are set, information on the set two or more reference objects included in the collected first sensor data and the collected calculating two or more position errors for each of the two or more set reference objects by comparing information on the set two or more reference objects included in the second sensor data; and determining a position error between the collected first sensor data and the collected second sensor data by optimizing the calculated sum of the two or more position errors to have a minimum value. A method for generating surrounding environment information for
| 3. The method of claim 2, wherein the converting of the shaded area to the non-shaded area comprises: the calculated error—the calculated error when the surrounding environment information of the second vehicle includes dynamic object information; Time information, location information, and direction information of the dynamic object included in the dynamic object information are corrected using - including time error, position error, and direction error for the object, and the corrected dynamic object information is used to A method of generating surrounding environment information for autonomous driving control of a vehicle, comprising correcting a shaded area.
| 9. The method of claim 1, wherein the setting of the reference object comprises position information about the set reference object included in the collected first sensor data and information about the set reference object included in the collected first sensor data. Comparing location information to calculate a location difference value; and determining whether a reference object set using the collected first sensor data and a reference object set using the second sensor data are the same object according to whether the calculated position difference value is within a predetermined value. A method for generating surrounding environment information for autonomous driving control of a vehicle, comprising:
| 3. The method of claim 2, wherein the converting the shaded area into the non-shaded area comprises, when two or more second sensor data are collected from two or more second vehicles adjacent to the first vehicle, the collected two or more second sensor data. Correcting the shaded area using each of the second sensor data, but calculating the importance of the two or more second vehicles when the collected two or more second sensor data collected at the location corresponding to the shaded area are different and correcting the shadow area using only the second sensor data collected from the second vehicle having the highest calculated importance.
| 11. The method of claim 1, wherein second sensor data about the surrounding environment of the second vehicle is collected from the second vehicle by being directly connected to the second vehicle through V2V communication (Vehicle-to-Vehicle Communication), or a plurality of Connecting to a control server that collects sensor data on the surrounding environment of the vehicle and receiving second sensor data on the surrounding environment of the second vehicle from the control server, further comprising: How to generate environmental information.
| 12. The method of claim 1, wherein the generating of the surrounding environment information comprises: generating a grid map for a predetermined range based on the first vehicle, wherein the grid map includes a plurality of grids; And by recording the collected first sensor data on a grid corresponding to a location where the collected first sensor data was collected, a non-shaded area including a grid on which the collected first sensor data was recorded and the collected first sensor data Generating ambient environment information including a shaded area including a grid in which first sensor data is not recorded, and correcting the generated ambient environment information using the corrected second sensor data, A method of generating ambient environment information for controlling autonomous driving of a vehicle, comprising correcting second sensor data collected at a location corresponding to a grid included in the shaded area and recording the corrected second sensor data in a grid included in the shaded area.
| 13. Processor;
Network interface;
Memory; and a computer program loaded into the memory and executed by the processor, wherein the computer program includes instructions for collecting first sensor data related to a surrounding environment of the first vehicle;
instructions for generating surrounding environment information about the first vehicle by using the collected first sensor data; and instructions for correcting the generated surrounding environment information using second sensor data related to the surrounding environment of the second vehicle collected from a second vehicle located adjacent to the first vehicle, wherein the instructions for correcting the generated surrounding environment information comprise: instructions for setting a reference object using the collected first sensor data and the collected second sensor data;
instructions for calculating an error for the set reference object by comparing information on the set reference object included in the collected first sensor data with information on the set reference object included in the collected second sensor data; and an instruction for correcting the collected second sensor data using the calculated error and correcting the generated surrounding environment information using the corrected second sensor data. A computing device that performs a method for generating surrounding environment information.
| 14. coupled with the computing device, collecting first sensor data relating to the surrounding environment of the first vehicle;
generating surrounding environment information about the first vehicle by using the collected first sensor data; and correcting the generated surrounding environment information using second sensor data related to a surrounding environment of the second vehicle collected from a second vehicle located adjacent to the first vehicle, wherein the correcting of the generated surrounding environment information comprises: setting a reference object using the collected first sensor data and the collected second sensor data;
calculating an error for the set reference object by comparing information on the set reference object included in the collected first sensor data with information on the set reference object included in the collected second sensor data; and correcting the collected second sensor data using the calculated error, and correcting the generated surrounding environment information using the corrected second sensor data. A computer program stored in a recording medium readable by a computing device in order to execute an environmental information generating method. | The method involves setting a reference object using the collected first sensor data and the collected second sensor data. An error for the set reference object is calculated by comparing information on the set reference object included in the collected first sensor data with information on the set reference object included in the collected second sensor data. The collected second sensor data is corrected using the calculated error, and the generated surrounding environment information is corrected using the corrected second sensor data. INDEPENDENT CLAIMS are included for the following: a computing device for generating surrounding environment information for autonomous driving control of a vehicle; and a computer program for generating surrounding environment information for autonomous driving control of a vehicle. Method for generating surrounding environment information for autonomous driving control of a vehicle. Safer self-driving control of the host vehicle is enabled because areas that the host vehicle cannot perceive are removed as a result of the correction. The drawing shows a flowchart illustrating the process for generating surrounding environment information for autonomous driving control of a vehicle. (Drawing includes non-English language text) S110: Step for collecting first sensor data about the surrounding environment of the first vehicle; S120: Step for generating surrounding environment information about the first vehicle; S130: Step for using second vehicle data collected from the second vehicle adjacent to the first vehicle
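The following Python sketch illustrates, under strong simplifying assumptions, the correction idea described above: a single shared reference object yields a position error, that error is applied to the second vehicle's points, and the corrected points fill grid cells in the first vehicle's shaded (unobserved) area. The translation-only correction, the grid encoding, and all names here are illustrative assumptions rather than the claimed implementation.

    def position_error(ref_in_first, ref_in_second):
        # Offset (dx, dy) that maps the second vehicle's frame onto the first vehicle's frame,
        # estimated from one reference object observed by both vehicles.
        return (ref_in_first[0] - ref_in_second[0], ref_in_first[1] - ref_in_second[1])

    def correct_points(points, error):
        dx, dy = error
        return [(x + dx, y + dy) for x, y in points]

    def fill_shaded_cells(grid, corrected_points):
        # grid: dict mapping (i, j) cell -> observation, with None marking a shaded cell.
        for x, y in corrected_points:
            cell = (int(x), int(y))
            if grid.get(cell) is None:
                grid[cell] = "occupied"  # second-vehicle data fills the first vehicle's blind spot
        return grid

    err = position_error(ref_in_first=(12.0, 4.0), ref_in_second=(11.4, 3.7))
    second_vehicle_points = [(20.1, 7.9), (21.0, 8.2)]
    grid = {(20, 8): None, (21, 8): None, (22, 8): "free"}
    print(fill_shaded_cells(grid, correct_points(second_vehicle_points, err)))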
Please summarize the input | Method for automatically driving vehicle in group queue, device and electronic device. The invention claims a method, a device and an electronic device for automatically driving vehicles in a group queue. The method comprises: the RSU side obtains information sent by a target automatic driving vehicle, and periodically broadcasts group request information to confirm, among the remaining automatic driving vehicles, at least one team automatic driving vehicle applying to join the group; obtains team vehicle attribute information of the team automatic driving vehicle, and performs team planning according to the target vehicle attribute information and the team vehicle attribute information to obtain a target team strategy; and sends the target group strategy to the target automatic driving vehicle and the team automatic driving vehicle respectively, so that the target automatic driving vehicle and the team automatic driving vehicle form a group according to the target group strategy. In this way, information interaction between automatic driving vehicles beyond the vehicle communication distance is realized through the RSU side, and grouping is performed along the way, improving the efficiency and safety of the automatic driving vehicles in the group. | 1. A method for automatically driving vehicle in group queue, wherein it is applied to the RSU side; the RSU side is in communication connection with the vehicle side; wherein the vehicle side comprises: a plurality of automatic driving vehicles in the same driving direction, and the distance between any two of the automatic driving vehicles is not less than the vehicle communication distance, the method comprises: obtaining the information sent by the target automatic driving vehicle; wherein the information includes: target vehicle attribute information and team request information; periodically broadcasting the group request information to confirm at least one group automatic driving vehicle of the application group in the rest of the automatic driving vehicles; obtaining the team vehicle attribute information of the team automatic driving vehicle, and performing team planning according to the target vehicle attribute information and the team vehicle attribute information to obtain the target team strategy; sending the target group strategy respectively to the target automatic driving vehicle and the group automatic driving vehicle, so that the target automatic driving vehicle and the group automatic driving vehicle group according to the target group strategy.
| 2. The method according to claim 1, wherein the vehicle attribute information of each said automatic driving vehicle comprises: static information and dynamic information; Wherein, the static information includes: vehicle body parameter and vehicle engine power; the dynamic information comprises: real time position and real time speed; according to the target vehicle attribute information and the team vehicle attribute information of the team planning step, comprising: according to the target dynamic information of the target automatic driving vehicle and the team dynamic information of the team automatic driving vehicle, performing team planning according to the oil consumption as the target, obtaining the target team strategy; Wherein, the target group policy comprises: in the formation process, the target automatic driving vehicle corresponding to the first group speed, the group automatic driving vehicle corresponding to the second group speed, and the group finishing time.
| 3. The method according to claim 2, wherein the step of sending the target team strategy to respectively target automatic driving vehicle and the team automatic driving vehicle comprises the following steps: sending the first group team speed and the group team finishing time to the target automatic driving vehicle, and sending the second group team speed and the group team finishing time to the group team automatic driving vehicle, so that the target automatic driving vehicle according to the first group team speed; the group automatic driving vehicle group according to the second group speed.
| 4. The method according to claim 1, wherein the method further comprises: obtaining the team information sent by the target automatic driving vehicle; Wherein, the team information comprises at least one of the following: group ID information, the ID information of the pilot vehicle, the driving direction of the team, the current position of the team, each vehicle information in the team, the cruising speed of the team, the train space of the team, the member number of the team, the ID list of the team member, the length of the team and the driving route of the team; sending the team information to the team automatic driving vehicle.
| 5. The method according to claim 1, wherein each of the automatic driving vehicle is further configured with a free cruise mode and a group cruise mode; the method further comprises: when monitoring the group is finished, generating mode switching instruction; the mode switching instruction respectively sent to the target automatic driving vehicle and the group automatic driving vehicle, so that the target automatic driving vehicle and the group automatic driving vehicle are switched from the free cruise mode to the group cruise mode according to the mode switching instruction.
| 6. The method according to claim 1, wherein the group request information further carries with priority information; the method further comprises: obtaining the group request information set; wherein the group request information set comprises a plurality of the group request information of the same time, each of the group request information corresponding to different of the automatic driving vehicle; based on the priority information carried by the group request information, determining the target group request information.
| 7. The method according to claim 1, wherein the step of confirming at least one group automatic driving vehicle of the application group in the rest of the automatic driving vehicles comprises: when monitoring the confirmation application team information, the other said automatic driving vehicle, the confirmation application group information corresponding to the automatic driving vehicle is confirmed as the team automatic driving vehicle.
| 8. An automatic driving vehicle in the group device, wherein it is applied to the RSU side; the RSU side is connected with the vehicle side communication; wherein the vehicle side comprises: a plurality of automatic driving vehicle in the same driving direction, and the distance between any two of the automatic driving vehicle is not less than the vehicle communication distance, the device comprises: an obtaining module for obtaining the information sent by the target automatic driving vehicle; Wherein, the information includes: target vehicle attribute information and team request information; a broadcast module, for periodically broadcasting the team request information to confirm at least one group automatic driving vehicle of the application group in the rest of the automatic driving vehicle; planning module, for obtaining the team vehicle attribute information of the team automatic driving vehicle, and according to the target vehicle attribute information and the team vehicle attribute information for team planning, obtaining the target team strategy; a sending module, used for sending the target team strategy to respectively target automatic driving vehicle and the team automatic driving vehicle, so that the target automatic driving vehicle and the team automatic driving vehicle group according to the target team strategy.
| 9. An electronic device, comprising a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the steps of the method according to any one of claims 1-7 when the computer program is executed.
| 10. A computer readable storage medium, wherein the computer readable storage medium stores a computer program; when the computer program is run by the processor, the steps of the method according to any one of the preceding claims 1-7 are executed. | The method involves obtaining information sent by a target automatic driving vehicle, where the information includes target vehicle attribute information and team request information. The group request information is periodically broadcasted. The team vehicle attribute information of the team automatic driving vehicle is obtained. A target team strategy is obtained by performing team planning according to the target vehicle attribute information and the team vehicle attribute information. The target group strategy is sent to the target automatic driving vehicle and the group automatic driving vehicles respectively. The target automatic driving vehicle and the group automatic driving vehicles then form a group according to the target group strategy. INDEPENDENT CLAIMS are included for: (1) a device for automatically driving vehicle in group queue in RSU side; (2) an electronic device comprising a processor and a memory to execute a set of instructions for performing a method for automatically driving vehicle in group queue in RSU side; (3) a computer readable storage medium for storing a set of instructions for performing a method for automatically driving vehicle in group queue in RSU side. Method for automatically driving vehicle in group queue in road side unit (RSU) side. The method enables realizing long-distance automatic driving vehicle information interaction through the RSU side, with grouping along the way, so as to improve the efficiency and safety of the automatic driving vehicles in the group. The method also enables team planning to be performed according to the target dynamic information and the team dynamic information, so that the target driving vehicle and the team driving vehicle form a team according to a target team strategy, improving the efficiency of the vehicles in the group. The drawing shows a flow diagram of a method for automatically driving vehicle in group queue in RSU side. (Drawing includes non-English language text).
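To ground the RSU-side planning step in something executable, here is a small Python sketch of one naive strategy: the lead (target) vehicle holds its cruise speed while the joining vehicle drives slightly faster until the gap closes by the computed finishing time. The fixed closing speed, the message fields, and the vehicle IDs are assumptions made for illustration; the patent instead plans the two grouping speeds with fuel consumption as the optimization target.

    def plan_platoon_speeds(gap_m: float, v_lead_mps: float, closing_speed_mps: float = 2.0):
        """Naive RSU-side plan: lead vehicle keeps v_lead, the joining vehicle drives faster.
        Returns (lead_speed, follower_speed, grouping_finish_time_s)."""
        v_follow = v_lead_mps + closing_speed_mps
        finish_time = gap_m / closing_speed_mps
        return v_lead_mps, v_follow, finish_time

    def build_strategy(target_id: str, member_id: str, gap_m: float, v_lead: float):
        v1, v2, t = plan_platoon_speeds(gap_m, v_lead)
        return {
            "to_target": {"vehicle_id": target_id, "grouping_speed_mps": v1, "finish_time_s": t},
            "to_member": {"vehicle_id": member_id, "grouping_speed_mps": v2, "finish_time_s": t},
        }

    # Vehicles 400 m apart, both travelling at 25 m/s before grouping begins.
    print(build_strategy("AV-001", "AV-007", gap_m=400.0, v_lead=25.0))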
Please summarize the input | AUTONOMOUS VEHICLE COMMUNICATION FRAMEWORK FOR MULTI-NETWORK SCENARIOS. Approaches for Multi-Access Edge Computing (MEC) Vehicle-to-Everything (V2X), Vehicle-To-Vehicle (V2V), and Autonomous Vehicles Distributed Networks (AVDN) functions in a MEC infrastructure are discussed. In various examples, operations and network configurations are described that use a service in an AVDN, including: identifying a service condition (e.g., based on a state of a service and connectivity to an instance of the service); establishing a connection in the AVDN in response to the service condition (e.g., using vehicle-to-vehicle (V2V) or Vehicle-to-Everything (V2X) network communications to the AVDN); and performing a service operation with the service via the AVDN. What is claimed is:
| 1. A user equipment (UE) of a first autonomous vehicle (AV), comprising:
a network interface configured to perform vehicle-to-vehicle (V2V) or Vehicle-to-Everything (V2X) network communications with an autonomous vehicle distributed network (AVDN); and
at least one processor configured to:
identify a service condition based on a state of a service and connectivity to an instance of the service in an infrastructure network;
establish a connection in the AVDN to a second UE at a second AV in response to the service condition, using the V2V or V2X network communications, the AVDN further to provide connectivity between the UE and the second UE for use of the service; and
perform a service operation with the service via the AVDN, using the connection to the second UE. | 2. The UE of claim 1, wherein the AVDN is formed among a plurality of AVs, the AVDN connecting at least the UE of the first AV and the second UE of the second AV.
| 3. The UE of claim 1, wherein the service operation performed by the UE includes providing a service request to the AVDN, wherein the first AV operates as a service requestor, and wherein the second AV operates as a service provider.
| 4. The UE of claim 1, wherein the service operation performed by the UE includes fulfillment of a service request from the AVDN, wherein the first AV operates as a service provider, and wherein the second AV operates as a service requestor.
| 5. The UE of claim 1, wherein the service condition is identified in response to a change of the state of the service, and wherein the change of the state of service is associated with one or more of:
availability of data from the service;
availability of a resource used by the service;
unavailability of the instance of the service in the infrastructure network; or
a safety-related scenario involving the first AV, the second AV, or the service.
| 6. The UE of claim 1, wherein the service condition is identified in response to the UE being located outside a coverage area of the infrastructure network, and wherein the infrastructure network is a wireless network operated from one or more fixed locations and operated in accordance with a standard from a 3rd Generation Partnership Project (3GPP) 5G, Intelligent Transport Systems (ITS)-G5, or Dedicated Short Range Communications (DSRC) family of standards.
| 7. The UE of claim 1, wherein the service operation relates to: data sharing, decision sharing, or task computation sharing; and
wherein the service operation provides fulfillment of an application operating at the first AV or the second AV.
| 8. The UE of claim 1, wherein the service is provided by a Multi-Access Edge Computing (MEC) host,
wherein the MEC host operates according to a standard from a European Telecommunications Standards Institute (ETSI) MEC standards family, and
wherein (i) the UE operates as a MEC client and the second UE operates as the MEC host, or (ii) the UE operates as the MEC host and the second UE operates as a MEC client.
| 9. The UE of claim 1, wherein the service operation is established using an Application Programming Interface (API) for the AVDN, the API for the AVDN providing a standardized interface to invoke the service operation between the UE and the second UE.
| 10. The UE of claim 1, the at least one processor further configured to:
perform authentication of the UE with an authentication server of the AVDN, wherein the connection with the AVDN is established in response to successful authentication.
| 11. The UE of claim 1, wherein the UE is configured by the AVDN to operate as an anchor service provider, wherein the at least one processor is further configured to:
perform a service request with a third UE of a third AV;
obtain service response data, in response to the service request with the third UE; and
provide the service response data to the second UE.
| 12. At least one non-transitory machine readable medium including instructions for coordinating service operations from a first user equipment (UE) of an autonomous vehicle (AV) with an autonomous vehicle distributed network (AVDN), wherein the instructions, when executed by processing circuitry, cause the processing circuitry to perform operations comprising:
identify a service condition, based on a state of a service and connectivity to an instance of the service in an infrastructure network;
establish a connection in the AVDN to a second UE at a second AV in response to the service condition, using vehicle-to-vehicle (V2V) or Vehicle-to-Everything (V2X) network communications to the AVDN, the AVDN further to provide connectivity between the UE and the second UE for use of the service; and
perform a service operation with the service via the AVDN, using the connection to the second UE.
| 13. The non-transitory machine readable medium of claim 12, wherein the AVDN is formed among a plurality of AVs, the AVDN connecting at least the UE of the first AV and the second UE of the second AV.
| 14. The non-transitory machine readable medium of claim 12, wherein the service operation performed by the UE includes providing a service request to the AVDN, wherein the first AV operates as a service requestor, and wherein the second AV operates as a service provider.
| 15. The non-transitory machine readable medium of claim 12, wherein the service operation performed by the UE includes fulfillment of a service request from the AVDN, wherein the first AV operates as a service provider, and wherein the second AV operates as a service requestor.
| 16. The non-transitory machine readable medium of claim 12, wherein the service condition is identified in response to a change of the state of the service, and wherein the change of the state of service is associated with one or more of:
availability of data from the service;
availability of a resource used by the service;
unavailability of the instance of the service in the infrastructure network; or
a safety-related scenario involving the first AV, the second AV, or the service.
| 17. The non-transitory machine readable medium of claim 12, wherein the service condition is identified in response to the UE being located outside a coverage area of the infrastructure network, and wherein the infrastructure network is a wireless network operated from one or more fixed locations and operated in accordance with a standard from a 3rd Generation Partnership Project (3GPP) 5G, Intelligent Transport Systems (ITS)-G5, or Dedicated Short Range Communications (DSRC) family of standards.
| 18. The non-transitory machine readable medium of claim 12, wherein the service operation relates to: data sharing, decision sharing, or task computation sharing; and
wherein the service operation provides fulfillment of an application operating at the first AV or the second AV.
| 19. The non-transitory machine readable medium of claim 12, wherein the service is provided by a Multi-Access Edge Computing (MEC) host,
wherein the MEC host operates according to a standard from a European Telecommunications Standards Institute (ETSI) MEC standards family, and
wherein (i) the UE operates as a MEC client and the second UE operates as the MEC host, or (ii) the UE operates as the MEC host and the second UE operates as a MEC client.
| 20. The non-transitory machine readable medium of claim 12, wherein the service operation is established using an Application Programming Interface (API) for the AVDN, the API for the AVDN providing a defined interface to invoke the service operation between the UE and the second UE.
| 21. The non-transitory machine readable medium of claim 12, the instructions further to perform operations comprising:
performing authentication of the UE with an authentication server of the AVDN, wherein the connection with the AVDN is established in response to successful authentication.
| 22. The non-transitory machine readable medium of claim 12, wherein the UE is configured by the AVDN to operate as an anchor service provider, the instructions further to perform operations comprising:
performing a service request with a third UE of a third AV;
obtaining service response data, in response to the service request with the third UE; and
providing the service response data to the second UE.
| 23. A system, comprising:
at least one network communication device adapted to perform vehicle-to-vehicle (V2V) or Vehicle-to-Everything (V2X) network communications; and
at least one processing device that, when in operation, is configured by instructions to:
operate the at least one network communication device to establish an autonomous vehicle distributed network (AVDN), using the V2V or V2X network communications;
receive a service request via the AVDN, the AVDN further to provide connectivity between the at least one processing device and at least one other device to operate at least one service;
identify a service condition based on the service request; and
perform a service operation with the at least one service, via the AVDN, based on the identified service condition.
| 24. The system of claim 23, wherein the at least one network communication device and the at least one processing device is included in a first autonomous vehicle (AV), wherein the service operation includes providing a service request to at least a second AV accessible via the AVDN, wherein the first AV operates as a service requestor, and wherein the second AV operates as a service provider.
| 25. The system of claim 23, wherein the at least one network communication device and the at least one processing device is included in a first autonomous vehicle (AV), wherein the service operation includes fulfillment of a service request from at least a second AV accessible via the AVDN, wherein the first AV operates as a service provider, and wherein the second AV operates as a service requestor. | The user equipment has a network interface configured to perform vehicle-to-vehicle (V2V) or Vehicle-to-Everything (V2X) network communications with an autonomous vehicle distributed network (AVDN). A processor (1704) identifies a service condition based on a state of a service and connectivity to an instance of the service in an infrastructure network, and establishes a connection in the AVDN to a second UE at a second AV in response to the service condition using the V2V or V2X network communications. The AVDN provides connectivity between the first UE and the second UE for use of the service. The processor performs a service operation with the service through the AVDN using the connection to the second UE. INDEPENDENT CLAIMS are included for: (1) a non-transitory machine-readable medium including instructions for coordinating service operations from a first UE of an AV with the AVDN; (2) a system for coordinating service operations from a first UE of an AV with the AVDN. User equipment (UE) for a first autonomous vehicle (AV), e.g. a car, for coordinating service operations from the first AV with an autonomous vehicle distributed network (AVDN), in network settings such as multi-access edge computing (MEC) infrastructures and in multi-mobile network operator (MNO) scenarios. The method provides reduced latency, increased responsiveness, and more available computing power than offered in traditional cloud network services and wide area network connections. The method allows a cloud consumer to unilaterally provision computing capabilities such as server time and network storage, as needed automatically without requiring human interaction with a service's provider, so that the capabilities can be rapidly and elastically provisioned automatically to quickly scale out and rapidly released with minimal management effort or interaction with the service provider. The drawing shows a block diagram of a compute node system. 1700Edge compute node 1702Compute circuitry 1704Processor 1706Memory 1710Data storage
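As a rough illustration of the claimed fallback behavior (infrastructure service unavailable or out of coverage, so the UE turns to the AVDN over V2V/V2X), the Python sketch below models the decision. The `AvdnClient` class, its fields, and the `ServiceState` enum are hypothetical and are not taken from the patent or from any MEC/3GPP API.

```python
# Illustrative sketch (not the claimed implementation): deciding when a UE
# should fall back from an infrastructure-hosted service to the AVDN.
# Class names, fields, and the ServiceState enum are hypothetical.
from enum import Enum, auto

class ServiceState(Enum):
    AVAILABLE = auto()
    DEGRADED = auto()
    UNAVAILABLE = auto()

class AvdnClient:
    def __init__(self, has_infra_coverage: bool, infra_service_state: ServiceState):
        self.has_infra_coverage = has_infra_coverage
        self.infra_service_state = infra_service_state
        self.connected_peer = None

    def service_condition(self) -> bool:
        """True when the infrastructure instance cannot serve the UE."""
        return (not self.has_infra_coverage
                or self.infra_service_state is not ServiceState.AVAILABLE)

    def establish_v2v_connection(self, peer_ue_id: str) -> None:
        # Placeholder for the actual V2V/V2X connection setup.
        self.connected_peer = peer_ue_id

    def perform_service_operation(self, request: dict) -> dict:
        if self.service_condition():
            if self.connected_peer is None:
                self.establish_v2v_connection(peer_ue_id="nearby-av-ue")
            return {"served_by": f"avdn:{self.connected_peer}", "request": request}
        return {"served_by": "infrastructure", "request": request}

if __name__ == "__main__":
    ue = AvdnClient(has_infra_coverage=False,
                    infra_service_state=ServiceState.UNAVAILABLE)
    print(ue.perform_service_operation({"op": "share_sensor_data"}))
```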
Please summarize the input | CLIMATE BASED SELF- SPEED CONTROL SYSTEM IN CAR USING ARTIFICIAL INTELLIGENCEThis system introduces a paradigm shift in vehicular autonomy, integrating adaptive artificial intelligence to not only enable autonomous driving but also dynamically adjust vehicle speed based on real-time climate and environmental conditions. These sensors meticulously capture and feed real-time data on the vehicle's surroundings into a robust artificial intelligence framework. Unlike conventional systems, our solution employs cutting-edge machine learning algorithms to process and fuse this data, facilitating precise decision-making. The self-speed control system transcends the traditional boundaries of autonomous driving by actively responding to an array of environmental factors. It continuously monitors and adapts to factors such as traffic conditions, weather dynamics, temperature fluctuations, wind patterns. Air Quality Index (AQl), road infrastructure, and unforeseen obstacles. Crucially, the system operates with a paramount focus on safety, restraining speed until optimal environmental conditions are assured. This dynamic approach ensures not only safe but also smooth and efficient navigation, even in the most challenging and unpredictable environments. Through this innovative self-speed control system, we pave the way for a safer, more efficient transportation landscape.|1. The Climate based self-speed control system in car using Artificial lnteligence comprises: LiDAR sensor(1), Radar sensors(2), ultrasonic sensor(3), Steering system(4). Throttle and brakes(5), Battery(6), Cellular network(7),V2x Communication(8), GPS and inertial navigation systems(10).
| 2. LiDAR (Light Detection and Ranging)(1): This sensor emits laser pulses and measures the reflected light to create a highly accurate 3D map of the surroundings. It's like having superpowered vision, able to see obstacles in darkness, fog, and even behind corners.
| 3. Radar (Radio Detection and Ranging)(2): Similar to LiDAR(1), radar uses radio waves to detect objects and measure their distance and speed. It's a good backup for LiDAR(1), especially in bad weather conditions.
| 4. Ultrasonic sensors(3): These sensors emit high-frequency sound waves to detect nearby objects, providing short-range obstacle detection, especially useful for parking and manoeuvring in tight spaces.
| 5. Steering system(4): The car's steering wheel is controlled by electric motors or hydraulic actuators that turn the wheels based on the decisions made by the computer.
| 6. Throttle and brakes(5): The car's speed is controlled by electronically controlled motors that adjust the throttle and apply the brakes as needed.
| 7. Battery(6): Self-Speed control cars typically use large batteries to power all the on-board electronics and sensors. Some may have hybrid systems with an additional engine for range extension.
| 8. Cellular network(7): Self-Speed control cars can connect to the cellular network to download maps, traffic updates, and communicate with other vehicles or infrastructure.
| 9. V2X (Vehicle-to-Everything) communication(8): This technology allows cars to communicate directly with each other and with roadside infrastructure, further enhancing safety and traffic flow.
| 10. GPS and inertial navigation systems(9): These provide precise location and direction information, even in areas with limited cellular coverage. | The system has a light detection and ranging (LiDAR) sensor for emitting laser pulses and measuring reflected light to create a highly accurate three-dimensional (3D) map of surroundings. Ultrasonic sensors emit high-frequency sound waves to detect nearby objects for providing short-range obstacle detection. Electric motors or hydraulic actuators control the car's steering wheel and turn the wheels based on the decisions made by a computer. Electronically controlled motors control the car's speed and adjust a throttle. A battery powers all on-board electronics and sensors. INDEPENDENT CLAIMS are included for: (1) LiDAR (Light Detection and Ranging): This sensor emits laser pulses and measures the reflected light to create a highly accurate 3D map of the surroundings. It's like having superpowered vision; (2) radar (Radio Detection and Ranging): Similar to LiDAR; (3) ultrasonic sensors: These sensors emit high-frequency sound waves to detect nearby objects; (4) steering system: The car's steering wheel is controlled by electric motors or hydraulic actuators that turn the wheels based on the decisions made by the computer; (5) throttle and brakes: The car's speed is controlled by electronically controlled motors that adjust the throttle and apply the brakes as needed; (6) battery: Self-Speed control cars typically use large batteries to power all the on-board electronics and sensors. Some may have hybrid systems with an additional engine for range extension; (7) cellular network: Self-Speed control cars can connect to the cellular network to download maps; (8) V2X (Vehicle-to-Everything) communication: This technology allows cars to communicate directly with each other and with roadside infrastructure; (9) GPS and inertial navigation systems: These provide precise location and direction information. Climate based self-speed control system for autonomous vehicles i.e. self-driving cars. The system operates in a manner that restricts the driver from increasing the speed until the environment reaches a climate-neutral state, thus ensuring a safer and more efficient mode of transportation. By harnessing cutting-edge technologies and their seamless integration, the system aims to redefine the capabilities of autonomous vehicles, specifically targeting their adaptability to varying climate conditions, ultimately enhancing road safety and optimizing transportation efficiency in diverse environmental settings. The system is capable of empowering vehicles to autonomously regulate their speed, navigate diverse and challenging environments, and dynamically adapt to various driving scenarios.
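For intuition only, the sketch below shows one way a climate-aware speed cap could be computed from the kinds of inputs the claims mention (visibility, wind, AQI, obstacles). The thresholds and the `permitted_speed_kmh` function are invented for the example and are not the patented control law.

```python
# Illustrative sketch (not from the claims): capping the permitted speed from
# environmental inputs. The thresholds below are invented for the example.
def permitted_speed_kmh(base_limit_kmh: float,
                        visibility_m: float,
                        wind_speed_kmh: float,
                        aqi: int,
                        obstacle_detected: bool) -> float:
    """Return a conservative speed cap given current conditions."""
    if obstacle_detected:
        return 0.0
    cap = base_limit_kmh
    if visibility_m < 50:          # dense fog / heavy rain
        cap = min(cap, 30.0)
    elif visibility_m < 150:
        cap = min(cap, 60.0)
    if wind_speed_kmh > 70:        # strong crosswind
        cap = min(cap, 80.0)
    if aqi > 300:                  # hazardous air quality (assumed haze)
        cap = min(cap, 50.0)
    return cap

if __name__ == "__main__":
    print(permitted_speed_kmh(100.0, visibility_m=120, wind_speed_kmh=40,
                              aqi=180, obstacle_detected=False))  # -> 60.0
```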
Please summarize the input | TRANSMISSION CONTROL IN APPLICATION LAYER BASED ON RADIO BEARER QUALITY METRICS IN VEHICULAR COMMUNICATIONMethods, apparatuses, and computer-readable mediums for wireless communication are disclosed by the present disclosure. In an aspect, an application layer in a user equipment (UE) receives, from an access layer in the UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs. The application layer performs a transmission control over the vehicular communication based on the QoS indication.What is claimed is:
| 1. A method of wireless communication, comprising:
receiving, by an application layer in a user equipment (UE), from an access layer in the UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and
performing, at the application layer, a transmission control over the vehicular communication based on the QoS indication.
| 2. The method of claim 1, wherein the metric is indicative of a message reception performance as affected by a presence or an absence of message interference or collision in the one or more radio bearers.
| 3. The method of claim 1, wherein the performing comprises adjusting a transmission rate of a unicast communication of the UE, according to the QoS indication.
| 4. The method of claim 1, wherein the performing comprises adjusting a transmission range of a groupcast communication of the UE, according to the QoS indication.
| 5. The method of claim 1, wherein the performing comprises adjusting a maneuver of the UE, according to the QoS indication.
| 6. The method of claim 1, wherein the performing comprises adjusting an autonomous driving status of the UE, according to the QoS indication.
| 7. The method of claim 1, further comprising sharing sensor data of the UE with a remote UE via a unicast communication at a first transmission rate.
| 8. The method of claim 7, wherein the performing comprises:
determining, by the application layer, based on the QoS indication, a second transmission rate supportable by the unicast communication; and
adjusting the unicast communication according to the second transmission rate.
| 9. The method of claim 8, wherein the adjusting comprises performing inter-transmission time (ITT) control at the UE.
| 10. The method of claim 8,
wherein the sharing comprises sharing video sensor data of the UE with the remote UE over the unicast communication; and
wherein the adjusting comprises adjusting a video resolution of a video codec of the UE according to the second transmission rate supportable by the unicast communication.
| 11. The method of claim 7, wherein the receiving comprises receiving a packet error rate (PER) related to the unicast communication with the remote UE.
| 12. The method of claim 7, wherein the receiving comprises receiving a negative acknowledgement (NACK) statistic related to the unicast communication with the remote UE.
| 13. The method of claim 1, further comprising:
communicating, by the UE, with a plurality of other UEs via a groupcast communication; and
wherein the receiving comprises receiving at least one of a packet error rate (PER) or a negative acknowledgement (NACK) statistic related to the groupcast communication with the plurality of other UEs.
| 14. The method of claim 13, wherein the performing comprises:
determining, based on the at least one of the PER or the NACK statistic, that a reachable range of the UE fails to comply with a minimum range requirement of a vehicular application configured for controlling a maneuver of the UE.
| 15. The method of claim 14, wherein the performing further comprises cancelling the maneuver of the UE.
| 16. The method of claim 14, wherein the performing further comprises postponing the maneuver of the UE.
| 17. The method of claim 14, wherein the performing further comprises regenerating a driving strategy of the UE to match the reachable range.
| 18. The method of claim 14, wherein the performing further comprises:
modifying a range of the UE according to the reachable range; and
adjusting the maneuver of the UE based on the range.
| 19. The method of claim 18, wherein the modifying comprises adjusting a radiated power of the UE.
| 20. The method of claim 18, wherein adjusting the maneuver comprises slowing down the UE.
| 21. The method of claim 18, wherein adjusting the maneuver comprises following a stop and go operation at the UE.
| 22. The method of claim 18, wherein adjusting the maneuver comprises exiting an autonomous driving mode at the UE.
| 23. The method of claim 14, wherein the maneuver comprises a coordinated intersection crossing.
| 24. The method of claim 1, wherein the QoS indication comprises one or more of a Packet Error Rate (PER), a Packet Received Rate (PRR), an average number of retransmissions, an average PER, an average PRR, or an acknowledgement (ACK)/negative acknowledgement (NACK) statistic.
| 25. The method of claim 1, wherein the QoS indication comprises a range statistic of a groupcast group.
| 26. The method of claim 1, wherein the QoS indication comprises a supported bit rate for a radio bearer.
| 27. The method of claim 1, wherein the vehicular communication comprises a new radio (NR) vehicle-to-everything (V2X) communication.
| 28. A user equipment (UE) for wireless communication, comprising:
a memory storing instructions; and
a processor in communication with the memory, wherein the processor is configured to execute the instructions to:
receive, by an application layer in the UE, from an access layer in the UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and
perform, at the application layer, a transmission control over the vehicular communication based on the QoS indication.
| 29. The UE of claim 28, wherein the processor is further configured to execute the instructions to adjust a transmission rate of a unicast communication of the UE, according to the QoS indication.
| 30. The UE of claim 28, wherein the processor is further configured to execute the instructions to adjust a transmission range of a groupcast communication of the UE, according to the QoS indication. | The method involves using the application layer (142) in a user equipment (UE) (148) to receive a QoS indication (144) from the access layer (146) in the UE. The QoS indication includes a metric that represents a quality of the radio bearers used for vehicular communication with other UEs (104,149). The application layer then performs transmission control over the vehicular communication based on the QoS indication. An INDEPENDENT CLAIM is also included for a UE used for wireless communication. Wireless communication method for use in vehicular communication systems. Can be used in transmission control in application layer based on radio bearer quality metrics in vehicular communication, including vehicle-to-vehicle (V2V) communication, vehicle-to-pedestrian (V2P) communication, vehicle-to-everything (V2X) communication, enhanced vehicle-to-everything (eV2X) communication, and cellular vehicle-to-everything (C-V2X) communication. Provides a wireless communication method that ensures improved autonomous driving, e.g., in self-driving vehicles operating with reduced or zero human input, and improved driving experience, e.g., improved non-autonomous human driving. The drawing shows a schematic diagram illustrating a wireless communication system and an access network. 104,149Other UEs142Application layer144QoS indication146Access layer148UE |
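To illustrate the kind of application-layer transmission control the claims describe (adjusting a unicast rate from a QoS indication such as a PER), here is a minimal, hypothetical sketch. The rate table and PER thresholds are assumptions, not values from the patent or any 3GPP specification.

```python
# Illustrative sketch (not the claimed algorithm): an application-layer rate
# adaptation step driven by a QoS indication from the access layer. The PER
# thresholds and the rate table are hypothetical.
RATE_STEPS_MBPS = [2.0, 5.0, 10.0, 20.0]  # candidate sensor/video sharing rates

def adapt_tx_rate(current_rate_mbps: float, packet_error_rate: float) -> float:
    """Step the unicast transmission rate down on high PER, up on low PER."""
    idx = RATE_STEPS_MBPS.index(current_rate_mbps)
    if packet_error_rate > 0.10 and idx > 0:          # radio bearer struggling
        return RATE_STEPS_MBPS[idx - 1]
    if packet_error_rate < 0.01 and idx < len(RATE_STEPS_MBPS) - 1:
        return RATE_STEPS_MBPS[idx + 1]               # headroom available
    return current_rate_mbps

if __name__ == "__main__":
    rate = 10.0
    for per in (0.002, 0.15, 0.20, 0.005):            # simulated QoS indications
        rate = adapt_tx_rate(rate, per)
        print(f"PER={per:.3f} -> rate={rate} Mbps")
```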
Please summarize the input | CONGESTION CONTROL FOR NR V2XIn one aspect, a method includes determining, by a user equipment (UE), a channel busy ratio (CBR) window for a CBR measurement for one or more resources; determining, by the UE, a CBR measurement value for the CBR window and for the one or more resources; determining, by the UE, a channel occupancy ratio (CR) window based on a first number of subframes used for a history of past transmissions and based on a second number of subframes used for future planned transmissions and corresponding retransmissions; and determining, by the UE, a CR value for the CR window based on subchannels used for the one or more resources for the first number of subframes and based on subchannels estimated for the one or more resources for the second number of subframes. In another aspect, a method includes determining a CR window based on a CBR measurement value. | The method involves determining (700) a channel busy ratio (CBR) window for a CBR measurement for resources by a user equipment (UE). A CBR measurement value for the CBR window and for the resources is determined (701) by the UE. A channel occupancy ratio (CR) window is determined based on a first number of sub-frames used for a history of past transmissions and based on a second number of sub-frames used for future planned transmissions and corresponding retransmissions by the UE. A CR value for the CR window is determined based on sub-channels used for the resources for the first number of sub-frames and based on sub-channels estimated for the resources for the second number of sub-frames by the UE. INDEPENDENT CLAIMS are included for the following:(1) an apparatus configured for wireless communication for supporting enhanced congestion control for vehicle-to-everything (V2X) in new radio (NR); and(2) a non-transitory computer-readable medium storing a program for supporting enhanced congestion control for vehicle-to-everything (V2X) in new radio (NR). Method for supporting enhanced congestion control for vehicle-to-everything (V2X) in new radio (NR) of wireless communication. The method increases reliability and throughput, reduces latency, and enables operation in ultra-reliable low latency communications (URLLC) modes. The base stations take advantage of the higher dimension multiple input, multiple output (MIMO) capabilities to exploit three-dimensional beam-forming in both elevation and azimuth to increase coverage and capacity. The enhanced congestion control operations enable more aperiodic communications to be transmitted, and thus increase throughput and reduce latency. The drawing shows a block diagram of the blocks executed by a configured UE. 700Step for determining a CBR window for a CBR measurement for resources 701Step for determining a CBR measurement value for the CBR window and for the resources 702Step for determining a CR window based on the CBR measurement value
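The CR computation described in the abstract lends itself to a short worked example: occupancy is tallied over a window spanning past used subframes and future planned (re)transmissions, then divided by the total subchannels in that window. The sketch below assumes example window sizes and subchannel counts; none of the numbers are mandated by the claims or by any specification.

```python
# Illustrative sketch: computing a channel occupancy ratio (CR) over a window
# that spans past transmissions and future planned (re)transmissions, in the
# spirit of the abstract above. Window sizes and inputs are example values.
def channel_occupancy_ratio(used_past: list[int],
                            planned_future: list[int],
                            subchannels_per_subframe: int) -> float:
    """Each list entry is the number of sub-channels occupied in one subframe."""
    occupied = sum(used_past) + sum(planned_future)
    total = (len(used_past) + len(planned_future)) * subchannels_per_subframe
    return occupied / total if total else 0.0

if __name__ == "__main__":
    past = [2, 0, 1, 0, 2, 0, 1, 0]        # history: 8 subframes
    future = [1, 0, 1, 0]                  # planned tx + retx: 4 subframes
    cr = channel_occupancy_ratio(past, future, subchannels_per_subframe=5)
    print(f"CR = {cr:.3f}")                # -> CR = 0.133
```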
Please summarize the input | Automated control of headlight illumination by onboard vehicle-to-everything (V2X) deviceIn an aspect, a method of wireless communication performed by a vehicle-to-everything (V2X) device onboard a vehicle includes receiving one or more V2X safety messages indicating a potential safety condition related to illumination of an object; determining, in response to the one or more V2X safety messages, that the object is within or approaching a target area in which illumination of the object by headlights of the vehicle can be adjusted; and controlling an illumination intensity and/or an illumination pattern of the headlights of the vehicle in response to determining that the object is within the target area.What is claimed is:
| 1. A method of wireless communication performed by a vehicle-to-everything (V2X) device onboard a vehicle, comprising:
receiving one or more V2X safety messages indicating a potential safety condition related to illumination of an object, wherein the one or more V2X safety messages indicate information relating to a location of the object;
determining, in response to the information relating to the location of the object indicated by the one or more V2X safety messages, that the object is within or approaching a target area in which illumination of the object by headlights of the vehicle can be adjusted;
determining one or more occluded regions of the target area based on topographical features identified by the V2X device;
determining whether the object is within or outside of the one or more occluded regions based on the information relating to the location of the object indicated by the one or more V2X safety messages; and
controlling an illumination intensity and/or an illumination pattern of the headlights of the vehicle in response to determining that the object is within the target area outside of the one or more occluded regions.
| 2. The method of claim 1, further comprising:
determining the target area based on a current illumination intensity and/or a current illumination pattern of the headlights of the vehicle.
| 3. The method of claim 1, further comprising:
determining the target area based on an illumination capability of the headlights of the vehicle.
| 4. The method of claim 1, further comprising:
determining the target area based on a current position of the vehicle and/or a projected future position of the vehicle using one or more vehicular sensors.
| 5. The method of claim 1, wherein:
the one or more occluded regions of the target area are determined based on topographical features identified by the V2X device using sensor data received from one or more vehicular sensors.
| 6. The method of claim 5, wherein:
the one or more vehicular sensors include one or more light detection and ranging (LIDAR) sensors.
| 7. The method of claim 5, wherein:
the one or more vehicular sensors include one or more radio detection and ranging (RADAR) sensors.
| 8. The method of claim 5, wherein:
the one or more vehicular sensors include one or more image sensors.
| 9. The method of claim 1, further comprising:
detecting ambient lighting conditions exterior to the vehicle; and
determining the target area based on the ambient lighting conditions.
| 10. The method of claim 1, wherein controlling the illumination intensity and/or the illumination pattern of the headlights comprises: controlling illumination intensities of one or more of a plurality of light-emitting elements of the headlights.
| 11. The method of claim 10, wherein:
the illumination intensities of the one or more of the plurality of light-emitting elements of the headlights are adjusted to provide an illumination pattern that emits light at an increased intensity toward the object.
| 12. The method of claim 11, wherein:
the illumination pattern is asymmetric between a left-side headlight illumination intensity and a right-side headlight illumination intensity.
| 13. The method of claim 1, wherein: the potential safety condition is a condition in which the object should be illuminated to make the object visible to a driver of the vehicle; and the illumination intensity and/or the illumination pattern of the headlights are controlled to increase intensity of light beams emitted toward the object.
| 14. The method of claim 1, wherein: the potential safety condition is a condition in which illumination of the object should be limited to reduce a likelihood of blinding an individual at the object; and the illumination intensity and/or the illumination pattern of the headlights are controlled to decrease intensity of light beams emitted toward the object.
| 15. The method of claim 1, wherein: the potential safety condition is a condition in which the object should be illuminated to make the object visible to one or more image sensors used in an autonomous driving system of the vehicle; and the illumination intensity and/or the illumination pattern of the headlights are controlled to increase intensity of light beams emitted toward the object.
| 16. The method of claim 1, wherein: the potential safety condition is a condition in which illumination of the object should be limited to reduce a likelihood of overexposing one or more image sensors at the object; and the illumination intensity and/or the illumination pattern of the headlights are controlled to decrease intensity of light beams emitted toward the object.
| 17. The method of claim 1, wherein: the one or more V2X safety messages indicating the potential safety condition related to illumination of the object are received from a roadside unit (RSU).
| 18. The method of claim 1, wherein: determining that the object is within the target area comprises determining a current position of the object and/or a projected future position of the object based on the information relating to the location of the object indicated by the one or more V2X safety messages and/or further V2X communications.
| 19. The method of claim 18, wherein:
the current position of the object is based on the information relating to the location of the object indicated by the one or more V2X safety messages.
| 20. The method of claim 19, wherein the projected future position of the object is based on the current position of the object and:
a speed of the object indicated in V2X communications relating to the object,
a heading of the object indicated in V2X communications relating to the object,
a projected path of the object indicated in V2X communications relating to the object, or
any combination thereof.
| 21. The method of claim 1, wherein:
the one or more occluded regions of the target area are determined based on the topographical features identified by the V2X device from map data.
| 22. The method of claim 21, wherein:
the one or more occluded regions of the target area are determined based on road lanes used by the object as determined from the map data.
| 23. The method of claim 21, wherein:
the map data includes V2X map data.
| 24. The method of claim 21, wherein:
the map data includes local map data.
| 25. A vehicle-to-everything (V2X) device onboard a vehicle, comprising:
a memory;
at least one transceiver; and
at least one processor communicatively coupled to the memory and the at least one transceiver, the at least one processor configured to:
receive, via the at least one transceiver, one or more V2X safety messages indicating a potential safety condition related to illumination of an object, wherein the one or more V2X safety messages indicate information relating to a location of the object;
determine, in response to the information relating to the location of the object indicated by the one or more V2X safety messages, that the object is within or approaching a target area in which illumination of the object by headlights of the vehicle can be adjusted;
determine one or more occluded regions of the target area based on topographical features identified by the V2X device;
determine whether the object is within or outside of the one or more occluded regions based on the V2X safety information indicating the location of the object; and control an illumination intensity and/or an illumination pattern of the headlights of the vehicle in response to determining that the object is within the target area.
| 26. The V2X device of claim 25, wherein the at least one processor is further configured to:
determine the target area based on a current illumination intensity and/or a current illumination pattern of the headlights of the vehicle.
| 27. The V2X device of claim 25, wherein the at least one processor is further configured to:
determine the target area based on an illumination capability of the headlights of the vehicle.
| 28. The V2X device of claim 25, wherein the at least one processor is further configured to:
determine the target area based on a current position of the vehicle and/or a projected future position of the vehicle using one or more vehicular sensors.
| 29. The V2X device of claim 25, wherein:
the one or more occluded regions of the target area are determined based on topographical features identified by the V2X device using sensor data received from one or more vehicular sensors.
| 30. The V2X device of claim 29, wherein:
the one or more vehicular sensors include one or more light detection and ranging (LIDAR) sensors.
| 31. The V2X device of claim 29, wherein:
the one or more vehicular sensors include one or more radio detection and ranging (RADAR) sensors.
| 32. The V2X device of claim 29, wherein:
the one or more vehicular sensors include one or more image sensors.
| 33. The V2X device of claim 25, wherein the at least one processor configured to control the illumination intensity and/or the illumination pattern of the headlights comprises the at least one processor configured to:
control illumination intensities of one or more of a plurality of light-emitting elements of the headlights.
| 34. The V2X device of claim 33, wherein:
the illumination intensities of the one or more of the plurality of light-emitting elements of the headlights are controlled to provide an illumination pattern that emits light at an increased intensity toward the object.
| 35. The V2X device of claim 33, wherein:
the illumination pattern is asymmetric between a left-side headlight illumination intensity and a right-side headlight illumination intensity.
| 36. The V2X device of claim 25, wherein:
determining that the object is within the target area comprises determining a current position of the object and/or a projected future position of the object based on the information relating to the location of the object indicated by the one or more V2X safety messages and/or further V2X communications.
| 37. The V2X device of claim 36, wherein:
the current position of the object is based on the information relating to the location of the object indicated by the one or more V2X safety messages.
| 38. The V2X device of claim 25, wherein:
the one or more occluded regions of the target area are determined based on the topographical features identified by the V2X device from map data. | The method involves receiving vehicle-to-everything (V2X) safety messages indicating a potential safety condition related to illumination of an object (702). Occluded regions of a target area are determined based on topographical features identified by a V2X device. Determination is made (704) to check whether the object is within or outside of the regions based on information relating to a location of the object indicated by the safety messages. An illumination intensity and/or an illumination pattern of headlights of the vehicle are controlled (706) in response to determining that the object is within the target area. An INDEPENDENT CLAIM is included for a V2X device onboard a vehicle. Method for performing wireless communication by a V2X device onboard a vehicle. The method enables utilizing a V2X safety module to determine that the object is within or approaching the target area in which illumination of the object by headlights of the vehicle can be adjusted in response to the V2X safety messages, and to control the illumination intensity and/or the illumination pattern of the headlights of the vehicle based on the determined object in an efficient manner. The drawing shows a flow chart illustrating a method for performing wireless communication by a V2X device onboard a vehicle. 702Receiving V2X safety messages indicating potential safety condition related to illumination of object 704Determining whether object is within or outside of regions based on information relating to location of object indicated by safety messages 706Controlling illumination intensity and/or illumination pattern of headlights of vehicle in response to determining that object is within target area
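As a toy illustration of the claimed behavior (check whether a V2X-reported object lies in the headlights' target area and outside occluded regions, then raise or lower beam intensity on the side facing it), the sketch below uses a simplified 2-D geometry. The `ObjectReport` fields, angles, and intensity values are all assumptions for the example.

```python
# Illustrative sketch (not the patented method): deciding per-side headlight
# intensity from a V2X-reported object position, in a simplified 2-D frame.
import math
from dataclasses import dataclass

@dataclass
class ObjectReport:          # distilled from a received V2X safety message
    x_m: float               # forward distance from the vehicle, metres
    y_m: float               # lateral offset, +y = left
    is_vulnerable: bool      # e.g. pedestrian -> avoid glare

def in_target_area(obj: ObjectReport, reach_m: float = 120.0,
                   half_angle_deg: float = 30.0) -> bool:
    dist = math.hypot(obj.x_m, obj.y_m)
    bearing = math.degrees(math.atan2(obj.y_m, obj.x_m))
    return obj.x_m > 0 and dist <= reach_m and abs(bearing) <= half_angle_deg

def headlight_command(obj: ObjectReport, occluded: bool) -> dict:
    if occluded or not in_target_area(obj):
        return {"left": 0.6, "right": 0.6}            # keep the default beam
    if obj.is_vulnerable:
        # dim the side facing the object to reduce glare
        return {"left": 0.3, "right": 0.6} if obj.y_m > 0 else {"left": 0.6, "right": 0.3}
    # otherwise boost the side facing the object to improve visibility
    return {"left": 1.0, "right": 0.6} if obj.y_m > 0 else {"left": 0.6, "right": 1.0}

if __name__ == "__main__":
    cyclist = ObjectReport(x_m=60.0, y_m=8.0, is_vulnerable=False)
    print(headlight_command(cyclist, occluded=False))  # -> boost the left side
```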
Please summarize the input | LOW LATENCY ENHANCEMENTS TO CV2X AUTONOMOUS RESOURCE SELECTION AND RE-SELECTION PROCEDURE FOR VEHICLE-TO-VEHICLE COMMUNICATIONSLow latency enhancements for communication systems, including autonomous driving and/or selection scenarios, are provided. A method for communication includes monitoring communication resources in a communication system, determining a set of candidate resources to use for subsequent transmission of information within a time window such that the time window is minimized based on a desired communication latency parameter that considers at least one or more of communication channel congestion and a priority of transmission, determining a set of lowest energy resources from the set of candidate resources, selecting a low energy resource from the set of lowest energy resources, and transmitting data on the selected low energy resource. Other aspects, embodiments, and features are also claimed and described. | The method involves monitoring communication resources (922-946) in a communication system. A set of candidate resources (910) to use for subsequent transmission of information is determined within a time window such that the time window is minimized based on a desired communication latency parameter that considers communication channel congestion and a priority of the intended transmission. A set of lowest energy resources is determined from the set of candidate resources. A low energy resource is selected from the set of lowest energy resources. Data on the selected low energy resource is transmitted. INDEPENDENT CLAIMS are also included for the following: (1) an apparatus for facilitating communication between user equipments; (2) a communication device; and (3) a non-transitory computer-readable medium comprising a set of instructions for facilitating communication between vehicles. Method for facilitating communication, i.e. vehicle-to-vehicle communication, between user equipments, e.g. vehicles such as automobiles, for a wireless communications system. Can also be used for cellular phones, smart phones, session initiation protocol (SIP) phones, laptops, personal digital assistants (PDAs), satellite radios, global positioning systems, multimedia devices, video devices, digital audio players i.e. MPEG-1 audio layer 3 (MP3) players, cameras, game consoles, tablets, smart devices and wearable devices for Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Frequency-Division Multiple Access (FDMA), OFDMA, Single-Carrier Frequency-Division Multiple Access (SC-FDMA) systems. The method enables transmitting data streams to a single user equipment so as to increase data rate, and transmitting data streams to the user equipment and other user equipments to increase system capacity. The drawing shows a schematic view of a communication frame structure. 900Data structure901Trigger time for resource selection or reselection910Candidate resources922-946Communication resources
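The selection procedure summarized above (shrink the candidate window under a latency/priority constraint, keep the lowest-energy resources, pick one) can be sketched as follows. The window halving for high priority, the 20% cut, and the random tie-break are illustrative choices, not the patented rules.

```python
# Illustrative sketch of the selection flow described above: shrink the
# candidate window with latency/priority, keep the lowest-energy resources,
# then pick one at random to reduce the chance of collisions.
import random

def select_resource(measurements: dict[int, float],
                    max_latency_slots: int,
                    high_priority: bool) -> int:
    """measurements: slot index -> measured energy (e.g. RSSI) on that resource."""
    window = max_latency_slots // 2 if high_priority else max_latency_slots
    candidates = {slot: e for slot, e in measurements.items() if slot <= window}
    keep = max(1, int(0.2 * len(candidates)))          # lowest-energy 20%
    lowest = sorted(candidates, key=candidates.get)[:keep]
    return random.choice(lowest)

if __name__ == "__main__":
    random.seed(0)
    energy_by_slot = {slot: random.uniform(-100, -60) for slot in range(1, 101)}
    print("chosen slot:", select_resource(energy_by_slot,
                                          max_latency_slots=50,
                                          high_priority=True))
```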
Please summarize the input | Methods and apparatus for parking lot exit management using V2XAspects of the present disclosure include methods, apparatuses, and computer readable media for receiving a plurality of requests, from a plurality of user equipments (UEs), to exit a parking area comprising a plurality of vehicles, wherein each of the plurality of UEs is associated with a corresponding vehicle of the plurality of vehicles, determining an exit order for the plurality of vehicles to exit the parking area, and transmitting, to the plurality of UEs, a plurality of exit commands, based on the exit order, for the plurality of vehicles to exit the parking area.What is claimed is:
| 1. A method of wireless communication by a road side unit in a network, comprising:
receiving a plurality of requests, from a plurality of user equipments (UEs), to exit a parking area comprising a plurality of vehicles, wherein each of the plurality of UEs is associated with a corresponding vehicle of the plurality of vehicles;
determining an estimated exit duration to exit the parking area from a current location for an individual vehicle of the plurality of vehicles;
determining, based on the estimated exit duration, an exit order for the plurality of vehicles to exit the parking area;
transmitting, to the plurality of UEs, a plurality of exit commands, based on the exit order, for the plurality of vehicles to exit the parking area;
collecting, via one or more sensors, sensor information corresponding to activity within the parking area, the collecting comprising:
monitoring the plurality of vehicles exiting the parking area; and
detecting at least one of an out-of-order exit, a collision, a pedestrian, or other road user;
generating, based on the sensor information, one or more updated exit commands to supersede the plurality of exit commands; and
transmitting the one or more updated exit commands to at least a subset of the plurality of UEs in response to detecting the at least one of the out-of-order exit, the collision, the pedestrian, or the other road user.
| 2. The method of claim 1, wherein receiving the plurality of requests comprises:
receiving an emergency exit request from a first responder vehicle of the plurality of vehicles; and
wherein determining the exit order comprises prioritizing the first responder vehicle in the exit order for the plurality of vehicles.
| 3. The method of claim 1, wherein determining the exit order comprises:
determining the exit order based on one or more of a reception order associated with receiving the plurality of requests, proximities of the plurality of vehicles to one or more exits of the parking area, sizes of the plurality of vehicles, maneuverabilities of the plurality of vehicles, estimated durations for the plurality of vehicles to exit the parking area, estimated fuel consumptions of the plurality of vehicles, or priorities associated with the plurality of requests.
| 4. The method of claim 1, wherein transmitting the plurality of exit commands comprises:
sequentially transmitting each of the plurality of exit commands based on a corresponding scheduled exit time of a plurality of scheduled exit times in accordance with the exit order.
| 5. The method of claim 1, wherein transmitting the plurality of exit commands comprises:
transmitting a first exit command of the plurality of exit commands to a first vehicle of the plurality of vehicles scheduled to exit the parking area before remaining vehicles of the plurality of vehicles; and
transmitting, to the remaining vehicles, remaining exit commands of the plurality of exit commands each comprising identification information associated with a vehicle scheduled to exit the parking area immediately before each of the remaining vehicles.
| 6. The method of claim 5, wherein:
the identification information includes at least one of a make, a model, a color, a license plate, a build, a location, or a vehicle type.
| 7. The method of claim 1, wherein transmitting the plurality of exit commands comprises:
transmitting a plurality of scheduled exit times.
| 8. The method of claim 1, wherein:
the plurality of exit commands comprises location information associated with one or more exits of the parking area.
| 9. A road side unit, comprising:
a memory comprising instructions;
a transceiver; and
one or more processors operatively coupled with the memory and the transceiver, the one or more processors configured to execute the instructions in the memory to:
receive a plurality of requests, from a plurality of user equipments (UEs), to exit a parking area comprising a plurality of vehicles, wherein each of the plurality of UEs is associated with a corresponding vehicle of the plurality of vehicles;
determine an estimated exit duration to exit the parking area from a current location for an individual vehicle of the plurality of vehicles;
determine, based on the estimated exit duration, an exit order for the plurality of vehicles to exit the parking area;
transmit, to the plurality of UEs, a plurality of exit commands, based on the exit order, for the plurality of vehicles to exit the parking area;
collect, via one or more sensors, sensor information corresponding to activity within the parking area, wherein to collect, the one or more processors are configured:
monitor the plurality of vehicles exiting the parking area; and
detect at least one of an out-of-order exit, a collision, a pedestrian, or other road user;
generate, based on the sensor information, one or more updated exit commands to supersede the plurality of exit commands; and
transmit the one or more updated exit commands to at least a subset of the plurality of UEs in response to detecting the at least one of the out-of-order exit, the collision, the pedestrian, or the other road user.
| 10. The road side unit of claim 9, wherein receiving the plurality of requests comprises:
receive an emergency exit request from a first responder vehicle of the plurality of vehicles; and
wherein determining the exit order comprises prioritizing the first responder vehicle in the exit order for the plurality of vehicles.
| 11. The road side unit of claim 10, wherein determining the exit order comprises:
determine the exit order based on one or more of a reception order associated with receiving the plurality of requests, proximities of the plurality of vehicles to one or more exits of the parking area, sizes of the plurality of vehicles, maneuverabilities of the plurality of vehicles, estimated durations for the plurality of vehicles to exit the parking area, estimated fuel consumptions of the plurality of vehicles, or priorities associated with the plurality of requests.
| 12. The road side unit of claim 10, wherein transmitting the plurality of exit commands comprises:
sequentially transmit each of the plurality of exit commands based on a corresponding scheduled exit time of a plurality of scheduled exit times in accordance with the exit order.
| 13. The road side unit of claim 10, wherein transmitting the plurality of exit commands comprises:
transmit a first exit command of the plurality of exit commands to a first vehicle of the plurality of vehicles scheduled to exit the parking area before remaining vehicles of the plurality of vehicles; and
transmit to the remaining vehicles, remaining exit commands of the plurality of exit commands each comprising identification information associated with a vehicle scheduled to exit the parking area immediately before each of the remaining vehicles.
| 14. The road side unit of claim 13, wherein:
the identification information includes at least one of a make, a model, a color, a license plate, a build, a location, or a vehicle type.
| 15. The road side unit of claim 10, wherein transmitting the plurality of exit commands comprises:
transmit a plurality of scheduled exit times.
| 16. The road side unit of claim 10, wherein:
the plurality of exit commands comprises location information associated with one or more exits of the parking area.
| 17. A method of wireless communication by a user equipment (UE) associated with a vehicle in a network, comprising:
transmitting, to a road side unit (RSU), an exit request; and
receiving, from the RSU based on an estimated exit duration to exit a parking area from a current location, one or more exit commands comprising identification information associated with an other vehicle scheduled to exit the parking area immediately before the vehicle, and one or both of an indication for the vehicle to begin exiting the parking area or a scheduled exit time for the vehicle.
| 18. The method of claim 17, wherein transmitting the exit request comprises:
transmitting an emergency exit request from a first responder vehicle; and
wherein receiving the one or more exit commands comprises receiving a priority exit command to exit the parking area ahead of a plurality of vehicles.
| 19. The method of claim 17, wherein:
the identification information includes at least one of a make, a model, a color, a license plate, a build, an identifying mark, or an accessory associated with the other vehicle.
| 20. The method of claim 17, further comprising:
displaying via a graphical user interface, exit information based on one or more exit commands.
| 21. The method of claim 17, further comprising:
transmitting, to an autonomous drive system, exit information based on the one or more exit commands.
| 22. A user equipment (UE) associated with a vehicle, comprising:
a memory comprising instructions;
a transceiver; and
one or more processors operatively coupled with the memory and the transceiver, the one or more processors configured to execute instructions in the memory to:
transmit, to a road side unit (RSU), an exit request; and
receive, from the RSU based on an estimated exit duration to exit a parking area from a current location, one or more exit commands comprising identification information associated with an other vehicle scheduled to exit the parking area immediately before the vehicle, and one or both of an indication for the vehicle to begin exiting the parking area or a scheduled exit time for the vehicle.
| 23. The UE of claim 22, wherein transmitting the exit request comprises:
transmitting an emergency exit request from a first responder vehicle; and
wherein receiving the one or more exit commands comprises receiving a priority exit command to exit the parking area ahead of a plurality of vehicles.
| 24. The UE of claim 22, wherein:
the identification information includes at least one of a make, a model, a color, a license plate, a build, an identifying mark, or an accessory associated with the other vehicle.
| 25. The UE of claim 22, wherein the one or more processors are further configured to:
display, via a graphical user interface, exit information based on the one or more exit commands.
| 26. The UE of claim 22, wherein the one or more processors are further configured to:
transmit, to an autonomous drive system, exit information based on the one or more exit commands. | The method involves receiving multiple requests from multiple user equipments (UEs) to exit a parking area (S605), where the parking area comprises multiple vehicles and each UE is associated with a corresponding vehicle. An exit order is determined (S610) for the multiple vehicles to exit the parking area, where determining the exit order comprises prioritizing responder vehicles in the exit order for the multiple vehicles. Multiple exit commands are transmitted (S615) to the multiple vehicles to exit the parking area based on the exit order and an estimated exit duration. An emergency exit request is received from a responder vehicle. Identification information of the vehicle is obtained, where the identification information includes a make, a model, a color, a number plate, a build, a location or a vehicle type. The multiple vehicles exiting the parking area are monitored. Processors are operatively coupled with a memory and a transceiver. INDEPENDENT CLAIMS are included for: (a) a roadside unit; (b) a method for establishing wireless communication by using a user equipment; (c) a user equipment associated with a vehicle. Method for facilitating parking lot exit management by using a vehicle-to-everything (V2X) network. The method enables establishing the vehicle-to-everything (V2X) network to manage traffic and reduce congestion. The method enables utilizing a roadside unit (RSU) to transmit the exit commands to the user equipments (UEs) to exit the parking area based on the exit order for the vehicles. The method enables performing clear channel assessment (CCA) to determine whether the channel is available or not. The drawing shows a flow diagram illustrating a method for performing parking lot exit management by using a vehicle-to-everything network. S605Step for receiving multiple requests from multiple user equipments to exit a parking area S610Step for determining exit order from multiple vehicles to exit the parking area S615Step for transmitting multiple exit commands to multiple vehicles to exit the parking area based on the exit order and estimated exit duration
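To picture the RSU-side ordering the claims describe (emergency requests first, others ranked by estimated exit duration, each command telling a vehicle whom it follows), here is a minimal sketch. The `ExitRequest` fields, the constant-speed duration model, and the command format are hypothetical.

```python
# Illustrative sketch (not the claimed system): an RSU ordering exit requests.
# First-responder requests jump the queue; everyone else is sorted by an
# estimated exit duration computed from a simple constant-speed model.
from dataclasses import dataclass
from typing import List

@dataclass
class ExitRequest:
    ue_id: str
    distance_to_exit_m: float
    emergency: bool = False

def exit_order(requests: List[ExitRequest], avg_speed_mps: float = 3.0) -> List[dict]:
    def estimated_duration_s(req: ExitRequest) -> float:
        return req.distance_to_exit_m / avg_speed_mps
    ordered = sorted(requests,
                     key=lambda r: (not r.emergency, estimated_duration_s(r)))
    commands = []
    for i, req in enumerate(ordered):
        commands.append({
            "ue_id": req.ue_id,
            "position_in_queue": i + 1,
            "follows": ordered[i - 1].ue_id if i > 0 else None,
            "estimated_exit_s": round(estimated_duration_s(req), 1),
        })
    return commands

if __name__ == "__main__":
    reqs = [ExitRequest("car-1", 80.0),
            ExitRequest("ambulance", 150.0, emergency=True),
            ExitRequest("car-2", 40.0)]
    for cmd in exit_order(reqs):
        print(cmd)
```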
Please summarize the input | ENHANCING NAVIGATION EXPERIENCE USING V2X SUPPLEMENTAL INFORMATIONEmbodiments of the disclosure are directed to the use of supplemental information received from Vehicle-to-Everything (V2X) capable entities in order to enhance navigation and route selection based on available advanced driver assistance systems (ADAS) functionality. A number of potential routes are evaluated by retrieving the V2X capabilities and locations from V2X capable entities along those routes. That information is used to assess traffic density and availability of supplemental information used by ADAS along each route, allowing for an evaluation of each route on travel time and ADAS support. The driver can then select the best route that supports their needs.|1. A method comprising:
* obtaining (222) a destination address and a source address;
* determining (224) a plurality of routes from the source address to the destination address;
* for each route in the plurality of routes:
* determining (226) an availability of Vehicle to Everything, V2X,-capable entities, capable of providing V2X information, along one or more portions of each respective route;
* calculating a travel time estimate for each route in the plurality of routes based on the availability of V2X-capable entities along the one or more portions of each respective route;
* generating (228) a navigation map for display in an in-vehicle display, the navigation map comprising each route of the plurality of routes and an indication of the availability of V2X-capable entities along the one or more portions of each respective route; and
* causing (230) the navigation map to be displayed in the in-vehicle display;
* wherein the method further comprises:
* receiving a first user selection of a navigation route from the plurality of routes;
* determining, based on a user event having a duration, that the navigation route will not provide a sufficient level/density of V2X-capable entities for the duration of the user event;
* determining an alternative route, wherein the alternative route has a sufficient level/density of V2X-capable entities allowing for autonomous driving over the duration of the user event;
* causing the navigation map in the in-vehicle display to show an indication of the alternative route;
* receiving a second user selection of the alternative route;
* calculating a travel time estimate for the alternative route based on a greater availability of V2X-capable entities along one or more portions of the alternative route;
* determining an estimated change in travel time associated with the second user selection of the alternative route, the estimated change in travel time based at least on the calculated travel time estimate for the alternative route; and
* causing the in-vehicle display to show the estimated change in travel time.
| 2. The method of claim 1, further comprising:
* selecting a navigation route from the plurality of routes based on the availability of V2X-capable entities along the one or more portions of the navigation route; or
* receiving a third user selection of a navigation route from the plurality of routes; and
* updating the navigation map in the in-vehicle display to show only the user selected navigation route.
| 3. The method of claim 1, wherein calculating the travel time estimate for each route in the plurality of routes further comprises:
* determining V2X-capabilities of the V2X-capable entities along each respective route; and preferably
* the method further comprises:
* updating the travel time estimate for each route in the plurality of routes based on the V2X-capabilities of V2X-capable entities along each respective route; and further preferably
* the method further comprises:
* ordering each route in the plurality of routes in an ordered list based on the travel time estimate for each respective route; and
* causing the ordered list of the plurality of routes to be displayed in the in-vehicle display.
| 4. The method of claim 1, further comprising:
* for each route in the plurality of routes:
receiving a V2X-capability and a location for each of the plurality of V2X-capable entities along the one or more portions of each respective route; and preferably
* the method further comprises:
* for each route in the plurality of routes:
determining, from the V2X-capability and the location for each of the plurality of V2X-capable entities along the one or more portions of each respective route, an availability of assisted driving features along the one or more portions of each respective route; and further preferably
* the method further comprises:
updating the navigation map in the in-vehicle display to show availability of assisted driving features along the one or more portions of each route in the plurality of routes.
| 5. The method of claim 1, wherein the plurality of V2X-capable entities includes Vehicle-to-Vehicle, V2V, capable vehicles or Vehicle-to-Infrastructure, V2I, capable infrastructure.
| 6. A system comprising:
* a vehicle (100) having an in-vehicle display (756) and an on-board navigation computer (716), the on-board navigation computer capable of receiving communication over Vehicle-to-Everything, V2X, communication; and
* a navigation application executable by the on-board navigation computer to cause the on-board navigation computer to:
* obtain a destination address and a source address;
* determine a plurality of routes from the source address to the destination address;
* for each route in the plurality of routes:
* determine an availability of Vehicle to Everything, V2X,-capable entities, capable of providing V2X information, along one or more portions of each respective route;
* calculate a travel time estimate for each route in the plurality of routes based on the availability of V2X-capable entities along the one or more portions of each respective route;
* generate a navigation map for display in the in-vehicle display, the navigation map comprising each route of the plurality of routes and an indication of the availability of V2X-capable entities along the one or more portions of each respective route; and
* cause the navigation map to be displayed in the in-vehicle display;
* further causing the on-board navigation computer to:
* receive a first user selection of a navigation route from the plurality of routes;
* determine, based on a user event having a duration, that the navigation route will not provide a sufficient level/density of V2X-capable entities for the duration of the user event;
* determine an alternative route, wherein the alternative route has a sufficient level/density of V2X-capable entities allowing for autonomous driving over the duration of the user event;
* cause the navigation map in the in-vehicle display to show an indication of the alternative route;
* receive a second user selection of the alternative route;
* calculate a travel time estimate for the alternative route based on a greater availability of V2X-capable entities along one or more portions of the alternative route;
* determine an estimated change in travel time associated with the second user selection of the alternative route, the estimated change in travel time based at least on the calculated travel time estimate for the alternative route; and
* cause the in-vehicle display to show the estimated change in travel time.
| 7. The system of claim 6, wherein the navigation application is executable by the on-board navigation computer to further cause the on-board navigation computer to:
* select a navigation route from the plurality of routes based on the availability of V2X-capable entities along the one or more portions of the navigation route; or
* receive a third user selection of a navigation route from the plurality of routes; and
* update the navigation map in the in-vehicle display to show only the user selected navigation route.
| 8. The system of claim 6, wherein the navigation application is executable by the on-board navigation computer to further cause the on-board navigation computer to:
* determine V2X-capabilities of the V2X-capable entities along each respective route; and preferably
* the on-board navigation computer is further caused to:
* update the travel time estimate for each route in the plurality of routes based on V2X-capabilities of the V2X-capable entities along each respective route; and further preferably
* the on-board navigation computer is further caused to:
* order each route in the plurality of routes in an ordered list based on the travel time estimate for each respective route; and
* cause the ordered list of the plurality of routes to be displayed in the in-vehicle display.
| 9. The system of claim 6, wherein the navigation application is executable by the on-board navigation computer to further cause the on-board navigation computer to:
for each route in the plurality of routes:
receive a V2X-capability and a location for each of the plurality of V2X-capable entities along the one or more portions of each respective route; and preferably the on-board navigation computer is further caused to:
for each route in the plurality of routes:
determine, from the V2X-capability and the location for each of the plurality of V2X-capable entities along the one or more portions of each respective route, an availability of assisted driving features along the one or more portions of each respective route; and further preferably the on-board navigation computer is further caused to:
update the navigation map in the in-vehicle display to show availability of assisted driving features along the one or more portions of each route in the plurality of routes.
| 10. The system of claim 6, wherein the plurality of V2X-capable entities includes Vehicle-to-Vehicle, V2V, capable vehicles or Vehicle-to-Infrastructure, V2I, capable infrastructure.
| 11. A non-transitory computer readable memory containing instructions executable by a processor to cause the processor to:
* obtain a destination address and a source address;
* determine a plurality of routes from the source address to the destination address;
* for each route in the plurality of routes:
* determine an availability of Vehicle to Everything, V2X,-capable entities, capable of providing V2X information, along one or more portions of each respective route;
* calculating a travel time estimate for each route in the plurality of routes based on the availability of V2X-capable entities along the one or more portions of each respective route;
* generate a navigation map for display in the in-vehicle display, the navigation map comprising each route in the plurality of routes and an indication of the availability of V2X-capable entities along the one or more portions of each respective route; and
* cause the navigation map to be displayed in the in-vehicle display;
* and further causing the processor to:
* receive a first user selection of a navigation route from the plurality of routes;
* determine, based on a user event having a duration, that the navigation route will not provide a sufficient level/density of V2X-capable entities for the duration of the user event;
* determine an alternative route, wherein the alternative route has available a sufficient level/density of V2X-capable entities allowing for autonomous driving over the duration of the user event;
* cause the navigation map in the in-vehicle display to show an indication of the alternative route;
* receive a second user selection of the alternative route;
* calculating a travel time estimate for the alternative route based on a greater availability of V2X-capable entities along one or more portions of the alternative route;
* determine an estimated change in travel time associated with the second user selection of the alternative route, the estimated change in travel time based at least on the calculated travel time estimate for the alternative route; and
* cause the in-vehicle display to show the estimated change in travel time. | The method involves obtaining a destination address and a source address. Multiple routes are determined from the source address to the destination address. An availability of vehicle (100, 104)-to-everything (V2X)-capable entities, capable of providing V2X information, is determined along one or more portions of each respective route. A navigation map is generated for display in an in-vehicle display, where the navigation map includes each route of the multiple routes and an indication of the availability of V2X-capable entities along the one or more portions of each respective route. The navigation map is caused to be displayed in the in-vehicle display. INDEPENDENT CLAIMS are included for the following: a system for autonomous driving and advanced driver assistance systems; an apparatus for autonomous driving and advanced driver assistance systems; and a non-transitory computer-readable memory for autonomous driving and advanced driver assistance systems. Method for autonomous driving and advanced driver assistance systems (ADAS). The supplemental information can also be used to generate suggestions to the driver and enable the driver to make better decisions. The drawing shows a schematic view of V2X-capable entities. 100,104Vehicle102Infrastructure106Power grid110Pedestrian
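The navigation record above evaluates candidate routes on travel time and on how much V2X support is available along them. A hedged Python sketch of one way such a ranking could look is shown below; the route fields, coverage fractions, and the speedup factor for V2X-supported portions are assumptions for illustration and are not taken from the claims.

```python
# Illustrative sketch (not the claimed algorithm) of ranking candidate
# routes by a travel-time estimate that is discounted where V2X-capable
# entities are available, as the record describes at a high level.
from typing import List, Dict

def estimate_travel_time(base_time_min: float,
                         v2x_coverage_fraction: float,
                         assisted_speedup: float = 0.9) -> float:
    """Portions of the route with V2X support are assumed to flow slightly
    faster (e.g., smoother merges, fewer slowdowns)."""
    covered = base_time_min * v2x_coverage_fraction * assisted_speedup
    uncovered = base_time_min * (1.0 - v2x_coverage_fraction)
    return covered + uncovered

def rank_routes(routes: List[Dict]) -> List[Dict]:
    """Return routes ordered by adjusted travel time, ready for display."""
    for r in routes:
        r["adjusted_time_min"] = estimate_travel_time(
            r["base_time_min"], r["v2x_coverage_fraction"])
    return sorted(routes, key=lambda r: r["adjusted_time_min"])

if __name__ == "__main__":
    candidates = [
        {"name": "highway", "base_time_min": 42.0, "v2x_coverage_fraction": 0.8},
        {"name": "surface streets", "base_time_min": 39.0, "v2x_coverage_fraction": 0.2},
    ]
    for r in rank_routes(candidates):
        print(f'{r["name"]}: {r["adjusted_time_min"]:.1f} min')
```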
Please summarize the input | WIRELESS COMMUNICATION APPARATUS AND METHOD IN WIRELESS DEVICESThe present invention relates to a method and apparatus for wireless communication at wireless devices, in particular a method and apparatus for collaborative early detection and threat mitigation in C-V2X. In one aspect, the apparatus detects a threat entity in a threat area based on data signals received from the threat entity, wherein the threat entity interferes with wireless resources or spectrum used in autonomous driving and cooperative decisions. The transmitter transmits, to at least one second wireless device, a message indicating the threat entity in the threat zone. | The apparatus has a memory (360), a transceiver and a processor (359) communicatively connected to the memory and the transceiver. The processor is configured to detect a threat entity within a threat zone based on data signals received from the threat entity, where the threat entity obstructs wireless spectrum or resources utilized in cooperative or automated driving decisions, and to transmit, to a second wireless device, a message indicating the threat entity within the threat zone. The data signals received from the threat entity comprise data that is inconsistent with projected data for wireless devices. The data signals comprise data of a misbehaving wireless device. The data of the misbehaving wireless device comprises implausible data related to a characteristic of the misbehaving wireless device. INDEPENDENT CLAIMS are included for the following: a method for wireless communication of a first wireless device; an apparatus for wireless communication at a second wireless device; and a method for wireless communication of a second wireless device. Method for cooperative early threat detection and avoidance in cellular vehicle-to-everything (C-V2X). The method enables facilitating cooperative early threat detection and avoidance in cellular vehicle-to-everything (C-V2X) and/or D2D technology in an effective manner. The drawing shows a schematic view of a first device and a second device. 310Wireless communication device359Processor360Memory370Receive processor374Channel estimator
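The threat-detection record above hinges on spotting data that is inconsistent with what was projected for a neighboring device. A toy Python plausibility check in that spirit follows; the message fields, the speed bound, and the flagging action are assumptions for illustration only.

```python
# A toy plausibility check in the spirit of the summarized apparatus: a
# neighbor's reported state is compared against what was projected from
# its previous report, and implausible jumps are flagged as a potential
# misbehaving/threat entity. Thresholds and message fields are assumptions.
import math

MAX_PLAUSIBLE_SPEED_MPS = 70.0     # ~250 km/h, a generous upper bound

def is_implausible(prev_report: dict, new_report: dict) -> bool:
    """True if the implied speed between two reports exceeds a plausible bound."""
    dt = new_report["timestamp_s"] - prev_report["timestamp_s"]
    if dt <= 0:
        return True  # non-monotonic timestamps are themselves suspicious
    dx = new_report["x_m"] - prev_report["x_m"]
    dy = new_report["y_m"] - prev_report["y_m"]
    implied_speed = math.hypot(dx, dy) / dt
    return implied_speed > MAX_PLAUSIBLE_SPEED_MPS

if __name__ == "__main__":
    prev = {"timestamp_s": 10.0, "x_m": 0.0, "y_m": 0.0}
    new = {"timestamp_s": 10.5, "x_m": 200.0, "y_m": 0.0}  # implies 400 m/s
    if is_implausible(prev, new):
        print("flag sender as potential threat entity and notify neighbors")
```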
Please summarize the input | Methods and systems for managing interactions between vehicles with varying levels of autonomyMethods, devices and systems enable controlling an autonomous vehicle by identifying vehicles that are within a threshold distance of the autonomous vehicle, determining an autonomous capability metric of each of the identified vehicles, and adjusting a driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each of the identified vehicles. Adjusting a driving parameter may include adjusting one or more of a minimum separation distance, a minimum following distance, a speed parameter, or an acceleration rate parameter.What is claimed is:
| 1. A method of controlling an autonomous vehicle, comprising:
determining dynamically, via a processor of the autonomous vehicle, a threshold distance appropriate for current conditions;
identifying, via the processor of the autonomous vehicle, vehicles that are within the dynamically determined threshold distance of the autonomous vehicle;
determining an autonomous capability metric of each of the identified vehicles; and
adjusting a driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each of the identified vehicles.
| 2. The method of claim 1, wherein determining the autonomous capability metric of each of the identified vehicles comprises determining a level of autonomy of each identified vehicle.
| 3. The method of claim 1, wherein adjusting the driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each identified vehicle comprises:
adjusting a minimum separation distance to be maintained between the autonomous vehicle and at least one vehicle of the identified vehicles.
| 4. The method of claim 3, wherein adjusting the minimum separation distance to be maintained between the autonomous vehicle and the at least one vehicle of the identified vehicles comprises adjusting the minimum separation distance based on the autonomous capability metric of the at least one vehicle and a behavior model of the at least one vehicle.
| 5. The method of claim 1, wherein adjusting the driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each identified vehicle comprises:
adjusting a minimum following distance to be maintained between the autonomous vehicle and at least one vehicle of the identified vehicles.
| 6. The method of claim 5, wherein adjusting the minimum following distance to be maintained between the autonomous vehicle and the at least one vehicle of the identified vehicles comprises adjusting the minimum following distance based on the autonomous capability metric of the at least one vehicle and a behavior model of the at least one vehicle.
| 7. The method of claim 1, wherein adjusting the driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each of the identified vehicles comprises one or more of:
adjusting a speed of the autonomous vehicle; or
adjusting an acceleration rate at which the autonomous vehicle will change speed.
| 8. The method of claim 7, wherein adjusting the speed of the autonomous vehicle or the acceleration rate at which the autonomous vehicle will change speed comprises adjusting the speed or the acceleration rate based on the autonomous capability metric of at least one vehicle of the identified vehicles and a behavior model of the at least one vehicle.
| 9. The method of claim 1, wherein determining the autonomous capability metric of each of the identified vehicles comprises receiving the autonomous capability metric from at least one vehicle of the identified vehicles.
| 10. The method of claim 1, wherein determining the autonomous capability metric of each of the identified vehicles comprises determining values that collectively identify or predict a level of autonomy or a performance capability of a nearby vehicle.
| 11. The method of claim 10, wherein determining the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle comprises determining the values by one or more of:
observing driving behavior of the nearby vehicle;
determining computing or sensor capability of the nearby vehicle; or
receiving information regarding the nearby vehicle's rating or certifications via C-V2X communications.
| 12. The method of claim 11, further comprising determining at least one of the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on the observed driving behavior, the determined at least one value representing one or more of:
a consistency, regularity or uniformity of vehicle operations;
a level of predictability for future vehicle operations;
a level of driver aggression;
a degree to which the nearby vehicle tracks a center of a driving lane;
number of driving errors per unit time;
compliance with local road rules;
compliance with safety rules;
reaction time of the autonomous vehicle; or
responsiveness of the autonomous vehicle to observable events.
| 13. The method of claim 10, further comprising determining at least one of the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on the determined sensor capability, the determined at least one value representing one of:
a sensor type;
a sensor make or model;
a sensor manufacturer;
number of autonomous driving sensors operating in the nearby vehicle;
sensor accuracy; or
precision of one or more sensors.
| 14. The method of claim 10, further comprising determining one or more of the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on information received via C-V2X communications, the one or more values representing one or more of:
a key performance indicator (KPI);
a surface performance rating;
a weather performance rating;
a vehicle capability;
a vehicle feature;
a supported algorithm; or
a prediction and control strategy.
| 15. A processor for a vehicle, wherein the processor is configured with processor executable instructions to:
determine dynamically a threshold distance appropriate for current conditions;
identify vehicles that are within the dynamically determined threshold distance of the vehicle;
determine an autonomous capability metric of each of the identified vehicles; and
adjust a driving parameter based on the determined autonomous capability metric of each of the identified vehicles.
| 16. The processor of claim 15, wherein the processor is further configured with processor executable instructions to determine the autonomous capability metric of each of the identified vehicles by determining a level of autonomy of each identified vehicle.
| 17. The processor of claim 15, wherein the processor is further configured with processor executable instructions to adjust the driving parameter of the vehicle based on the determined autonomous capability metric of each identified vehicle by adjusting at least one of:
a minimum separation distance to be maintained between the vehicle and at least one vehicle of the identified vehicles;
a minimum following distance to be maintained between the vehicle and the at least one vehicle of the identified vehicles;
a speed of the vehicle; or
an acceleration rate at which the vehicle will change speed.
| 18. The processor of claim 17, wherein the processor is further configured with processor executable instructions to:
adjust the minimum separation distance based on the autonomous capability metric of the at least one vehicle and a behavior model of the at least one vehicle;
adjust the minimum following distance based on the autonomous capability metric of the at least one vehicle and the behavior model of the at least one vehicle;
adjust the speed based on the autonomous capability metric of the at least one vehicle of the identified vehicles and the behavior model of the at least one vehicle; or
adjust the acceleration rate based on the autonomous capability metric of the at least one vehicle of the identified vehicles and the behavior model of the at least one vehicle.
| 19. The processor of claim 15, wherein the processor is further configured with processor executable instructions to determine the autonomous capability metric of each of the identified vehicles by receiving the autonomous capability metric from at least one vehicle of the identified vehicles.
| 20. The processor of claim 15, wherein the processor is further configured with processor executable instructions to determine the autonomous capability metric of each of the identified vehicles by determining values that collectively identify or predict a level of autonomy or a performance capability of a nearby vehicle.
| 21. The processor of claim 20, wherein the processor is further configured with processor executable instructions to determine the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle by determining the values by one or more of:
observing driving behavior of the nearby vehicle;
determining computing or sensor capability of the nearby vehicle; or
receiving information regarding the nearby vehicle's rating or certifications via C-V2X communications.
| 22. The processor of claim 21, wherein the processor is further configured with processor executable instructions to determine the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on the observed driving behavior by determining a value representing one or more of:
a consistency, regularity or uniformity of vehicle operations;
a level of predictability for future vehicle operations;
a level of driver aggression;
a degree to which the nearby vehicle tracks a center of a driving lane;
number of driving errors per unit time;
compliance with local road rules;
compliance with safety rules;
reaction time of the vehicle; or
responsiveness of the vehicle to observable events.
| 23. The processor of claim 21, wherein the processor is further configured with processor executable instructions to determine the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on the determined sensor capability by determining a value representing one or more of:
a sensor type;
a sensor make or model;
a sensor manufacturer;
number of autonomous driving sensors operating in the nearby vehicle;
sensor accuracy; or
precision of one or more sensors.
| 24. The processor of claim 21, wherein the processor is further configured with processor executable instructions to determine the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on information received via C-V2X communications by determining a value representing one or more of:
a key performance indicator (KPI);
a surface performance rating;
a weather performance rating;
a vehicle capability;
a vehicle feature;
a supported algorithm; or
a prediction and control strategy.
| 25. A non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of an autonomous vehicle to perform operations comprising:
determining dynamically a threshold distance appropriate for current conditions;
identifying vehicles that are within the dynamically determined threshold distance of the autonomous vehicle;
determining an autonomous capability metric of each of the identified vehicles; and
adjusting a driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each of the identified vehicles.
| 26. A vehicle, comprising:
means for determining dynamically a threshold distance appropriate for current conditions;
means for identifying vehicles that are within the dynamically determined threshold distance of the vehicle;
means for determining an autonomous capability metric of each of the identified vehicles; and
means for adjusting a driving parameter of the vehicle based on the determined autonomous capability metric of each of the identified vehicles.
| 27. The vehicle of claim 26, wherein means for determining an autonomous capability metric of each of the identified vehicles comprises means for determining values that collectively identify or predict a level of autonomy or performance capability of a nearby vehicle based on one or more of:
observing driving behavior of the nearby vehicle;
determining computing or sensor capability of the nearby vehicle; or
receiving information regarding the nearby vehicle's rating or certifications via C-V2X communications.
| 28. The vehicle of claim 26, wherein means for determining an autonomous capability metric of each of the identified vehicles comprises means for determining one or more values that collectively identify or predict a level of autonomy or a performance capability of a nearby vehicle based on information received via C-V2X communications, the one or more values representing one or more of:
a key performance indicator (KPI);
a surface performance rating;
a weather performance rating;
a vehicle capability;
a vehicle feature;
a supported algorithm; or
a prediction and control strategy. | The method involves identifying (902) the vehicles that are within a threshold distance of an autonomous vehicle via a processor of the autonomous vehicle. An autonomous capability metric of each of the identified vehicles is determined (1104), where the determining comprises determining a level of autonomy of each identified vehicle. A driving parameter of the autonomous vehicle is adjusted based on the determined autonomous capability metric of the identified vehicles. INDEPENDENT CLAIMS are included for the following: a processor; a non-transitory processor-readable storage medium storing a program for controlling an autonomous vehicle; and a vehicle. Method for controlling an autonomous vehicle (claimed). The sensors enable the autonomous vehicle to operate safely with improved performance. The drawing shows a flow diagram illustrating a method for adjusting the behavior and operations of an autonomous vehicle based on the determined capabilities of other surrounding vehicles. 902Step for identifying the vehicles that are within a threshold distance of an autonomous vehicle1012Step for controlling behavior of operation of vehicle1104Step for determining autonomous capability metric of each of the identified vehicles1108Step for adjusting driving parameter of the autonomous vehicle to be more trusting1114Step for adjusting driving parameter of the autonomous vehicle to be less trusting
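The record above adjusts driving parameters such as the minimum following distance based on a neighbor's autonomous capability metric. The Python sketch below shows one hedged way such an adjustment could be expressed; the 0..1 metric scale, the base headway, and the scaling rule are illustrative assumptions rather than the patented method.

```python
# Hedged sketch of a claim-style "driving parameter adjustment": the
# minimum following distance grows as a neighbor's autonomous capability
# metric shrinks. The metric scale (0..1), the base headway, and the
# scaling rule are illustrative assumptions only.
def min_following_distance_m(ego_speed_mps: float,
                             neighbor_capability: float,
                             base_headway_s: float = 1.5,
                             max_extra_headway_s: float = 1.0) -> float:
    """Time-headway based gap; less capable (or less trusted) neighbors
    get up to max_extra_headway_s of additional headway."""
    capability = min(max(neighbor_capability, 0.0), 1.0)
    headway_s = base_headway_s + (1.0 - capability) * max_extra_headway_s
    return ego_speed_mps * headway_s

if __name__ == "__main__":
    # At 25 m/s (~90 km/h): fully capable neighbor vs. low-capability neighbor.
    print(min_following_distance_m(25.0, neighbor_capability=1.0))  # 37.5 m
    print(min_following_distance_m(25.0, neighbor_capability=0.2))  # 57.5 m
```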
Please summarize the input | TRANSMISSION CONTROL IN APPLICATION LAYER BASED ON RADIO BEARER QUALITY METRICS IN VEHICULAR COMMUNICATIONMethods, apparatuses, and computer-readable mediums for wireless communication are disclosed by the present disclosure. In an aspect, an application layer in a host user equipment (UE) receives, from an access layer in the host UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs. The application layer performs a transmission control over the vehicular communication based on the QoS indication.|1. A method of wireless communication, comprising:
receiving, by an application layer in a host user equipment (UE), from an access layer in the host UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and
performing, at the application layer, a transmission control over the vehicular communication based on the QoS indication.
| 2. The method of claim 1, wherein the metric is indicative of a message reception performance as affected by a presence or an absence of message interference or collision in the one or more radio bearers.
| 3. The method of claim 1, wherein the performing comprises one or more of adjusting a transmission rate of a unicast communication of the host UE, a transmission range of a groupcast communication of the host UE, a maneuver of the host UE, or an autonomous driving status of the host UE, according to the QoS indication.
| 4. The method of claim 1, further comprising sharing sensor data of the host UE with a remote UE via a unicast communication at a first transmission rate.
| 5. The method of claim 4, wherein the performing comprises:
determining, by the application layer, based on the QoS indication, a second transmission rate supportable by the unicast communication; and
adjusting the unicast communication according to the second transmission rate.
| 6. The method of claim 5, wherein the adjusting comprises performing inter-transmission time (ITT) control at the host UE.
| 7. The method of claim 5,
wherein the sharing comprises sharing video sensor data of the host UE with the remote UE over the unicast communication; and
wherein the adjusting comprises adjusting a video resolution of a video codec of the host UE according to the second transmission rate supportable by the unicast communication.
| 8. The method of claim 4, wherein the receiving comprises receiving at least one of a packet error rate (PER) or a negative acknowledgement (NACK) statistic related to the unicast communication with the remote UE.
| 9. The method of claim 1, further comprising:
communicating, by the host UE, with a plurality of other UEs via a groupcast communication; and
wherein the receiving comprises receiving at least one of a packet error rate (PER) or a negative acknowledgement (NACK) statistic related to the groupcast communication with the plurality of other UEs.
| 10. The method of claim 9, wherein the performing comprises:
determining, based on the at least one of the PER or the NACK statistic, that a reachable range of the host UE fails to comply with a minimum range requirement of a vehicular application configured for controlling a maneuver of the host UE.
| 11. The method of claim 10, wherein the performing further comprises cancelling or postponing the maneuver of the host UE.
| 12. The method of claim 10, wherein the performing further comprises regenerating a driving strategy of the host UE to match the reachable range.
| 13. The method of claim 10, wherein the performing further comprises:
modifying a range of the host UE according to the reachable range; and
adjusting the maneuver of the host UE based on the range.
| 14. The method of claim 13, wherein the modifying comprises adjusting a radiated power of the host UE.
| 15. The method of claim 13, wherein adjusting the maneuver comprises slowing down the host UE, following a stop and go operation at the host UE, or exiting an autonomous driving mode at the host UE.
| 16. The method of claim 10, wherein the maneuver comprises a coordinated intersection crossing.
| 17. The method of claim 1, wherein the QoS indication comprises one or more of a Packet Error Rate (PER), a Packet Received Rate (PRR), an average number of retransmissions, an average PER, an average PRR, an acknowledgement (ACK)/negative acknowledgement (NACK) statistic, a range statistic of a groupcast group, or a supported bit rate for a radio bearer.
| 18. The method of claim 1, wherein the vehicular communication comprises a new radio (NR) vehicle-to-everything (V2X) communication.
| 19. A non-transitory computer-readable medium storing instructions that when executed by a processor cause the processor to:
receive, by an application layer in a host user equipment (UE), from an access layer in the host UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and
perform, at the application layer, a transmission control over the vehicular communication based on the QoS indication.
| 20. A host user equipment (UE) for wireless communication, comprising:
a memory storing instructions; and
a processor in communication with the memory, wherein the processor is configured to execute the instructions to:
receive, by an application layer in the host UE, from an access layer in the host UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and
perform, at the application layer, a transmission control over the vehicular communication based on the QoS indication.
| 21. A host user equipment (UE) for wireless communication, comprising:
means for receiving, by an application layer in the host UE, from an access layer in the host UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and
means for performing, at the application layer, a transmission control over the vehicular communication based on the QoS indication. | The method involves receiving, by an application layer (142) in a host user equipment (UE), from an access layer (146) in the host UE, a quality of service (QoS) indication including a metric that represents a quality of multiple radio bearers used for a vehicular communication with the other UEs. The transmission control over the vehicular communication is performed at the application layer based on the QoS indication. The inter-transmission time control is performed at the host UE. The range of the host UE is modified according to the reachable range. INDEPENDENT CLAIMS are included for the following: a non-transitory computer-readable medium storing instructions that when executed by a processor cause the processor to perform an operation for transmission control in the application layer based on radio bearer quality metrics in vehicular communication, such as new radio vehicle-to-everything communication with a vehicular communication system; and a host user equipment for wireless communication. Method for performing transmission control in the application layer based on radio bearer quality metrics in vehicular communication, such as new radio vehicle-to-everything communication with a vehicular communication system. Can also be used to provide various telecommunication services, such as telephony, video, data, messaging, and broadcasts. The quality of service indications may be used by the application layer to adapt the range for groupcast, thus allowing the application layer to adjust autonomous driving behavior. The drawing shows a schematic view of a wireless communications system and an access network. 100Wireless communications system110Coverage area132,134Backhaul links142Application layer146Access layer
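The record above describes the application layer adapting its transmission rate, for example via inter-transmission time (ITT) control, based on QoS metrics such as PER reported by the access layer. Below is a rough Python sketch of that control loop; the thresholds, step sizes, and bounds are assumptions for illustration and are not taken from the patent.

```python
# Rough sketch of application-layer inter-transmission time (ITT) control
# driven by a QoS indication such as packet error rate (PER), in the spirit
# of the claims above. Thresholds, step sizes, and bounds are assumed.
def adjust_itt_ms(current_itt_ms: float, per: float,
                  per_low: float = 0.05, per_high: float = 0.20,
                  min_itt_ms: float = 100.0, max_itt_ms: float = 1000.0) -> float:
    """Back off (send less often) when PER is high; speed up when PER is low."""
    if per > per_high:
        new_itt = current_itt_ms * 2.0      # radio bearer looks congested
    elif per < per_low:
        new_itt = current_itt_ms * 0.8      # bearer is healthy, share more often
    else:
        new_itt = current_itt_ms            # hold steady in the hysteresis band
    return min(max(new_itt, min_itt_ms), max_itt_ms)

if __name__ == "__main__":
    itt = 200.0
    for per_sample in [0.02, 0.02, 0.30, 0.10]:
        itt = adjust_itt_ms(itt, per_sample)
        print(f"PER={per_sample:.2f} -> ITT={itt:.0f} ms")
```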
Please summarize the input | APPLICATION LAYER MESSAGES FOR LANE DESCRIPTION IN VEHICULAR COMMUNICATIONMethods, apparatuses, and computer-readable mediums for wireless communication are disclosed by the present disclosure. In an aspect, an application layer of a protocol layer stack of vehicular user equipment (UE) receives a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road. The vehicular UE may then implement autonomous driving functionality based on the application layer data element. In another aspect, an application layer of a protocol layer stack of a device generates a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road. The device may then transmit the vehicular communication message to a vehicular UE configured to implement autonomous driving functionality.|1. A method of wireless communication at a vehicular user equipment (UE), comprising:
receiving, by an application layer of a protocol layer stack of the vehicular UE, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and
implementing autonomous driving functionality based on the application layer data element.
| 2. The method of claim 1, wherein the receiving comprises receiving the vehicular communication message from another vehicular UE, a network, an infrastructure, a road side unit (RSU), or a relay.
| 3. The method of claim 1, wherein the vehicular communication message further includes one or more application layer data elements that indicate a lane width of the lane.
| 4. The method of claim 1, wherein the vehicular communication message further includes one or more application layer data elements that indicate a longitude value, a latitude value, and an elevation value for each point in a list of spaced points positioning a center line of the lane.
| 5. The method of claim 4, wherein a spacing between two consecutive points in the list of spaced points is a function of the curvature of the road.
| 6. The method of claim 4, wherein at least one of the one or more application layer data elements indicates a differential value of a position, curvature, or slope of a point in the list of spaced points as compared to a neighboring point in the list of spaced points.
| 7. The method of claim 4, wherein at least one of the one or more application layer data elements indicates a differential value of a position, curvature, or slope of a point in the list of spaced points as compared to a corresponding previous value of the position, curvature, or slope of the point in the list of spaced points.
| 8. The method of claim 4, wherein the one or more application layer data elements indicate a plurality of curvatures or slopes, each associated with at least one point in the list of spaced points.
| 9. The method of claim 1, wherein implementing the autonomous driving functionality comprises controlling a motion of the vehicular UE on the road.
| 10. The method of claim 1, wherein implementing the autonomous driving functionality comprises implementing according to an advanced driver-assistance system (ADAS).
| 11. The method of claim 1, wherein implementing the autonomous driving functionality comprises controlling a speed or an acceleration of the vehicular UE.
| 12. The method of claim 1, wherein implementing the autonomous driving functionality comprises adjusting a detection range of a sensor used in an advanced driver-assistance system (ADAS).
| 13. The method of claim 12, wherein the sensor comprises a camera, a radar, or a light detection and ranging (LIDAR) sensor.
| 14. The method of claim 12, wherein adjusting the detection range of the sensor comprises adjusting a position or an angle of the sensor based on the curvature of the lane in the road.
| 15. The method of claim 12, wherein adjusting the detection range of the sensor comprises adjusting a yaw angle of the sensor toward the curvature of the lane in the road.
| 16. The method of claim 12, wherein the slope comprises a longitudinal slope, wherein adjusting the detection range of the sensor comprises adjusting a pitch angle of the sensor toward the longitudinal slope of the lane in the road.
| 17. The method of claim 1, wherein implementing the autonomous driving functionality comprises determining a speed or acceleration limitation based on the curvature or the slope of the lane in the road.
| 18. The method of claim 17, wherein implementing the autonomous driving functionality further comprises managing a safe turning of the vehicular UE by decelerating to an allowed maximum speed.
| 19. The method of claim 17, wherein implementing the autonomous driving functionality comprises determining the speed or acceleration limitation based on a sharpness level of the curvature of the lane in the road.
| 20. The method of claim 1, wherein the slope comprises a longitudinal slope, wherein implementing the autonomous driving functionality comprises determining an efficient acceleration value based on the longitudinal slope to manage an uphill motion of the vehicular UE.
| 21. The method of claim 1, wherein the slope comprises a longitudinal slope or a transverse slope or both.
| 22. The method of claim 1, wherein the vehicular communication message comprises a vehicle-to-everything (V2X) message.
| 23. A non-transitory computer-readable medium storing instructions that when executed by a processor, cause the processor to:
receive, by an application layer of a protocol layer stack of a vehicular user equipment (UE), a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and
implement autonomous driving functionality based on the application layer data element.
| 24. The non-transitory computer-readable medium of claim 23, wherein the processor is further configured to perform any of methods 2-22.
| 25. A vehicular user equipment (UE) for wireless communication, comprising:
a memory storing instructions; and
a processor in communication with the memory, wherein the processor is configured to execute the instructions to:
receive, by an application layer of a protocol layer stack of the vehicular UE, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and
implement autonomous driving functionality based on the application layer data element.
| 26. The vehicular UE of claim 25, wherein the processor is further configured to perform any of methods 2-22.
| 27. A vehicular user equipment (UE) for wireless communication, comprising:
means for receiving, by an application layer of a protocol layer stack of the vehicular UE, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and
means for implementing autonomous driving functionality based on the application layer data element.
| 28. The vehicular UE of claim 27, further comprising means for performing any of methods 2-22.
| 29. A method of wireless communication, comprising:
generating, by an application layer of a protocol layer stack of a device, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and
transmitting the vehicular communication message to a vehicular user equipment (UE) configured to implement autonomous driving functionality.
| 30. A non-transitory computer-readable medium storing instructions that when executed by a processor, cause the processor to:
generate, by an application layer of a protocol layer stack of a device, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and
transmit the vehicular communication message to a vehicular user equipment (UE) configured to implement autonomous driving functionality.
| 31. A device, comprising:
a memory storing instructions; and
a processor in communication with the memory, wherein the processor is configured to execute the instructions to:
generate, by an application layer of a protocol layer stack of a device, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and
transmit the vehicular communication message to a vehicular user equipment (UE) configured to implement autonomous driving functionality.
| 32. A device, comprising:
means for generating, by an application layer of a protocol layer stack of a device, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and
means for transmitting the vehicular communication message to a vehicular user equipment (UE) configured to implement autonomous driving functionality. | The method (900) involves receiving (902) a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road by an application layer of a protocol layer stack of the vehicular UE. The autonomous driving functionality is implemented (904) based on the application layer data element. The vehicular communication message is received from another vehicular UE, a network, an infrastructure, a road side unit (RSU), or a relay. The vehicular communication message is provided with several application layer data elements that indicate a lane width of the lane. The vehicular communication message is provided with application layer data elements that indicate a longitude value, a latitude value, and an elevation value for each point in a list of spaced points positioning a center line of the lane. INDEPENDENT CLAIMS are included for the following: a non-transitory computer-readable medium storing a program for wireless communication; a vehicular user equipment for wireless communication; and a device for wireless communication. Method for wireless communication at a vehicular UE, referred to as internet of things (IoT) devices such as a parking meter, gas pump, toaster, vehicle, and heart monitor. The capacity of the access network is improved. The accuracy of estimations, which depends on the density of points, is improved. The system allows for improved driving assistance such as speed and acceleration control. The drawing shows a flowchart illustrating the method for wireless communication at the vehicular UE. 900Method for wireless communication at vehicular UE902Step for receiving vehicular communication message including application layer data element904Step for implementing autonomous driving functionality based on application layer data
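The lane-description record above uses curvature and slope data elements to derive speed limitations and other ADAS behavior. A small worked example of the speed-from-curvature relation is sketched below in Python; the comfort lateral-acceleration limit is an assumption, and the formula is the standard kinematic relation rather than anything specific to the patent.

```python
# A small worked example of using a lane-description data element (curvature,
# in 1/m) to derive a speed limitation, which is one of the ADAS behaviors
# the claims describe. The comfort lateral-acceleration limit is assumed.
import math

def max_speed_for_curvature(curvature_per_m: float,
                            max_lateral_accel_mps2: float = 2.0) -> float:
    """From a_lat = v^2 * curvature, the allowed speed is sqrt(a_lat / curvature)."""
    if curvature_per_m <= 0.0:
        return float("inf")  # straight segment: no curvature-based limit
    return math.sqrt(max_lateral_accel_mps2 / curvature_per_m)

if __name__ == "__main__":
    # Curvature 0.005 1/m corresponds to a 200 m radius curve.
    v = max_speed_for_curvature(0.005)
    print(f"allowed speed ~{v:.1f} m/s (~{v * 3.6:.0f} km/h)")  # ~20 m/s, ~72 km/h
```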
Please summarize the input | ENFORCING RANGE RELIABILITY FOR INFORMATION SHARED VIA WIRELESS TRANSMISSIONSAn ego vehicle determines an intended maneuver and identifies a first set of agents for coordinating the intended maneuver. The ego vehicle also determines a spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold. The ego vehicle further applies the determined spatial distance to a sensor-sharing message. The ego vehicle also transmits the sensor-sharing message to a second set of agents within the determined range. The ego vehicle performs the intended maneuver.What is claimed is:
| 1. A method performed by an ego vehicle, comprising:
determining an intended maneuver of the ego vehicle;
identifying a first set of agents for coordinating the intended maneuver;
determining a spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold;
applying the determined spatial distance to a sensor-sharing message;
transmitting the sensor-sharing message to a second set of agents within the determined spatial distance; and
performing the intended maneuver.
| 2. The method of claim 1, further comprising transmitting the sensor-sharing message via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I) transmission, or a combination thereof.
| 3. The method of claim 1, in which agents in the first set of agents and the second set of agents comprise at least one of a vehicle, an infrastructure component, a road side unit, a non-vehicular road user, or a combination thereof.
| 4. The method of claim 3, further comprising receiving communications from an embedded vehicle-to-everything (V2X) device of the non-vehicular road user or a hand-held V2X device of the non-vehicular road user.
| 5. The method of claim 1, further comprising determining the range based on at least one of the intended maneuver, a speed of the ego vehicle, a number of agents detected within a distance of the ego vehicle, a speed of at least one other agent, a direction of travel of at least one other agent, a road condition, a visibility level, a type of road, a quality of service (QoS), an automation level of the ego vehicle, a direction of travel of the ego vehicle, or a combination thereof.
| 6. The method of claim 5, in which the distance is based on at least the intended maneuver, the road condition, the type of road, the speed of the ego vehicle or a combination thereof.
| 7. The method of claim 5, in which:
the ego vehicle is capable of performing a plurality of maneuvers, and
each maneuver corresponds to a different range.
| 8. The method of claim 1, further comprising coordinating the intended maneuver with each agent of the first set of agents within the determined spatial distance.
| 9. The method of claim 8, further comprising coordinating the intended maneuver via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I) transmission, or a combination thereof.
| 10. The method of claim 1, in which the sensor sharing message identifies objects detected within a distance of the ego vehicle via at least one sensor integrated with the ego vehicle.
| 11. The method of claim 10, in which the objects comprise at least one of non-V2X capable vehicles, non-vehicular road users, infrastructure, road obstacles, road impairments, or a combination thereof.
| 12. The method of claim 1, further comprising:
determining the range at an application-layer; and
enforcing the range at a physical-layer.
| 13. The method of claim 1, in which the ego vehicle comprises an autonomous vehicle or a semi-autonomous vehicle.
| 14. An apparatus of an ego vehicle, comprising:
means for determining an intended maneuver of the ego vehicle;
means for identifying a first set of agents for coordinating the intended maneuver;
means for determining a spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold;
means for applying the determined spatial distance to a sensor-sharing message;
means for transmitting the sensor-sharing message to a second set of agents within the determined spatial distance; and
means for performing the intended maneuver.
| 15. The apparatus of claim 14, further comprising means for transmitting the sensor-sharing message via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I) transmission, or a combination thereof.
| 16. The apparatus of claim 14, in which agents in the first set of agents and the second set of agents comprise at least one of a vehicle, an infrastructure component, a road side unit, a non-vehicular road user, or a combination thereof.
| 17. The apparatus of claim 16, further comprising means for receiving communications from an embedded vehicle-to-everything (V2X) device of the non-vehicular road user or a hand-held V2X device of the non-vehicular road user.
| 18. The apparatus of claim 14, further comprising means for determining the range based on at least one of the intended maneuver, a speed of the ego vehicle, a number of agents detected within a distance of the ego vehicle, a speed of at least one other agent, a direction of travel of at least one other agent, a road condition, a visibility level, a type of road, a quality of service (QoS), an automation level of the ego vehicle, a direction of travel of the ego vehicle, or a combination thereof.
| 19. The apparatus of claim 18, in which the distance is based on at least the intended maneuver, the road condition, the type of road, the speed of the ego vehicle or a combination thereof.
| 20. The apparatus of claim 18, in which:
the ego vehicle is capable of performing a plurality of maneuvers, and
each maneuver corresponds to a different range.
| 21. The apparatus of claim 14, further comprising means for coordinating the intended maneuver with each agent of the first set of agents within the determined spatial distance.
| 22. The apparatus of claim 21, further comprising means for coordinating the intended maneuver via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I) transmission, or a combination thereof.
| 23. The apparatus of claim 14, in which the sensor sharing message identifies objects detected within a distance of the ego vehicle via at least one sensor integrated with the ego vehicle.
| 24. The apparatus of claim 23, in which the objects comprise at least one of non-V2X capable vehicles, non-vehicular road users, infrastructure, road obstacles, road impairments, or a combination thereof.
| 25. The apparatus of claim 14, further comprising:
means for determining the range at an application-layer; and
means for enforcing the range at a physical-layer.
| 26. The apparatus of claim 14, in which the ego vehicle comprises an autonomous vehicle or a semi-autonomous vehicle.
| 27. An ego vehicle, comprising:
a processor;
a memory coupled with the processor; and
instructions stored in the memory and operable, when executed by the processor, to cause the ego vehicle:
to determine an intended maneuver;
to identify a first set of agents for coordinating the intended maneuver;
to determine a spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold;
to apply the determined spatial distance to a sensor-sharing message;
to transmit the sensor-sharing message to a second set of agents within the determined spatial distance; and
to perform the intended maneuver.
| 28. The ego vehicle of claim 27, in which the instructions further cause the ego vehicle to transmit the sensor-sharing message via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I) transmission, or a combination thereof.
| 29. The ego vehicle of claim 27, in which agents in the first set of agents and the second set of agents comprise at least one of a vehicle, an infrastructure component, a road side unit, a non-vehicular road user, or a combination thereof.
| 30. The ego vehicle of claim 29, in which the instructions further cause the ego vehicle to receive communications from an embedded vehicle-to-everything (V2X) device of the non-vehicular road user or a hand-held V2X device of the non-vehicular road user.
| 31. The ego vehicle of claim 27, in which the instructions further cause the ego vehicle to determine the range based on at least one of the intended maneuver, a speed of the ego vehicle, a number of agents detected within a distance of the ego vehicle, a speed of at least one other agent, a direction of travel of at least one other agent, a road condition, a visibility level, a type of road, a quality of service (QoS), an automation level of the ego vehicle, a direction of travel of the ego vehicle, or a combination thereof.
| 32. The ego vehicle of claim 31, in which the distance is based on at least the intended maneuver, the road condition, the type of road, the speed of the ego vehicle or a combination thereof.
| 33. The ego vehicle of claim 31, in which:
the ego vehicle is capable of performing a plurality of maneuvers, and
each maneuver corresponds to a different range.
| 34. The ego vehicle of claim 27, in which the instructions further cause the ego vehicle to coordinate the intended maneuver with each agent of the first set of agents within the determined spatial distance.
| 35. The ego vehicle of claim 34, in which the instructions further cause the ego vehicle to coordinate the intended maneuver via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I) transmission, or a combination thereof.
| 36. The ego vehicle of claim 27, in which the sensor sharing message identifies objects detected within a distance of the ego vehicle via at least one sensor integrated with the ego vehicle.
| 37. The ego vehicle of claim 36, in which the objects comprise at least one of non-V2X capable vehicles, non-vehicular road users, infrastructure, road obstacles, road impairments, or a combination thereof.
| 38. The ego vehicle of claim 27, in which the instructions further cause the ego vehicle:
to determine the range at an application-layer; and
to enforce the range at a physical-layer.
| 39. The ego vehicle of claim 27, in which the ego vehicle comprises an autonomous vehicle or a semi-autonomous vehicle.
| 40. A non-transitory computer-readable medium having program code recorded thereon, the program code executed by a processor and comprising:
program code to determine an intended maneuver of an ego vehicle;
program code to identify a first set of agents for coordinating the intended maneuver;
program code to determine a spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold;
program code to transmit the sensor-sharing message to a second set of agents within the determined spatial distance; and
program code to perform the intended maneuver. | The method involves determining (12) an intended maneuver of an ego vehicle, and identifying a set of agents for coordinating the intended maneuver. A spatial distance is determined (13) for obtaining a level of communication reliability with the agents that is greater than a communication reliability threshold. The determined spatial distance is applied to a sensor-sharing message, and the message is transmitted to another set of the agents within the determined distance. The intended maneuver is performed (17), and a range is determined at an application-layer. The range is enforced at a physical-layer, where the vehicle is an autonomous vehicle or semi-autonomous vehicle. INDEPENDENT CLAIMS are included for the following: an apparatus of an ego vehicle; an ego vehicle; and a non-transitory computer-readable medium storing a program for applying a spatial distance to sensor-sharing messages. Method for applying a spatial distance to sensor-sharing messages, used for enforcing range reliability for information sent through wireless transmissions by an ego vehicle. Uses include but are not limited to telephony, video, data, messaging, and broadcasts. The method enables determining the spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold in an efficient manner. The method allows the ego vehicle to transmit the sensor-sharing message to the agents within the determined spatial distance, so that the vehicle can perform the intended maneuver, thus increasing safety and preventing collisions of the vehicles. The drawing shows a flow diagram of a method for applying a spatial distance to sensor-sharing messages. 12 Step for determining an intended maneuver of an ego vehicle; 13 Step for determining spatial distance; 14 Step for identifying objects; 15 Step for sharing sensor information; 16 Step for coordinating the intended maneuver
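The claims and summary above describe choosing a maneuver-dependent spatial range and sending a sensor-sharing message only to agents within it. The following is a minimal, illustrative Python sketch of that logic; the maneuver-to-range table, the speed adjustment, and all function and field names are assumptions made for illustration and are not taken from the patent.

from dataclasses import dataclass

# Assumed maneuver-to-range table (meters); values are illustrative only.
MANEUVER_RANGE_M = {"lane_change": 150.0, "left_turn": 100.0, "merge": 200.0}

@dataclass
class Agent:
    agent_id: str
    distance_m: float   # distance from the ego vehicle

def determine_spatial_distance(maneuver: str, ego_speed_mps: float) -> float:
    """Application-layer range choice: base range per maneuver, widened with speed."""
    base = MANEUVER_RANGE_M.get(maneuver, 100.0)
    return base + 2.0 * ego_speed_mps  # hypothetical speed adjustment

def transmit_sensor_sharing(maneuver: str, ego_speed_mps: float, agents: list) -> list:
    """Send the sensor-sharing message only to agents inside the determined range
    (actual range enforcement would happen at the physical layer)."""
    rng = determine_spatial_distance(maneuver, ego_speed_mps)
    recipients = [a.agent_id for a in agents if a.distance_m <= rng]
    # ... hand the message and recipient list to the V2X (PC5/Uu) stack here ...
    return recipients

# Example: a lane change at 20 m/s reaches agents within 190 m.
print(transmit_sensor_sharing("lane_change", 20.0, [Agent("car-1", 50.0), Agent("car-2", 400.0)]))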
Please summarize the input | ARCHITECTURE AND PROTOCOL LAYERING FOR SIDELINK POSITIONINGIn some implementations, a user equipment (UE) may implement a ranging support protocol layer comprising one or more ranging support elements. The UE may communicate, using the one or more ranging support elements of the ranging support protocol layer, with a corresponding ranging support protocol layer in one or more other UEs, wherein the communicating is conducted via at least one lower protocol layer implemented at the UE. The UE may provide, at the ranging support protocol layer, a positioning service to an upper protocol layer implemented at the UE, the positioning service based at least in part on the communicating.|1. A method for supporting side link (SL) positioning, the method is performed by a user equipment (UE) and includes the following steps:
implementing, at the UE, a ranging support protocol layer including one or more ranging support elements;
communicating, using the one or more ranging support elements of the ranging support protocol layer, with a corresponding ranging support protocol layer in one or more other UEs, wherein the communicating is conducted via at least one lower protocol layer implemented at the UE; and
providing, at the ranging support protocol layer, a positioning service to an upper protocol layer implemented at the UE, the positioning service being based at least in part on the communicating.
| 2. The method of claim 1, wherein the one or more ranging support elements include a discovery function, and the positioning service includes:
Information including a unique identifier of another UE among the one or more other UEs that is capable of participating in side link positioning and ranging services,
information including an indication of a service supported by another of the one or more other UEs,
a side link communication channel with another one of the one or more other UEs,
side link communication period with another one of the one or more other UEs, or
any combination thereof.
| 3. The method of claim 2, further comprising: receiving, using the discovery function, information from the upper protocol layer, wherein the information includes:
a trigger for discovering UEs participating in sidelink positioning and ranging services,
attributes of a UE to be discovered,
a permission for discovery by another of the one or more other UEs and corresponding attributes of the other UE,
a request or permission for side-link positioning and ranging services, or
any combination thereof.
| 4. The method of claim 1, wherein:
the one or more ranging support elements include a group support function; and
the providing of the positioning service to the upper protocol layer includes using the group support function to:
when the upper protocol layer specifies a side link positioning and ranging service group, establishing the side link positioning and ranging service group with more than two of the one or more other UEs,
Provide the group ID and group local member ID to the upper protocol layer,
Manage the addition or removal of group members,
Split or merge groups,
Monitor group membership status, or
any combination thereof.
| 5. The method of claim 4, further comprising: receiving, using the group support function, information from the upper protocol layer, wherein the information includes:
a request to establish the sidelink positioning and ranging service group,
a request to add or remove a specific group member UE,
A management request for the sidelink positioning and ranging service group that includes merging or splitting groups, or
any combination thereof.
| 6. The method of claim 1, wherein the one or more ranging support elements include side link positioning and ranging protocol functions, and the step of providing a positioning service to the upper protocol layer includes providing:
On-demand sidelink positioning and ranging for determining a range, direction, relative position or relative velocity of another UE or each of a group of other UEs;
Periodic sidelink positioning and ranging for periodically determining a range, direction, relative position or relative speed of another UE or each UE in a group of other UEs;
Triggered sidelink positioning and ranging for triggering determination of a range, direction, relative position or relative speed of another UE or each of a group of other UEs; or
any combination thereof.
| 7. The method of claim 6, further comprising: receiving, using the sidelink positioning and ranging protocol function, information from the upper protocol layer, wherein the information includes:
a request for a current range, direction, relative position or relative speed of another UE or group of UEs,
a request for a periodic range, direction, relative position or relative speed of another UE or group of UEs,
a request for a triggered range, direction, relative position or relative velocity for another UE or group of UEs, or
any combination thereof.
| 8. The method of claim 6 further includes the step of using the side-link positioning and ranging protocol function to communicate with a network server that supports side-link positioning and ranging.
| 9. The method of claim 8, wherein the sidelink positioning and ranging protocol function communicates with the network server supporting sidelink positioning and ranging using non-access stratum (NAS) signaling.
| 10. The method of claim 1, wherein the upper protocol layer is an application layer, and the at least one lower protocol layer includes a ProSe layer, a V2X layer or an access stratum (AS) layer.
| 11. The method of claim 10, wherein the application layer supports vehicle-to-everything (V2X), autonomous driving, movement of objects in a factory or warehouse, UE-to-UE ranging, or a combination thereof.
| 12. The method of claim 10, wherein the step of communicating using the one or more ranging support elements of the ranging support protocol layer includes using a PC5 communication service provided by the ProSe layer, the V2X layer or the AS layer.
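To make the layering in claims 1-12 concrete, here is a rough Python sketch of how a ranging support layer could sit between an application layer and a PC5-style lower layer; the class names, methods and message shapes are invented for illustration and are not defined by the claims.

class LowerLayer:
    """Stand-in for a ProSe/V2X/AS layer exposing a PC5-style send primitive."""
    def send_pc5(self, peer_id: str, payload: dict) -> None:
        print(f"PC5 -> {peer_id}: {payload}")

class RangingSupportLayer:
    """Hypothetical ranging support protocol layer with discovery, group support
    and SL positioning/ranging elements, serving an upper (application) layer."""
    def __init__(self, lower: LowerLayer):
        self.lower = lower
        self.discovered = {}   # peer_id -> supported services
        self.groups = {}       # group_id -> list of member ids

    # Discovery function: learn which peers can take part in SL positioning/ranging.
    def on_discovery(self, peer_id: str, services: list) -> None:
        self.discovered[peer_id] = services

    # Group support function: establish a positioning/ranging service group.
    def create_group(self, group_id: str, members: list) -> None:
        self.groups[group_id] = members

    # SL positioning and ranging protocol function: on-demand ranging request.
    def request_range(self, peer_id: str) -> None:
        self.lower.send_pc5(peer_id, {"type": "ranging_request"})

# Example use by an application layer.
layer = RangingSupportLayer(LowerLayer())
layer.on_discovery("ue-42", ["sl_ranging"])
layer.create_group("g1", ["ue-42"])
layer.request_range("ue-42")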
| 13. A user equipment (UE) including:
a transceiver;
a memory; and
One or more processors communicatively coupled to the transceiver and the memory, wherein the one or more processors are configured to:
implement a ranging support protocol layer including one or more ranging support elements;
communicate, via the transceiver and using the one or more ranging support elements of the ranging support protocol layer, with a corresponding ranging support protocol layer in one or more other UEs, wherein the communication is conducted via at least one lower protocol layer implemented at the UE; and
provide, at the ranging support protocol layer, a positioning service to an upper protocol layer implemented at the UE, the positioning service being based at least in part on the communication.
| 14. The UE of claim 13, wherein:
In order to communicate using the one or more ranging support elements, the one or more processors are configured to implement a discovery function; and
To provide the positioning service, the one or more processors are configured to provide:
Information including a unique identifier of another UE among the one or more other UEs that is capable of participating in side link positioning and ranging services,
information including an indication of a service supported by another of the one or more other UEs,
a side link communication channel with another one of the one or more other UEs,
side link communication period with another one of the one or more other UEs, or
any combination thereof.
| 15. The UE of claim 14, wherein the one or more processors are further configured to utilize the discovery function to receive information from the upper protocol layer, wherein the information includes:
a trigger for discovering UEs participating in sidelink positioning and ranging services,
attributes of a UE to be discovered,
a permission for discovery by another of the one or more other UEs and corresponding attributes of the other UE,
a request or permission for side-link positioning and ranging services, or
any combination thereof.
| 16. The UE of claim 13, wherein:
To communicate using the one or more ranging support elements, the one or more processors are configured to implement a group support function; and
To provide the positioning service to the upper protocol layer, the one or more processors are configured to use the group support function to:
when the upper protocol layer specifies a side link positioning and ranging service group, establishing the side link positioning and ranging service group with more than two of the one or more other UEs,
Provide the group ID and group local member ID to the upper protocol layer,
Manage the addition or removal of group members,
Split or merge groups,
Monitor group membership status, or
any combination thereof.
| 17. The UE of claim 16, wherein the one or more processors are further configured to utilize the group support function to receive information from the upper protocol layer, wherein the information includes:
a request to establish the sidelink positioning and ranging service group,
a request to add or remove a specific group member UE,
A management request for the sidelink positioning and ranging service group that includes merging or splitting groups, or
any combination thereof.
| 18. The UE of claim 13, wherein:
To communicate using the one or more ranging support elements, the one or more processors are configured to implement side link positioning and ranging protocol functions; and
In order to provide the positioning service to the upper protocol layer, the one or more processors are configured to provide:
On-demand sidelink positioning and ranging for determining a range, direction, relative position or relative velocity of another UE or each of a group of other UEs;
Periodic sidelink positioning and ranging for periodically determining a range, direction, relative position or relative speed of another UE or each UE in a group of other UEs;
Triggered sidelink positioning and ranging for triggering determination of a range, direction, relative position or relative speed of another UE or each of a group of other UEs; or
any combination thereof.
| 19. The UE of claim 18, wherein the one or more processors are further configured to utilize the sidelink positioning and ranging protocol function to receive information from the upper protocol layer, wherein the information includes:
a request for a current range, direction, relative position or relative speed of another UE or group of UEs,
a request for a periodic range, direction, relative position or relative speed of another UE or group of UEs,
a request for a triggered range, direction, relative position or relative velocity for another UE or group of UEs, or
any combination thereof.
| 20. The UE of claim 18, wherein the one or more processors are further configured to communicate, via the transceiver and using the sidelink positioning and ranging protocol function, with a network server supporting sidelink positioning and ranging.
| 21. The UE of claim 20, wherein the one or more processors are configured to use the sidelink positioning and ranging protocol function to communicate with the network server supporting sidelink positioning and ranging using non-access stratum (NAS) signaling.
| 22. The UE of claim 13, wherein:
To provide the positioning service to the upper protocol layer, the one or more processors are configured to provide the positioning service to an application layer; and
To communicate via the at least one lower protocol layer, the one or more processors are configured to communicate via a ProSe layer, a V2X layer or an access stratum (AS) layer.
| 23. The UE of claim 22, wherein, in order to communicate using the one or more ranging support elements of the ranging support protocol layer, the one or more processors are configured to use a PC5 communication service provided by the ProSe layer, the V2X layer or the AS layer.
| 24. A device for supporting side link (SL) positioning, the device comprising:
means for implementing a ranging support protocol layer including one or more ranging support elements;
means for communicating, using the one or more ranging support elements of the ranging support protocol layer, with a corresponding ranging support protocol layer in one or more other UEs, wherein the communication is conducted via at least one lower protocol layer implemented at the UE; and
means for providing, at the ranging support protocol layer, a positioning service to an upper protocol layer implemented at the UE, the positioning service being based at least in part on the communication.
| 25. A device as claimed in claim 24, wherein:
The means for communicating using the one or more ranging support elements include means for implementing a discovery function; and
The means for providing the positioning service include means for providing the following information:
Information including a unique identifier of another UE among the one or more other UEs that is capable of participating in side link positioning and ranging services,
information including an indication of a service supported by another of the one or more other UEs,
a side link communication channel with another one of the one or more other UEs,
side link communication period with another one of the one or more other UEs, or
any combination thereof.
| 26. The device of claim 25, further comprising means for utilizing the discovery function to receive information from the upper protocol layer, wherein the information includes:
a trigger for discovering UEs participating in sidelink positioning and ranging services,
attributes of a UE to be discovered,
a permission for discovery by another of the one or more other UEs and corresponding attributes of the other UE,
a request or permission for side-link positioning and ranging services, or
any combination thereof.
| 27. A device as claimed in claim 24, wherein:
the means for communicating using the one or more ranging support elements include means for implementing a group support function; and
The means for providing the positioning service include means for using the group support function to:
when the upper protocol layer specifies a side link positioning and ranging service group, establishing the side link positioning and ranging service group with more than two of the one or more other UEs,
Provide the group ID and group local member ID to the upper protocol layer,
Manage the addition or removal of group members,
Split or merge groups,
Monitor group membership status, or
any combination thereof.
| 28. The device of claim 27, further comprising means for utilizing the group support function to receive information from the upper protocol layer, wherein the information includes:
a request to establish the sidelink positioning and ranging service group,
a request to add or remove a specific group member UE,
A management request for the sidelink positioning and ranging service group that includes merging or splitting groups, or
any combination thereof.
| 29. A device as claimed in claim 24, wherein:
the means for communicating using the one or more ranging support elements include means for implementing side link positioning and ranging protocol functions; and
The means for providing the positioning service include means for providing the following information:
On-demand sidelink positioning and ranging for determining a range, direction, relative position or relative velocity of another UE or each of a group of other UEs;
Periodic sidelink positioning and ranging for periodically determining a range, direction, relative position or relative speed of another UE or each UE in a group of other UEs;
Triggered sidelink positioning and ranging for triggering determination of a range, direction, relative position or relative speed of another UE or each of a group of other UEs; or
any combination thereof.
| 30. A non-transitory computer-readable medium that stores instructions for supporting sidelink (SL) positioning, including code to:
implement a ranging support protocol layer including one or more ranging support elements;
communicate, using the one or more ranging support elements of the ranging support protocol layer, with a corresponding ranging support protocol layer in one or more other UEs, wherein the communication is conducted via at least one lower protocol layer implemented at the UE; and
provide, at the ranging support protocol layer, a positioning service to an upper protocol layer implemented at the UE, the positioning service being based at least in part on the communication. | The method (1400) involves implementing a ranging support protocol layer comprising ranging support elements at a user equipment (UE) e.g. mobile phone. The communication is made with a corresponding ranging support layer in other UEs through a lower protocol layer implemented at the UE. The positioning service is provided (1402) to an upper protocol layer at the UE, where the positioning service comprises information comprising a unique identifier of another UE of the other UEs that participates in a sidelink positioning and ranging service. The information is received from the upper layer with a discovery function. INDEPENDENT CLAIMS are included for: (1) a user equipment comprising a transceiver; (2) an apparatus for supporting sidelink positioning of user equipment; (3) a non-transitory computer-readable medium for storing instructions. Method for supporting sidelink positioning of user equipment, such as cellular phone, personal digital assistant, laptop computer, cordless phone, wireless local loop station, personal computer, tablet, set-top box, web appliance, network router, switch or bridge. The method enables allowing the UEs to communicate using sidelink signaling and to be located using the sidelink related positioning in an effective manner. The method allows a user equipment (UE) to communicate with other UEs using the positioning service based on the positioning measurements obtained by the base station, so that the positioning services can be provided to the UE in an efficient manner. The drawing shows a flow diagram of a sidelink positioning supporting method. 1400 Sidelink positioning supporting method; 1402 Providing services to an upper layer of the architecture; 1404 Communicating by the ranging support elements
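Claims 6, 18 and 29 distinguish on-demand, periodic and triggered sidelink ranging. Below is a small Python sketch of how those three request types might be dispatched; the request dictionary shapes, the placeholder measurement, and the threshold field are assumptions for illustration, not part of the specification.

import time

def range_once(peer_id: str) -> float:
    """Placeholder measurement; a real implementation would use SL positioning signals."""
    return 12.3

def handle_ranging_request(request: dict):
    """Dispatch the three request types named in the claims."""
    kind = request["kind"]
    if kind == "on_demand":
        return range_once(request["peer"])
    if kind == "periodic":
        results = []
        for _ in range(request["count"]):
            results.append(range_once(request["peer"]))
            time.sleep(request["period_s"])
        return results
    if kind == "triggered":
        # Report only when the measured range crosses an application-supplied threshold.
        r = range_once(request["peer"])
        return r if r <= request["trigger_below_m"] else None
    raise ValueError(f"unknown request kind: {kind}")

print(handle_ranging_request({"kind": "on_demand", "peer": "ue-7"}))
print(handle_ranging_request({"kind": "triggered", "peer": "ue-7", "trigger_below_m": 20.0}))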
Please summarize the input | METHOD AND APPARATUS FOR VEHICLE MANEUVER PLANNING AND MESSAGINGTechniques are provided which may be implemented using various methods and/or apparatuses in a vehicle to utilize vehicle external sensor data, vehicle internal sensor data, vehicle capabilities and external V2X input to determine, send, receive and utilize V2X information and control data, sent between the vehicle and a road side unit (RSU) to determine intersection access and vehicle behavior when approaching the intersection.|1. A method for an autonomous vehicle to enter an intersection, comprising:
determining a braking distance for the autonomous vehicle based on a vehicle external sensor, a vehicle internal sensor, vehicle capability, or external V2X input, or a combination thereof;
sending a first message from the autonomous vehicle, where the first message includes an identification data element or a vehicle type or a vehicle priority or a combination thereof for the autonomous vehicle and a braking distance data element for the autonomous vehicle;
receiving a second message from a roadside unit (RSU) based at least in part on the braking distance for the autonomous vehicle, the second message including one or more instructions regarding the autonomous vehicle's intersection entry; and
controlling the autonomous vehicle to enter the intersection in response to the one or more instructions received from the RSU.
| 2. The method for intersection entry of claim 1, further comprising sending a third message from the autonomous vehicle to the RSU before the second message, thereby requesting intersection entry.
| 3. The method for intersection entry of claim 1, wherein the braking distance for the autonomous vehicle is determined based at least in part on the speed of the autonomous vehicle.
| 4. The method for intersection entry of claim 3, wherein the braking distance for the autonomous vehicle is determined based at least in part on the tire pressure or weather conditions or tire traction data for the autonomous vehicle or a combination thereof.
| 5. The method for intersection entry of claim 1, wherein the braking distance for the autonomous vehicle is shorter in the automatic control mode than in the manual mode.
| 6. The method for intersection entry of claim 1, wherein the first message is a broadcast message.
| 7. The method for intersection entry of claim 1, wherein the first message is a point-to-point message.
| 8. The method for intersection entry of claim 1, wherein the first message is a basic safety message or a cooperative sensing message.
| 9. An autonomous vehicle, comprising:
one or more wireless transceivers;
a vehicle interior sensor;
a vehicle exterior sensor;
a memory; and
one or more processors communicatively coupled to the one or more wireless transceivers, the vehicle interior sensor, the vehicle exterior sensor, and the memory;
the one or more processors configured to:
determine a braking distance for the autonomous vehicle based on the vehicle exterior sensor, the vehicle interior sensor, vehicle capability or external V2X input, or a combination thereof;
send a first message from the one or more wireless transceivers, wherein the first message includes an identification data element or a vehicle type or a vehicle priority or a combination thereof for the autonomous vehicle and a braking distance data element for the autonomous vehicle;
receive, at the one or more wireless transceivers, a second message from a roadside unit (RSU) based at least in part on the braking distance for the autonomous vehicle, the second message including one or more instructions regarding intersection entry by the autonomous vehicle; and
control the autonomous vehicle to enter the intersection in response to the one or more instructions received from the RSU.
| 10. The autonomous vehicle of claim 9, wherein the one or more processors are further configured to send a third message from the one or more wireless transceivers to the RSU before the second message, thereby requesting intersection entry.
| 11. The autonomous vehicle of claim 9, wherein the braking distance for the autonomous vehicle is determined based at least in part on the speed of the autonomous vehicle and empirical stopping distance data associated with the speed of the autonomous vehicle.
| 12. The autonomous vehicle of claim 11, wherein the braking distance for the autonomous vehicle is determined based at least in part on the tire pressure or weather conditions or tire traction data for the autonomous vehicle or a combination thereof.
| 13. The autonomous vehicle of claim 9, wherein the braking distance for the autonomous vehicle is shorter in the automatic control mode than in the manual mode.
| 14. The autonomous vehicle of claim 9, wherein the first message is a broadcast message.
| 15. The autonomous vehicle of claim 9, wherein the first message is a point-to-point message.
| 16. The autonomous vehicle of claim 9, wherein the first message is a basic safety message or a cooperative sensing message.
| 17. An autonomous vehicle, comprising:
means for determining a braking distance for the autonomous vehicle based on a vehicle exterior sensor, a vehicle interior sensor, vehicle capability, or external V2X input, or a combination thereof;
means for sending a first message from the autonomous vehicle, wherein the first message includes an identification data element or a vehicle type or a vehicle priority or a combination thereof for the autonomous vehicle and a braking distance data element for the autonomous vehicle;
means for receiving a second message from a roadside unit (RSU) based at least in part on the braking distance for the autonomous vehicle, the second message including one or more instructions regarding the autonomous vehicle entering an intersection; and
means for controlling the entry of the autonomous vehicle at the intersection in response to the one or more instructions received from the RSU.
| 18. The autonomous vehicle of claim 17, further comprising means for sending a third message from the autonomous vehicle to the RSU before the second message to request entry at the intersection.
| 19. The autonomous vehicle of claim 17, wherein the first message is a broadcast message.
| 20. The autonomous vehicle of claim 17, wherein the first message is a point-to-point message.
| 21. The autonomous vehicle of claim 17, wherein the first message is a basic safety message or a cooperative sensing message.
| 22. A non-transitory computer-readable medium on which are stored computer-readable instructions that cause one or more processors on an autonomous vehicle to perform the following operations:
determine a braking distance for the autonomous vehicle based on a vehicle external sensor, a vehicle internal sensor, vehicle capability, or external V2X input, or a combination thereof;
send a first message from the autonomous vehicle, where the first message includes an identification data element or a vehicle type or a vehicle priority or a combination thereof for the autonomous vehicle and a braking distance data element for the autonomous vehicle;
receive a second message from a roadside unit (RSU) based at least in part on the braking distance for the autonomous vehicle, the second message including one or more instructions regarding the autonomous vehicle's intersection entry; and
control the autonomous vehicle to enter the intersection in response to the one or more instructions received from the RSU.
| 23. The non-transitory computer-readable medium of claim 22, further comprising instructions for the one or more processors to send a third message to the RSU before the second message, so as to request entry into the intersection.
| 24. The non-transitory computer-readable medium of claim 22, wherein the first message is a broadcast message.
| 25. The non-transitory computer-readable medium of claim 22, wherein the first message is a point-to-point message.
| 26. The non-transitory computer-readable medium of claim 22, wherein the first message is a basic safety message or a cooperative awareness message. | The method involves determining a braking distance for the ego vehicle based upon vehicle external sensors, vehicle internal sensors, vehicle capabilities, or external V2X input, or a combination. A first message is sent from the ego vehicle. The first message includes an identification data element for the ego vehicle or a vehicle type or a vehicle priority or a combination thereof and a braking distance data element for the ego vehicle. A second message, which includes instructions with respect to intersection access by the ego vehicle, is received from a roadside unit (RSU) based upon the braking distance for the ego vehicle. The intersection access is controlled by the ego vehicle in response to the instructions received from the RSU. INDEPENDENT CLAIMS are included for the following: an ego vehicle; and a non-transitory computer-readable medium storing a program for an ego vehicle. Method for intersection access by ego vehicle. Increased tire inflation decreases the tire surface in contact with the road, reducing traction, and thus increases vehicle turning radius at current speed and reduces maneuverability at current speed. The drawing shows a block diagram of a system level embodiment for an ego vehicle. 910 Processor; 930 Wireless transceiver; 935 Camera; 940 Car sensor; 950 Lidar
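As an illustration of the braking-distance element in these claims, here is a toy Python calculation and message exchange; the stopping-distance formula, friction and reaction-time values, and the message field names are illustrative assumptions rather than anything specified by the patent.

def braking_distance_m(speed_mps: float, friction: float = 0.7,
                       autonomous_mode: bool = True, reaction_s: float = 1.0) -> float:
    """Simple kinematic estimate: reaction distance plus v^2 / (2*mu*g).
    An autonomous controller is assumed to react faster than a human driver."""
    g = 9.81
    reaction = (0.2 if autonomous_mode else reaction_s) * speed_mps
    return reaction + speed_mps ** 2 / (2.0 * friction * g)

def build_first_message(vehicle_id: str, vehicle_type: str, priority: int,
                        speed_mps: float, autonomous_mode: bool) -> dict:
    """BSM-like broadcast carrying identification, type, priority and braking distance."""
    return {
        "id": vehicle_id,
        "type": vehicle_type,
        "priority": priority,
        "braking_distance_m": round(braking_distance_m(speed_mps, autonomous_mode=autonomous_mode), 1),
    }

def on_rsu_instruction(msg: dict) -> str:
    """Act on the RSU's second message: proceed or hold at the intersection."""
    return "enter intersection" if msg.get("instruction") == "proceed" else "hold"

print(build_first_message("veh-17", "passenger", priority=1, speed_mps=15.0, autonomous_mode=True))
print(on_rsu_instruction({"instruction": "proceed"}))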
Please summarize the input | System and method for relative positioning based safe autonomous drivingDisclosed is a method and apparatus for managing a driving plan of an autonomous vehicle. The method may include obtaining observations of a neighboring vehicle using one or more sensors of the autonomous vehicle. The method may also include classifying one or more behavioral driving characteristics of the neighboring vehicle based on the observations. Furthermore, the method may include updating the driving plan based on a classification of the one or more behavioral driving characteristics of the neighboring vehicle, and controlling one or more operations of the autonomous vehicle based on the updated driving plan.What is claimed is:
| 1. A method for managing a driving plan of an autonomous vehicle, comprising:
obtaining observations of a neighboring vehicle using one or more sensors of the autonomous vehicle, the observations including observed driving behaviors of the neighboring vehicle;
generating a driving risk pattern of the neighboring vehicle based on the observations;
updating the driving plan based on the generated driving risk pattern of the neighboring vehicle; and
controlling one or more operations of the autonomous vehicle based on the updated driving plan,
wherein the generated driving risk pattern indicates one of a plurality of different risk levels,
wherein generating the driving risk pattern of the neighboring vehicle comprises classifying the neighboring vehicle as a vehicle that lacks autonomous driving capability based on the observed driving behaviors of the neighboring vehicle, and
wherein the generated driving risk pattern comprises a first classification indicating the neighboring vehicle as a vehicle lacking autonomous driving capability.
| 2. The method of claim 1, further comprising:
determining a second classification of the neighboring vehicle based on vehicle characteristics of the neighboring vehicle exchanged in a vehicle to vehicle communication; and
cross-checking the second classification against the first classification.
| 3. The method of claim 1, wherein the one or more sensors of the autonomous vehicle used to obtain the observations comprise a RADAR sensor, a LIDAR sensor, a GPS sensor, a proximity sensor, a visual sensor, or a combination thereof.
| 4. The method of claim 1, wherein the generated driving risk pattern is generated using a machine learning model.
| 5. The method of claim 1, wherein obtaining the observations comprises:
collecting, using the one or more sensors of the autonomous vehicle, one or more observable vehicle characteristics of the neighboring vehicle.
| 6. The method of claim 5, wherein the one or more observable vehicle characteristics of the neighboring vehicle collected by the one or more sensors comprise one or more relative accelerations of the neighboring vehicle, a relative speed of the neighboring vehicle, a relative direction of travel of the neighboring vehicle, one or more visual characteristics of the neighboring vehicle, one or more visual characteristics of a driver of the neighboring vehicle, or a combination thereof.
| 7. The method of claim 5, further comprising:
sending, to a server, the one or more observable vehicle characteristics of the neighboring vehicle.
| 8. The method of claim 7, further comprising:
sending, to the server, a plurality of observable vehicle characteristics associated with a plurality of observed vehicles collected by the autonomous vehicle; and
receiving, from the server, a machine learning model trained to identify behavioral characteristics from observable vehicle characteristics using the plurality of observable vehicle characteristics and associated known behavioral characteristics.
| 9. The method of claim 8, further comprising:
periodically sending, to the server, new observable vehicle characteristics associated with new observed vehicles collected by the autonomous vehicle; and
periodically receiving, from the server, an updated machine learning model.
| 10. The method of claim 1, wherein the autonomous vehicle is an autonomous car.
| 11. A system for managing a driving plan of an autonomous vehicle, the system comprising:
one or more sensors configured to obtain observations of a neighboring vehicle, the observations including observed driving behaviors of the neighboring vehicle;
a memory configured to store the observations; and
one or more processors communicably coupled with the memory and the sensors, the one or more processors configured to:
generate a driving risk pattern of the neighboring vehicle based on the observations,
update the driving plan based on the generated driving risk pattern of the neighboring vehicle, and
control one or more operations of the autonomous vehicle based on the updated driving plan,
wherein the generated driving risk pattern indicates one of a plurality of different risk levels,
wherein the one or more processors configured to generate the driving risk pattern of the neighboring vehicle are configured to classify the neighboring vehicle as a vehicle that lacks autonomous driving capability based on the observed driving behaviors of the neighboring vehicle, and
wherein the generated driving risk pattern comprises a first classification indicating the neighboring vehicle as a vehicle lacking autonomous driving capability.
| 12. The system of claim 11, wherein the one or more processors are further configured to:
determine a second classification of the neighboring vehicle based on vehicle characteristics of the neighboring vehicle exchanged in a vehicle to vehicle communication; and
cross-check the second classification against the first classification.
| 13. The system of claim 11, wherein the one or more sensors used to obtain the observations comprise a RADAR sensor, a LIDAR sensor, a GPS sensor, a proximity sensor, a visual sensor, or a combination thereof.
| 14. The system of claim 11, wherein the one or more processors are further configured to use a machine learning model to generate the driving risk pattern of the neighboring vehicle.
| 15. The system of claim 11, wherein the one or more sensors are further configured to:
collect, using the one or more sensors of the autonomous vehicle, one or more observable vehicle characteristics of the neighboring vehicle.
| 16. The system of claim 15, wherein the one or more observable vehicle characteristics of the neighboring vehicle collected by the one or more sensors comprise one or more relative accelerations of the neighboring vehicle, a relative speed of the neighboring vehicle, a relative direction of travel of the neighboring vehicle, one or more visual characteristics of the neighboring vehicle, one or more visual characteristics of a driver of the neighboring vehicle, or a combination thereof.
| 17. The system of claim 15, further comprising:
a wireless subsystem configured to send to a server the one or more observable vehicle characteristics of the neighboring vehicle.
| 18. The system of claim 17, wherein the wireless subsystem is further configured to:
send a plurality of observable vehicle characteristics associated with a plurality of observed vehicles to the server; and
receive, from the server, a machine learning model trained to identify behavioral characteristics from observable vehicle characteristics using the plurality of observable vehicle characteristics and associated known behavioral characteristics.
| 19. The system of claim 18, wherein the wireless subsystem is further configured to:
periodically send, to the server, new observable vehicle characteristics associated with new observed vehicles collected by the autonomous vehicle; and
periodically receive, from the server, an updated machine learning model.
| 20. The system of claim 11, wherein the autonomous vehicle is an autonomous car.
| 21. A non-transitory computer readable storage medium including instructions that, when executed by a processor, cause the processor to perform operations for managing a driving plan of an autonomous vehicle, the operations comprising:
obtaining observations of a neighboring vehicle using one or more sensors of the autonomous vehicle, the observations including observed driving behaviors of the neighboring vehicle;
generating a driving risk pattern of the neighboring vehicle based on the observations;
updating the driving plan based on the generated driving risk pattern of the neighboring vehicle; and
controlling one or more operations of the autonomous vehicle based on the updated driving plan,
wherein the generated driving risk pattern indicates one of a plurality of different risk levels,
wherein generating the driving risk pattern of the neighboring vehicle comprises classifying the neighboring vehicle as a vehicle that lacks autonomous driving capability based on the observed driving behaviors of the neighboring vehicle, and
wherein the generated driving risk pattern comprises a first classification indicating the neighboring vehicle as a vehicle lacking autonomous driving capability.
| 22. The non-transitory computer readable storage medium of claim 21, wherein the operations further comprise:
determining a second classification of the neighboring vehicle based on vehicle characteristics of the neighboring vehicle exchanged in a vehicle to vehicle communication; and
cross-checking the second classification against the first classification.
| 23. The non-transitory computer readable storage medium of claim 21, wherein obtaining the observations comprises:
collecting, using the one or more sensors of the autonomous vehicle, one or more observable vehicle characteristics of the neighboring vehicle; and
sending to a server the one or more observable vehicle characteristics of the neighboring vehicle.
| 24. An apparatus, comprising:
means for obtaining observations of a neighboring vehicle using one or more sensors of an autonomous vehicle, the observations including observed driving behaviors of the neighboring vehicle;
means for generating a driving risk pattern of the neighboring vehicle based on the observations;
means for updating a driving plan based on the generated driving risk pattern of the neighboring vehicle; and
means for controlling one or more operations of the autonomous vehicle based on the updated driving plan,
wherein the generated driving risk pattern indicates one of a plurality of different risk levels,
wherein the means for generating the driving risk pattern of the neighboring vehicle comprises means for classifying the neighboring vehicle as a vehicle that lacks autonomous driving capability based on the observed driving behaviors of the neighboring vehicle, and
wherein the generated driving risk pattern comprises a first classification indicating the neighboring vehicle as a vehicle lacking autonomous driving capability.
| 25. The apparatus of claim 24, further comprising:
means for determining a second classification of the neighboring vehicle based on vehicle characteristics of the neighboring vehicle exchanged in a vehicle to vehicle communication; and
means for cross-checking the second classification against the first classification.
| 26. The apparatus of claim 24, wherein the means for obtaining the observations comprises:
means for collecting, using the one or more sensors of the autonomous vehicle, one or more observable vehicle characteristics of the neighboring vehicle; and
means for sending, to a server, the one or more observable vehicle characteristics of the neighboring vehicle.
| 27. The method of claim 1, further comprising:
obtaining observations of a second neighboring vehicle;
generating a second driving risk pattern of the second neighboring vehicle based on the observations of the second neighboring vehicle; and
updating the driving plan based on a weight average of the driving risk pattern of the neighboring vehicle and the second driving risk pattern of the second neighboring vehicle.
| 28. The method of claim 1, further comprising:
determining a second classification of the neighboring vehicle based on observed visual characteristics of the neighboring vehicle; and
cross-checking the second classification against the first classification.
| 29. The method of claim 1, wherein the observed driving behaviors comprise at least one of a relative speed, a relative acceleration, a relative deceleration, a relative position, or relative direction changes of the neighboring vehicle.
| 30. The method of claim 4, wherein the machine learning model uses objective driving behavior information as truth data to analyze the observations. | The method (300) involves obtaining (302) observations of a neighboring vehicle using one or more sensors of an autonomous vehicle. One or more behavioral driving characteristics of the neighboring vehicle are classified (304) based on the observations. A driving plan is updated (306) based on a classification of the one or more behavioral driving characteristics of the neighboring vehicle. One or more operations of the autonomous vehicle are controlled (308) based on the updated driving plan. INDEPENDENT CLAIMS are included for the following: a system for managing driving plan of autonomous vehicle; and a non-transitory computer readable storage medium storing a program for managing driving plan of autonomous vehicle. Method for managing driving plan of autonomous motor vehicles such as cars, trucks and trains using machine learning model. The drive control system updates the drive plan relative to the irregular behavioral driving characteristics of the vehicle, causing the autonomous vehicle to slow down, increase a distance between the autonomous vehicle and the other vehicle, or activate an emergency system e.g. collision warning and brake support. Enables the autonomous vehicle to operate in a safe and autonomous manner and continuously adjust and react to its environment. The drawing shows the flow diagram of a method for managing a driving plan of an autonomous vehicle. 300 Method for managing driving plan of autonomous vehicle; 302 Step for obtaining observations of a neighboring vehicle; 304 Step for classifying one or more behavioral driving characteristics; 306 Step for updating a driving plan; 308 Step for controlling one or more operations of the autonomous vehicle
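A compact Python sketch of the observe-classify-update loop these claims describe, using a trivial rule in place of the trained machine-learning model; the thresholds, class labels and field names are invented for illustration only.

from statistics import pstdev

def classify_risk(relative_speeds: list, lane_keep_errors: list) -> str:
    """Stand-in for the ML model: erratic speed or lane keeping -> higher risk,
    which the claims associate with a vehicle lacking autonomous capability."""
    if pstdev(relative_speeds) > 3.0 or max(lane_keep_errors) > 0.8:
        return "high_risk_non_autonomous"
    return "low_risk"

def update_driving_plan(plan: dict, risk: str) -> dict:
    """Adjust following distance and speed when a neighbor is classified as risky."""
    if risk == "high_risk_non_autonomous":
        plan = dict(plan,
                    follow_distance_m=plan["follow_distance_m"] * 2,
                    max_speed_mps=plan["max_speed_mps"] - 3)
    return plan

plan = {"follow_distance_m": 30.0, "max_speed_mps": 25.0}
risk = classify_risk([0.5, 4.2, -3.8, 5.0], [0.2, 0.9, 0.4])
print(risk, update_driving_plan(plan, risk))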
Please summarize the input | Shape detecting autonomous vehicleAccording to various embodiments, there is provided a method for controlling a vehicle, the method including detecting a triggering event. The method further includes, in response to detecting the triggering event, determining updated dimensions of the vehicle. The method further includes adjusting operation of the vehicle based on the updated dimensions.What is claimed is:
| 1. A method for controlling a vehicle, the method comprising:
detecting, by a sensor, a triggering event;
determining updated dimensions of the vehicle in response to detecting the triggering event; and
adjusting, by control electronics, at least one operation of the vehicle, wherein the at least one operation of the vehicle comprises an adjustment of at least one of a speed, a turn radius, a navigation path, a clearance allowance, or a parking behavior of the vehicle based at least in part on the updated dimensions.
| 2. The method of claim 1, wherein the triggering event comprises a changed shape event.
| 3. The method of claim 2, wherein the changed shape event comprises detecting a parameter associated with the vehicle and determining whether the parameter exceeds a threshold.
| 4. The method of claim 3, wherein the parameter corresponds to one or more of a weight parameter, a wind parameter, a drag parameter, or an engine torque value.
| 5. The method of claim 2, further comprising:
determining one or more surrounding conditions of the vehicle; and
detecting the changed shape event of the vehicle based at least in part on the one or more surrounding conditions.
| 6. The method of claim 5, wherein the one or more surrounding conditions comprises at least one of a wind force, a road slope, a radius of curvature of a road, or road terrain conditions.
| 7. The method of claim 1, wherein determining the updated dimensions of the vehicle comprises:
sending a scan request to one or more proximate vehicles;
receiving one or more at least partial scans of at least one of the one or more proximate vehicles; and
constructing the updated dimensions of the vehicle based at least in part on at least one of the one or more at least partial scans.
| 8. The method of claim 7, wherein the scan request is sent via vehicle-to-vehicle (V2V) communication.
| 9. The method of claim 7, wherein the one or more at least partial scans comprises at least one Light Detection and Ranging (LIDAR) scan.
| 10. The method of claim 1, wherein the at least one operation of the vehicle serves to control braking, to perform wireless communication, or to perform environment scanning.
| 11. The method of claim 1, further comprising configuring at least one of an engine sensor, a weight sensor, a wind sensor, or a cargo sensor.
| 12. The method of claim 1, wherein determining the updated dimensions of the vehicle comprises:
sending a scan request to one or more scanning devices of the vehicle;
receiving an at least partial scan from at least one of the one or more scanning devices; and
constructing the updated dimensions of the vehicle based on the at least partial scan.
| 13. The method of claim 12, wherein the at least partial scan is received from another vehicle.
| 14. The method of claim 12, wherein the at least partial scan is received from an unmanned aerial vehicle.
| 15. The method of claim 12, wherein the at least partial scan is received from a camera arranged on a fixed object.
| 16. A controller in a vehicle, the controller comprising:
a processor; and
a memory storing instructions that, when executed by the processor, cause the vehicle to:
detect a triggering event;
determine updated dimensions of the vehicle in response to detection of the triggering event; and
adjust at least one operation of the vehicle, wherein the at least one operation of the vehicle comprises an adjustment of at least one of a speed, a turn radius, a navigation path, a clearance allowance, or a parking behavior of the vehicle based at least in part on the updated dimensions.
| 17. The controller of claim 16, wherein the triggering event comprises detecting a changed shape event.
| 18. The controller of claim 17, wherein execution of the instructions causes the vehicle to:
detect a parameter associated with the vehicle; and
determine whether the parameter exceeds a threshold.
| 19. The controller of claim 18, wherein the parameter corresponds to one or more of a weight parameter, a wind parameter, a drag parameter, or an engine torque parameter.
| 20. The controller of claim 17, wherein execution of the instructions causes the vehicle to further:
determine one or more surrounding conditions of the vehicle; and
detect the changed shape event of the vehicle based at least in part on the one or more surrounding conditions.
| 21. The controller of claim 20, wherein the one or more surrounding conditions comprises at least one of a wind force, a road slope, a radius of curvature of a road, or road terrain conditions.
| 22. The controller of claim 16, wherein execution of the instructions for determining the updated dimensions further causes the vehicle to:
send a scan request to one or more proximate vehicles;
receive one or more at least partial scans of at least one of the one or more proximate vehicles; and
construct the updated dimensions of the vehicle based at least in part on at least one of the one or more at least partial scans.
| 23. The controller of claim 22, wherein the one or more at least partial scans comprises at least one Light Detection and Ranging (LIDAR) scan.
| 24. The controller of claim 16, wherein execution of the instructions causes the vehicle to control braking, to perform wireless communication, or to perform environment scanning.
| 25. The controller of claim 16, wherein execution of the instructions causes the vehicle to:
send a scan request to one or more scanning devices of the vehicle;
receive an at least partial scan from at least one of the one or more scanning devices; and
construct the updated dimensions of the vehicle based on the at least partial scan.
| 26. The controller of claim 25, wherein at least a partial scan is received from another vehicle.
| 27. An apparatus for controlling a vehicle, the apparatus comprising:
means for detecting a triggering event;
means for determining updated dimensions of the vehicle in response to detecting the triggering event; and
means for adjusting at least one operation of the vehicle, wherein the at least one operation of the vehicle comprises an adjustment of at least one of a speed, a turn radius, a navigation path, a clearance allowance, or a parking behavior of the vehicle based at least in part on the updated dimensions.
| 28. The apparatus of claim 27, wherein the triggering event comprises a changed shape event.
| 29. The apparatus of claim 28, wherein the changed shape event comprises detecting a parameter associated with the vehicle and determining whether the parameter exceeds a threshold.
| 30. The apparatus of claim 29, wherein the parameter corresponds to one or more of a weight parameter, a wind parameter, a drag parameter, or an engine torque value. | The method involves detecting a triggering event, determining updated dimensions of the vehicle in response to detecting the triggering event, and adjusting operation of the vehicle based on the updated dimensions. The triggering event involves detecting a changed shape event of the vehicle. The changed shape event is detected by detecting a parameter associated with the vehicle and determining whether the parameter exceeds a threshold. The parameter corresponds to the weight, wind drag, or engine torque of the vehicle. INDEPENDENT CLAIMS are also included for the following: a controller in a vehicle; and a vehicle. Controlling method of vehicle. The vehicle accesses the updated shape information to determine the optimal turn radius for safely traversing the curvature in the road, by using the environment scanning information and the temperature sensor information. The drawing shows the flowchart of a method of controlling an autonomous vehicle. 402 Receiving sensor data; 404 Detecting changed shape event; 406 Continuing normal operation; 408 Triggering shape scanning; 410 Adjusting operation based on new shape
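For the trigger-and-rescan flow claimed above, the following toy Python sketch checks parameters against thresholds and merges partial scans into updated dimensions; the parameter names, threshold values and scan format are assumptions for illustration only.

BASE_DIMENSIONS = {"length_m": 5.0, "width_m": 2.0, "height_m": 1.8}
THRESHOLDS = {"weight_kg": 2500.0, "drag_coefficient": 0.45}  # illustrative values

def changed_shape_event(params: dict) -> bool:
    """Triggering event: any monitored parameter exceeding its threshold."""
    return any(params.get(k, 0.0) > v for k, v in THRESHOLDS.items())

def updated_dimensions(partial_scans: list) -> dict:
    """Construct updated dimensions as the envelope of the received partial scans
    (e.g. LIDAR scans requested from nearby vehicles over V2V)."""
    dims = dict(BASE_DIMENSIONS)
    for scan in partial_scans:
        for key, value in scan.items():
            dims[key] = max(dims[key], value)
    return dims

if changed_shape_event({"weight_kg": 3100.0, "drag_coefficient": 0.4}):
    dims = updated_dimensions([{"height_m": 2.6}, {"length_m": 5.4}])
    # Operation adjustments (speed, turn radius, clearance) would use dims here.
    print(dims)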
Please summarize the input | VIRTUAL TRAFFIC LIGHT VIA C-V2XTechniques are provided for traffic intersection control information to vehicles via V2X communication links. An example method for providing traffic intersection control messages includes receiving vehicle information associated with a plurality of proximate vehicles, generating one or more vehicle groups based on the vehicle information, generating a traffic control plan based at least in part on the one or more vehicle groups, and transmitting one or more traffic intersection control messages to one or more of the plurality of proximate vehicles based at least in part on the traffic control plan.CLAIMS:
| 1. A method for providing traffic intersection control messages, comprising: receiving vehicle information associated with a plurality of proximate vehicles; generating one or more vehicle groups based on the vehicle information; generating a traffic control plan based at least in part on the one or more vehicle groups; and transmitting one or more traffic intersection control messages to one or more of the plurality of proximate vehicles based at least in part on the traffic control plan.
| 2. The method of claim 1 wherein the vehicle information includes basic safety messages transmitted by one or more vehicles in the plurality of proximate vehicles.
| 3. The method of claim 1 wherein the one or more vehicle groups are based on a location of a vehicle, a number of vehicles in a proximate area, a traffic density flowing in a direction, a configuration of an intersection, a size associated with a vehicle, a priority value associated with one or more vehicles, or any combination thereof.
| 4. The method of claim 1 wherein receiving the vehicle information includes receiving vehicle group information from a network resource.
| 5. The method of claim 1 wherein the traffic control plan is based at least in part on a time of day, a date, a current density of traffic, a turn lane configuration, or any combination thereof.
| 6. The method of claim 1 wherein transmitting the one or more traffic intersection control messages includes unicasting a traffic control message including proceed information to one or more vehicles in the plurality of proximate vehicles.
| 7. The method of claim 1 wherein transmitting the one or more traffic intersection control messages includes groupcasting a traffic control message including a list of vehicle identification values.
| 8. The method of claim 1 wherein the one or more traffic intersection control messages are transmitted via a PC5 interface, a Uu interface, a device-to-device protocol, or any combinations thereof.
| 9. A method of receiving a traffic intersection control message, comprising: transmitting one or more basic safety messages; receiving one or more traffic intersection control messages including proceed information; and providing an indication to proceed or halt progress through an intersection based at least in part on the one or more traffic intersection control messages.
| 10. The method of claim 9 further comprising transmitting vehicle priority information.
| 11. The method of claim 9 wherein receiving the one or more traffic intersection control messages includes receiving a unicast message including the proceed information.
| 12. The method of claim 9 wherein receiving the one or more traffic intersection control messages includes receiving a groupcast message including a list of vehicle identification values.
| 13. The method of claim 9 wherein providing the indication to proceed or halt progress through the intersection includes providing an instruction to a controller in an autonomous or semi-autonomous vehicle.
| 14. The method of claim 9 wherein providing the indication to proceed or halt progress through the intersection includes activating a driver alert device.
| 15. The method of claim 9 wherein the one or more traffic intersection control messages are received via a PC5 interface, a Uu interface, a device-to-device protocol, or any combinations thereof.
| 16. An apparatus, comprising: a memory; at least one transceiver; at least one processor communicatively coupled to the memory and the at least one transceiver, and configured to: receive vehicle information associated with a plurality of proximate vehicles; generate one or more vehicle groups based on the vehicle information; generate a traffic control plan based at least in part on the one or more vehicle groups; and transmit one or more traffic intersection control messages to one or more of the plurality of proximate vehicles based at least in part on the traffic control plan.
| 17. The apparatus of claim 16 wherein the vehicle information includes basic safety messages transmitted by one or more vehicles in the plurality of proximate vehicles.
| 18. The apparatus of claim 16 wherein the one or more vehicle groups are based on a location of a vehicle, a number of vehicles in a proximate area, a traffic density flowing in a direction, a configuration of an intersection, a size associated with a vehicle, a priority value associated with one or more vehicles, or any combination thereof.
| 19. The apparatus of claim 16 wherein the at least one processor is further configured to receive vehicle group information from a network resource as at least part of the vehicle information associated with the plurality of proximate vehicles.
| 20. The apparatus of claim 16 wherein the traffic control plan is based at least in part on a time of day, a date, a current density of traffic, a turn lane configuration, or any combination thereof.
| 21. The apparatus of claim 16 wherein the at least one processor is further configured to unicast a traffic control message including proceed information to one or more vehicles in the plurality of proximate vehicles as the one or more traffic intersection control messages.
| 22. The apparatus of claim 16 wherein the at least one processor is further configured to groupcast a traffic control message including a list of vehicle identification values as the one or more traffic intersection control messages.
| 23. The apparatus of claim 16 wherein the one or more traffic intersection control messages are transmitted via a PC5 interface, a Uu interface, a device-to-device protocol, or any combinations thereof.
| 24. An apparatus, comprising: a memory; at least one transceiver; at least one processor communicatively coupled to the memory and the at least one transceiver, and configured to: transmit one or more basic safety messages; receive one or more traffic intersection control messages including proceed information; and provide an indication to proceed or halt progress through an intersection based at least in part on the one or more traffic intersection control messages.
| 25. The apparatus of claim 24 wherein the at least one processor is further configured to transmit vehicle priority information.
| 26. The apparatus of claim 24 wherein the at least one processor is further configured to receive a unicast message including the proceed information as the one or more traffic intersection control messages.
| 27. The apparatus of claim 24 wherein the at least one processor is further configured to receive a groupcast message including a list of vehicle identification values as the one or more traffic intersection control messages.
| 28. The apparatus of claim 24 wherein the at least one processor is further configured to provide an instruction to a controller in an autonomous or semi- autonomous vehicle as the indication to proceed or halt progress through the intersection.
| 29. The apparatus of claim 24 wherein the at least one processor is further configured to activate a driver alert device as the indication to proceed or halt progress through the intersection.
| 30. The apparatus of claim 24 wherein the one or more traffic intersection control messages are received via a PC5 interface, a Uu interface, a device-to-device protocol, or any combinations thereof. | The method (1100) involves receiving vehicle information associated with multiple proximate vehicles (1102). The vehicle groups are generated (1104) based on the vehicle information. A traffic control plan is generated (1106) based in part on the vehicle groups. The traffic intersection control messages are transmitted (1108) to the proximate vehicles through a PC5 interface, a UMTS air interface (Uu), or a device-to-device protocol, based in part on the traffic control plan, where the vehicle information includes basic safety messages transmitted by one or more of the multiple proximate vehicles. INDEPENDENT CLAIMS are included for: (1) a method for receiving a traffic intersection control message; (2) an apparatus for providing traffic intersection control messages to vehicles through vehicle-to-everything communication links; (3) an apparatus for receiving a traffic intersection control message. Method for providing traffic intersection control messages to vehicles such as autonomous or semi-autonomous vehicles, i.e. a car, through vehicle-to-everything communication links using a user equipment (UE). Uses include but are not limited to a mobile phone, router, tablet computer, laptop computer, consumer asset tracking device, Internet of Things (IoT) device, or on-board unit (OBU). The traffic congestion and the potential for collisions at intersections can be reduced. The traffic control messages can be unicast or groupcast to the vehicles and the vehicles can proceed or halt at an intersection as a group. The vehicle groups are evaluated in view of a traffic control plan and the groups can be prioritized for proceeding through a traffic intersection. Positioning reference signal (PRS) muting can be used to reduce interference by muting PRS signals. The drawing shows a flow diagram illustrating a method for providing traffic intersection control information to vehicles. 1100Method for providing traffic intersection control messages to vehicles 1102Receiving vehicle information associated with multiple proximate vehicles 1104Generating vehicle groups 1106Generating traffic control plan 1108Transmitting traffic intersection control messages to the proximate vehicles
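As a rough illustration of claims 1, 3 and 7 above, the sketch below groups vehicles by approach direction from BSM-like reports, builds a round-robin traffic control plan, and emits groupcast-style messages carrying vehicle identification values. All field names, the scheduling rule, and the 20-second window are assumptions, not details taken from the patent.

```python
from collections import defaultdict

def group_vehicles(bsm_reports):
    """Group proximate vehicles by approach direction (one grouping factor from claim 3)."""
    groups = defaultdict(list)
    for report in bsm_reports:           # each report stands in for a basic safety message
        groups[report["approach"]].append(report["vehicle_id"])
    return dict(groups)

def traffic_control_plan(groups, green_seconds=20):
    """Round-robin plan: each group receives one proceed window (claim 1, heavily simplified)."""
    schedule, start = [], 0
    for approach, vehicle_ids in groups.items():
        schedule.append({"approach": approach, "vehicle_ids": vehicle_ids,
                         "proceed_from_s": start, "proceed_until_s": start + green_seconds})
        start += green_seconds
    return schedule

def groupcast_messages(schedule):
    """Build groupcast-style messages carrying a list of vehicle identification values (claim 7)."""
    return [{"type": "INTERSECTION_CONTROL", "action": "PROCEED",
             "vehicle_ids": slot["vehicle_ids"],
             "window_s": (slot["proceed_from_s"], slot["proceed_until_s"])}
            for slot in schedule]

if __name__ == "__main__":
    bsms = [{"vehicle_id": "A1", "approach": "north"},
            {"vehicle_id": "B2", "approach": "north"},
            {"vehicle_id": "C3", "approach": "east"}]
    for msg in groupcast_messages(traffic_control_plan(group_vehicles(bsms))):
        print(msg)
```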
Please summarize the input | IMPLEMENTING CONFIDENCE METRICS IN VEHICLE-TO-EVERYTHING (V2X) COMMUNICATIONS. Certain aspects of the present disclosure provide techniques for enhancing vehicle operations safety using coordinated vehicle platooning or enhancing platooning safety against location spoofing attacks. In one example, when a source user equipment (UE) detects a potential spoofing event associated with location information being altered in an unauthorized manner, the source UE may transmit a request to a platoon control system (PCS) to join a vehicle platoon. In another example, a first UE associated with a lead vehicle in an existing platoon may detect a potential spoofing event associated with location information being altered in an unauthorized manner. The lead vehicle may transmit to a second UE of another vehicle in the platoon an indication of the detection and a request to exchange the respective roles in the platoon. The PCS may also monitor the conditions of the first and the second UEs, and arrange for the platoon reorganization. WHAT IS CLAIMED IS:
| 1. A source user equipment (UE) for wireless communications, comprising: a memory; and a processor coupled with the memory, the processor and the memory configured to: detect a potential spoofing event associated with location information being altered in an unauthorized manner; and transmit, in response to the detected potential spoofing event, a request to a platoon control system (PCS) to join a vehicle platoon, wherein the request includes an indication of the detected potential spoofing event.
| 2. The source UE of claim 1, wherein the request to the PCS comprises a confidence metric that indicates a probability that the source UE is receiving spoofed location information.
| 3. The source UE of claim 2, wherein the processor and the memory are configured to detect the potential spoofing event by detecting that the confidence metric is above a threshold value.
| 4. The source UE of claim 2, wherein the confidence metric indicates one of a plurality of levels of accuracy of a corresponding level of certainty of the potential spoofing event, and wherein a threshold value corresponds to a predefined level of accuracy.
| 5. The source UE of claim 2, wherein the processor and the memory are configured to detect the potential spoofing event by receiving one or more signals from at least one of a network entity or a second UE in one or more basic safety messages (BSMs).
| 6. The source UE of claim 5, wherein the confidence metric is determined by comparing at least one characteristic indicated by the one or more signals and a characteristic indicated by received location information. 48
| 7. The source UE of claim 2, wherein the processor and the memory are configured to detect the potential spoofing event by measuring, using at least one onboard sensor independent from the location information, a movement attribute of the source UE to examine a validity of the location information.
| 8. The source UE of claim 2, wherein the request further comprises at least one of: a vehicle identifier, destination information, or a source positioning location.
| 9. The source UE of claim 1, wherein the request further indicates at least one of: an occupancy parameter of a vehicle associated with the source UE; an autonomy level of a vehicle associated with the UE; or a travel preference parameter.
| 10. The source UE of claim 1, wherein the processor and the memory are further configured to : receive a response indicating confirmation that the source UE is allowed to join a vehicle platoon assigned by the PCS; receive an invitation corresponding to the confirmation from a lead UE of a lead vehicle of the vehicle platoon; and abstain from transmitting vehicle-to-everything (V2X) messages upon receiving the response.
| 11. The source UE of claim 1, wherein the processor and the memory are further configured to: receive an alert notice from the PCS when the PCS does not have an available vehicle platoon to assign, wherein the alert notice comprises alert messages requesting manual control.
| 12. A network entity for wireless communications, comprising: a memory; and a processor coupled with the memory, the processor and the memory configured to: receive a request from a user equipment (UE), the request triggered by a detection of a potential spoofing event at the UE; and transmit, to the UE, an assignment of a vehicle platoon for the UE to join based on the request.
| 13. The network entity of claim 12, wherein the request further includes a confidence metric that indicates a probability that the UE is receiving spoofed location information of the potential spoofing event.
| 14. The network entity of claim 13, wherein the request comprises at least one of: a vehicle identifier, destination information, a source positioning location, or the confidence metric.
| 15. The network entity of claim 12, wherein the assignment has a higher priority when the UE is associated with an autonomous vehicle than when the UE is associated with a non-autonomous vehicle.
| 16. The network entity of claim 12, wherein the processor and the memory are configured to: transmit, to at least one platoon UE of a corresponding vehicle in the vehicle platoon, an instruction for the at least one platoon UE to transmit a beacon to the UE, wherein the beacon is to be measured by the UE.
| 17. The network entity of claim 12, wherein the processor and the memory are configured to: transmit, to a roadside unit (RSU), an instruction for the RSU to measure a location of the UE for comparison with location information therein and assessment of the potential spoofing event; and confirm the assignment of the vehicle platoon based on the location measured by the RSU.
| 18. A first user equipment (UE), comprising: a memory; and a processor coupled with the memory, the processor and the memory configured to: detect a potential spoofing event associated with location information being altered in an unauthorized manner; transmit, to a second UE, an indication of the detection of the potential spoofing event, wherein the first UE and the second UE are associated with vehicles in a platoon; and transmit, to the second UE, a request to exchange a role of a vehicle corresponding to the first UE in the platoon with a role of a vehicle corresponding to the second UE in the platoon.
| 19. The first UE of claim 18, wherein the indication is carried in one or more basic safety messages (BSMs).
| 20. The first UE of claim 18, wherein the role of the vehicle corresponding to the first UE is a lead vehicle managing UEs of other vehicles in the platoon, and wherein the role of the vehicle corresponding to the second UE is a secondary vehicle managed by the lead vehicle.
| 21. The first UE of claim 18, wherein the processor and the memory are configured to: detect the potential spoofing event by determining a first confidence metric of the first UE, the first confidence metric associated with a position accuracy based on a verification of global navigation satellite system (GNSS) position information received at the first UE.
| 22. The first UE of claim 21, wherein the processor and the memory are further configured to: detect the potential spoofing event by determining that the first confidence metric indicating a probability that the first UE is receiving spoofed location information is above a threshold value.
| 23. The first UE of claim 21, wherein the processor and the memory are further configured to: receive from at least the second UE, data of sensors thereof, wherein the verification of the GNSS position information is based on the data of sensors.
| 24. The first UE of claim 21, wherein the processor and the memory are further configured to: receive data from a roadside unit (RSU), wherein the verification of the GNSS position information is further based on the data of the RSU.
| 25. The first UE of claim 21, wherein the processor and the memory are further configured to: transmit, an indication of the potential spoofing event, to a platoon control system (PCS) in control of the platoon when the confidence metric is above a threshold value.
| 26. The first UE of claim 21, wherein the processor and the memory are further configured to: request, from the second UE in the platoon, a second confidence metric of the second UE, the second confidence metric indicating a probability that the second UE is receiving spoofed location information, wherein transmitting the request to exchange roles in the platoon is based on the second confidence metric being below a threshold value and indicating an absence of spoofing attack to the second UE; and transmit, to the second UE in the platoon, an indication for the second UE to leave the platoon based on the second confidence metric being greater than or equal to the threshold value.
| 27. An apparatus for wireless communications, comprising: a memory; and a processor coupled with the memory, the processor and the memory configured to: receive an indication from a first user equipment (UE) of a first vehicle in a vehicle platoon, the indication triggered by the first UE detecting a first potential spoofing event associated with location information being altered in an unauthorized manner; and transmit, to a second UE in the vehicle platoon, an indication for the second UE to assume functionalities performed by the first UE in the vehicle platoon.
| 28. The apparatus of claim 27, wherein the first UE is a lead UE configured to perform functionalities including management of other UEs in the vehicle platoon.
| 29. The apparatus of claim 27, wherein the second UE and the first UE are in sidelink communication, and the second UE is managed by the first UE before the indication of the first potential spoofing event.
| 30. The apparatus of claim 27, wherein the first potential spoofing event is determined based on that a first confidence metric of the first UE indicating a probability that the first UE is receiving spoofed location information is above a threshold value. | The equipment has a processor coupled with a memory. The processor detects a potential spoofing event associated with location information altered in an unauthorized manner and transmits a request to a platoon control system (PCS) to join a vehicle platoon in response to the detected potential spoofing event, where the request includes an indication of the detected potential spoofing event and the confidence metric indicates levels of accuracy of a corresponding level of certainty of the potential spoofing event. The processor receives a response indicating confirmation of a source user equipment allowed to join a vehicle platoon assigned by the PCS. INDEPENDENT CLAIMS are included for:(1) a network entity for performing wireless communication for coordinating vehicle platooning;(2) an apparatus for performing wireless communication for coordinating vehicle platooning. Source user equipment e.g. mobile station for performing wireless communication for coordinating vehicle platooning for use in telecommunication services e.g. telephony. Uses include but are not limited to video, data, messaging, broadcasts, a terminal, an access terminal, a subscriber unit, a station, a customer premises equipment, a cellular phone, an intelligent phone, a personal digital assistant (PDA), a wireless modem, a wireless communication device, a handheld device, a laptop computer, a cordless phone, a wireless local loop (WLL) station, a tablet computer, a camera, a gaming device, a netbook, a intelligent book, an ultrabook and a medical device. The equipment enhances vehicle operations safety using coordinated vehicle platooning or platooning safety from location spoofing attacks or attempts to alter location information in unauthorized manners. The equipment realizing improved spectral efficiency, reduced operation cost and increased reliability, maintains a minimal distance or headway between moving vehicles at high speeds and avoids use of potentially spoofed location information. The drawing shows a schematic view of a source user equipment. 400Vehicle-to-everything system 402Vehicle 406Wireless communication link 408Vehicle-to-vehicle interface 410Roadside service unit |
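The sketch below illustrates one plausible reading of claims 1-3 and 7 above: a confidence metric is computed by comparing the travel distance implied by GNSS fixes against an independent onboard odometry measurement, and if it exceeds a threshold the UE assembles a request to the platoon control system carrying the data elements listed in claim 8. The comparison rule, the threshold value, and the message fields are illustrative assumptions only.

```python
import math

SPOOFING_THRESHOLD = 0.6   # illustrative; claim 3 only requires "above a threshold value"

def confidence_metric(gnss_track, odometer_distance_m):
    """Probability-like score that received location information is spoofed (claims 2 and 7).

    Compares the distance implied by consecutive GNSS fixes against an onboard
    odometry measurement that is independent of the location information.
    """
    gnss_distance_m = sum(
        math.dist(gnss_track[i], gnss_track[i + 1]) for i in range(len(gnss_track) - 1)
    )
    if odometer_distance_m == 0:
        return 0.0
    mismatch = abs(gnss_distance_m - odometer_distance_m) / odometer_distance_m
    return min(1.0, mismatch)          # clamp to [0, 1]

def build_platoon_request(vehicle_id, destination, position, metric):
    """Request sent to the platoon control system when spoofing is suspected (claims 1 and 8)."""
    return {"vehicle_id": vehicle_id, "destination": destination,
            "source_position": position, "confidence_metric": round(metric, 2),
            "event": "potential_spoofing"}

if __name__ == "__main__":
    gnss = [(0.0, 0.0), (0.0, 120.0), (0.0, 260.0)]   # metres, local frame
    metric = confidence_metric(gnss, odometer_distance_m=150.0)
    if metric > SPOOFING_THRESHOLD:
        print(build_platoon_request("UE-42", "depot-7", gnss[-1], metric))
```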
Please summarize the input | Method and apparatus for vehicle steering plan and messaging. The present invention provides techniques that may be implemented using various methods and/or devices in a vehicle to utilize vehicle external sensor data, vehicle internal sensor data, vehicle capability and external V2X input to determine, transmit, receive and use the V2X information and control data sent between the vehicle and the roadside unit (RSU), so as to determine intersection entry and vehicle behavior at or near an intersection.|1. A method for entering a crossroad of a self-control vehicle, comprising: determining a braking distance for the self-control vehicle based on a vehicle external sensor, a vehicle internal sensor, vehicle capability or external V2X input, or a combination thereof; sending a first message from the self-control vehicle, wherein the first message comprises an identification data element or vehicle type or vehicle priority, or a combination thereof, for the self-control vehicle and a braking distance data element for the self-control vehicle; receiving a second message from the roadside unit RSU based at least in part on the braking distance for the self-control vehicle, the second message comprising one or more instructions related to intersection entry of the self-control vehicle; and controlling the intersection entry of the self-control vehicle in response to the one or more instructions received from the RSU.
| 2. The method according to claim 1, further comprising sending a third message from the self-control vehicle to the RSU prior to the second message to request entry of a crossroad.
| 3. The method of entering a crossroad according to claim 1, wherein the braking distance for the self-control vehicle is determined based at least in part on the speed of the self-control vehicle.
| 4. The method for entering a crossroad according to claim 3, wherein the braking distance for the self-control vehicle is determined based at least in part on a tire pressure or weather condition or tire traction data for the self-control vehicle or a combination thereof.
| 5. The method according to claim 1, wherein the braking distance for the self-control vehicle is shorter in the autonomous mode than in the manual mode.
| 6. The method of entering a crossroad according to claim 1, wherein the first message is a broadcast message.
| 7. The method according to claim 1, wherein the first message is a peer-to-peer message.
| 8. The intersection entry method according to claim 1, wherein the first message is a basic safety message or a cooperative awareness message.
| 9. A self-control vehicle, comprising: one or more wireless transceivers; a vehicle internal sensor; a vehicle external sensor; a memory; and one or more processors, the one or more processors communicatively coupled to the one or more wireless transceivers, the vehicle internal sensor, the vehicle external sensor and the memory; wherein the one or more processors are configured to: determine a braking distance for the self-control vehicle based on the vehicle external sensor, the vehicle internal sensor, vehicle capability or external V2X input, or a combination thereof; send a first message from the one or more wireless transceivers, wherein the first message comprises an identification data element or vehicle type or vehicle priority, or a combination thereof, for the self-control vehicle and a braking distance data element for the self-control vehicle; receive, at the one or more wireless transceivers, a second message from the roadside unit RSU based at least in part on the braking distance of the self-control vehicle, the second message comprising one or more instructions related to intersection entry of the self-control vehicle; and control the intersection entry of the self-control vehicle in response to the one or more instructions received from the RSU.
| 10. The self-control vehicle according to claim 9, wherein the one or more processors are further configured to send a third message from the one or more wireless transceivers to the RSU prior to the second message to request entry of a crossroad.
| 11. The self-control vehicle according to claim 9, wherein the braking distance for the self-control vehicle is determined based at least in part on the speed of the self-control vehicle and empirical stopping distance data associated with the speed of the self-control vehicle.
| 12. The self-control vehicle according to claim 11, wherein the braking distance for the self-control vehicle is determined based at least in part on a tire pressure or weather condition or tire traction data for the self-control vehicle or a combination thereof.
| 13. The self-control vehicle according to claim 9, wherein the braking distance for the self-control vehicle is shorter in the autonomous mode than in the manual mode.
| 14. The self-control vehicle according to claim 9, wherein the first message is a broadcast message.
| 15. The self-control vehicle according to claim 9, wherein the first message is a peer-to-peer message.
| 16. The self-control vehicle according to claim 9, wherein the first message is a basic safety message or a cooperative awareness message.
| 17. A self-control vehicle, comprising: means for determining a braking distance for the self-control vehicle based on a vehicle external sensor, a vehicle internal sensor, a vehicle capability or an external V2X input, or a combination thereof; means for sending a first message from the self-control vehicle, wherein the first message comprises an identification data element or vehicle type or vehicle priority, or a combination thereof, for the self-control vehicle and a braking distance data element for the self-control vehicle; means for receiving a second message from a roadside unit RSU based at least in part on the braking distance for the self-control vehicle, the second message comprising one or more instructions related to crossroad entry of the self-control vehicle; and means for controlling the crossroad entry of the self-control vehicle in response to the one or more instructions received from the RSU.
| 18. The self-control vehicle according to claim 17, further comprising means for sending a third message from the self-control vehicle to the RSU prior to the second message to request entry of a crossroad.
| 19. The self-control vehicle according to claim 17, wherein the first message is a broadcast message.
| 20. The self-control vehicle according to claim 17, wherein the first message is a peer-to-peer message.
| 21. The self-control vehicle according to claim 17, wherein the first message is a basic safety message or a cooperative awareness message.
| 22. A non-transitory computer-readable medium having stored thereon computer-readable instructions for causing one or more processors on a self-control vehicle to perform the following operations: determining a braking distance for the self-control vehicle based on a vehicle external sensor, a vehicle internal sensor, vehicle capability or external V2X input, or a combination thereof; sending a first message from the self-control vehicle, wherein the first message comprises an identification data element or vehicle type or vehicle priority, or a combination thereof, for the self-control vehicle and a braking distance data element for the self-control vehicle; receiving a second message from the roadside unit RSU based at least in part on the braking distance for the self-control vehicle, the second message comprising one or more instructions related to intersection entry of the self-control vehicle; and controlling the intersection entry of the self-control vehicle in response to the one or more instructions received from the RSU.
| 23. The non-transitory computer-readable medium according to claim 22, further comprising instructions that cause the one or more processors to send a third message to the RSU prior to the second message to request entry of a crossroad.
| 24. The non-transitory computer readable medium according to claim 22, wherein the first message is a broadcast message.
| 25. The non-transitory computer-readable medium according to claim 22, wherein the first message is a peer-to-peer message.
| 26. The non-transitory computer-readable medium according to claim 22, wherein the first message is a basic safety message or a cooperative awareness message. | The method involves receiving a first message from a first vehicle at an ego vehicle. The first message includes an identification data element for the first vehicle, an autonomous vehicle status data element for the first vehicle or a braking distance data element for the first vehicle or a combination. A second message is received from a second vehicle at the ego vehicle. The second message comprises an identification data element for the second vehicle. A target space is determined based upon a size of the ego vehicle, the autonomous vehicle status data element for the first vehicle, the autonomous vehicle status data element for the second vehicle, the braking distance data element for the first vehicle or the braking distance data element for the second vehicle, or a combination. An INDEPENDENT CLAIM is included for an ego vehicle with wireless transceivers. Method for messaging an automotive device to facilitate maneuvering of an ego vehicle (claimed). Method for messaging an automotive device to facilitate vehicle maneuvering increases vehicle turning radius at current speed and reduces maneuverability at current speed, and avoids collisions during an emergency stop of the vehicles. The drawing shows a block diagram of a device for determination and communication of a Vehicle-to-everything (V2X) capability data element value based on vehicle internal and external sensors. 100Vehicle external sensors 110Vehicle internal sensors 120Vehicle capabilities 910Processor
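A minimal sketch of the braking-distance determination and first-message construction in claims 1 and 3-5 above. The rule-of-thumb distance formula and the correction factors for weather, tire pressure, and autonomous mode are illustrative assumptions; the claims only require that such inputs influence the reported braking distance data element.

```python
def braking_distance_m(speed_kph, weather="dry", tire_pressure_ok=True, autonomous=True):
    """Rough braking-distance estimate from speed plus correction factors (claims 3-5).

    The quadratic speed term and the correction factors are placeholders.
    """
    base = (speed_kph / 10.0) ** 2 / 2.0          # textbook rule-of-thumb estimate
    if weather == "wet":
        base *= 1.5
    if not tire_pressure_ok:
        base *= 1.2
    if autonomous:
        base *= 0.8                                # shorter reaction allowance (claim 5)
    return round(base, 1)

def first_message(vehicle_id, vehicle_type, priority, distance_m):
    """Broadcast-style message carrying the data elements listed in claim 1."""
    return {"msg": "BSM_EXT", "vehicle_id": vehicle_id, "vehicle_type": vehicle_type,
            "priority": priority, "braking_distance_m": distance_m}

if __name__ == "__main__":
    d = braking_distance_m(72.0, weather="wet", tire_pressure_ok=True, autonomous=True)
    print(first_message("EGO-01", "passenger_car", priority=2, distance_m=d))
```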
Please summarize the input | Methods and systems for managing interactions between vehicles with varying levels of autonomy. Methods, devices and systems enable controlling an autonomous vehicle by identifying a vehicle that is within a threshold distance of the autonomous vehicle, determining an autonomous capability metric (ACM) of the identified vehicle, determining whether the ACM of the identified vehicle is greater than a first threshold, determining whether the ACM of the identified vehicle is less than a second threshold, and adjusting a driving parameter of the autonomous vehicle so that the autonomous vehicle is more or less reliant on the capabilities of the identified vehicle based on whether the ACM of the identified vehicle exceeds the thresholds. What is claimed is:
| 1. A method of controlling an autonomous vehicle, comprising:
identifying, via a processor of the autonomous vehicle, a vehicle that is within a threshold distance of the autonomous vehicle;
determining an autonomous capability metric (ACM) of the identified vehicle, wherein the ACM is a vector data structure including a plurality of values each representing a capability of the identified vehicle, the ACM being dynamically determined based on real-time data from the identified vehicle and certificates received via cellular vehicle-to-everything (C-V2X) communications;
determining whether the ACM of the identified vehicle is greater than a first threshold; and
adjusting a driving parameter of the autonomous vehicle based on capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold.
| 2. The method of claim 1, wherein adjusting the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle exceeds the first threshold comprises decreasing a minimum following distance to be maintained between the autonomous vehicle and the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold.
| 3. The method of claim 1, further comprising:
determining whether the ACM of the identified vehicle is less than a second threshold in response to determining that the ACM of the identified vehicle is not greater than the first threshold; and
adjusting the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold.
| 4. The method of claim 3, wherein adjusting the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold comprises increasing a minimum following distance to be maintained between the autonomous vehicle and the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold.
| 5. The method of claim 1, wherein identifying the vehicle that is within the threshold distance of the autonomous vehicle comprises identifying a vehicle that is in front of the autonomous vehicle and within the threshold distance of the autonomous vehicle.
| 6. The method of claim 1, wherein determining the ACM of the identified vehicle comprises determining a value that identifies:
a current level of autonomy of the identified vehicle;
an autonomous capability of the identified vehicle; or
whether the identified vehicle includes an advanced autonomous control system.
| 7. The method of claim 1, wherein:
determining whether the ACM of the identified vehicle is greater than the first threshold comprises applying the plurality of values to a plurality of decision nodes that each evaluate a different feature, factor or data point.
| 8. The method of claim 7, wherein applying the plurality of values to the plurality of decision nodes that each evaluate the different feature, factor or data point comprises applying one or more of the plurality of values to a decision node that evaluates:
whether vehicle-to-vehicle (V2V) communication circuitry is present in the identified vehicle;
whether an accuracy range of a sensor in the identified vehicle is greater than a threshold value; or
whether a thickness of each brake pad in the identified vehicle exceeds a threshold thickness of friction material.
| 9. A processor for an autonomous vehicle, wherein the processor is configured with processor executable instructions to:
identify a vehicle that is within a threshold distance of the autonomous vehicle;
determine an autonomous capability metric (ACM) of the identified vehicle, wherein the ACM is a vector data structure including a plurality of values each representing a capability of the identified vehicle, the ACM being dynamically determined based on real-time data from the identified vehicle and certificates received via cellular vehicle-to-everything (C-V2X) communications;
determine whether the ACM of the identified vehicle is greater than a first threshold; and
adjust a driving parameter of the autonomous vehicle based on capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold.
| 10. The processor of claim 9, wherein the processor is configured with processor executable instructions to adjust the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle exceeds the first threshold by decreasing a minimum following distance to be maintained between the autonomous vehicle and the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold.
| 11. The processor of claim 9, wherein the processor is further configured with processor executable instructions to:
determine whether the ACM of the identified vehicle is less than a second threshold in response to determining that the ACM of the identified vehicle is not greater than the first threshold; and
adjust the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold.
| 12. The processor of claim 11, wherein the processor is configured with processor executable instructions to adjust the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold by increasing a minimum following distance to be maintained between the autonomous vehicle and the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold.
| 13. The processor of claim 9, wherein the processor is further configured with processor executable instructions to identify the vehicle that is within the threshold distance of the autonomous vehicle by identifying a vehicle that is in front of the autonomous vehicle and within the threshold distance of the autonomous vehicle.
| 14. The processor of claim 9, wherein the processor is configured with processor executable instructions to determine the ACM of the identified vehicle by determining a value that identifies:
a current level of autonomy of the identified vehicle;
an autonomous capability of the identified vehicle; or
whether the identified vehicle includes an advanced autonomous control system.
| 15. The processor of claim 9, wherein the processor is configured with processor executable instructions to:
determine whether the ACM of the identified vehicle is greater than the first threshold by applying the plurality of values to a plurality of decision nodes that each evaluate a different feature, factor or data point.
| 16. The processor of claim 15, wherein the processor is configured with processor executable instructions to apply the plurality of values to the plurality of decision nodes that each evaluate the different feature, factor or data point by applying one or more of the plurality of values to a decision node that evaluates:
whether vehicle-to-vehicle (V2V) communication circuitry is present in the identified vehicle;
whether an accuracy range of a sensor in the identified vehicle is greater than a threshold value; or
whether a thickness of each brake pad in the identified vehicle exceeds a threshold thickness of friction material.
| 17. A non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of an autonomous vehicle to perform operations comprising:
identifying a vehicle that is within a threshold distance of the autonomous vehicle;
determining an autonomous capability metric (ACM) of the identified vehicle, wherein the ACM is a vector data structure including a plurality of values each representing a capability of the identified vehicle, the ACM being dynamically determined based on real-time data from the identified vehicle and certificates received via cellular vehicle-to-everything (C-V2X) communications;
determining whether the ACM of the identified vehicle is greater than a first threshold; and
adjusting a driving parameter of the autonomous vehicle based on capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold.
| 18. The non-transitory processor-readable storage medium of claim 17, wherein the stored processor-executable instructions are configured to cause the processor of the autonomous vehicle to perform the operations such that adjusting the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle exceeds the first threshold comprises decreasing a minimum following distance to be maintained between the autonomous vehicle and the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold.
| 19. The non-transitory processor-readable storage medium of claim 17, wherein the stored processor-executable instructions are configured to cause the processor of the autonomous vehicle to perform operations further comprising:
determining whether the ACM of the identified vehicle is less than a second threshold in response to determining that the ACM of the identified vehicle is not greater than the first threshold; and
adjusting the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold.
| 20. An autonomous vehicle, comprising:
means for identifying a vehicle that is within a threshold distance of the autonomous vehicle;
means for determining an autonomous capability metric (ACM) of the identified vehicle, wherein the ACM is a vector data structure including a plurality of values each representing a capability of the identified vehicle, the ACM being dynamically determined based on real-time data from the identified vehicle and certificates received via cellular vehicle-to-everything (C-V2X) communications;
means for determining whether the ACM of the identified vehicle is greater than a first threshold; and
means for adjusting a driving parameter of the autonomous vehicle based on capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold. | The method (1100) involves identifying (902) a vehicle that is within a threshold distance of an autonomous vehicle by a processor of the autonomous vehicle. An autonomous capability metric (ACM) of the identified vehicle is determined (1104). A determination is made (1106) to check whether the ACM is greater than a first threshold. A driving parameter of the autonomous vehicle is adjusted (1108) based on capabilities of the identified vehicle in response to determining that the determined ACM exceeds the first threshold by decreasing a minimum following distance to be maintained between the vehicle and an identified vehicle e.g. car. The vehicle is in front of and within the threshold distance. INDEPENDENT CLAIMS are included for: (1) a processor for an autonomous vehicle; (2) a non-transitory processor-readable storage medium for storing processor-executable instructions; (3) an autonomous vehicle comprises a unit for identifying a vehicle that is within a threshold distance of the autonomous vehicle. Method for controlling an autonomous vehicle, such as a car. The method enables utilizing vehicle-based communications for safer and more efficient use of motor vehicles and transportation resources. The method allows the autonomous vehicle to determine the autonomous capability metric of the identified vehicles and adjust the driving parameter of the autonomous vehicles based on the determined autonomous capability metrics of the vehicles, thus improving safety and performance of the vehicle in an efficient manner. The drawing shows a flow chart of the method for controlling an autonomous vehicle. 902Identifying a vehicle that is within a threshold distance of the autonomous vehicle 1100Method for controlling an autonomous vehicle 1104Determining ACM of the identified vehicle 1106Determining whether the ACM of the identified vehicle is greater than a first threshold 1108Adjusting a driving parameter of the autonomous vehicle
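The sketch below shows one way the ACM vector and decision nodes of claims 1, 7 and 8 could be evaluated, and how the result might feed the following-distance adjustment of claims 2 and 4. The field names, weights, and 0.7/0.3 thresholds are assumptions; the claims leave all of these unspecified.

```python
from dataclasses import dataclass

@dataclass
class ACM:
    """Vector-style autonomous capability metric (claim 1); field names are illustrative."""
    has_v2v: bool
    sensor_accuracy_m: float      # smaller is better
    brake_pad_mm: float
    autonomy_level: int           # e.g. an SAE-style 0-5 level

def acm_score(acm: ACM) -> float:
    """Apply simple decision nodes (claims 7-8) and fold them into one score."""
    score = 0.0
    score += 0.3 if acm.has_v2v else 0.0
    score += 0.3 if acm.sensor_accuracy_m <= 0.5 else 0.0
    score += 0.2 if acm.brake_pad_mm >= 4.0 else 0.0
    score += 0.2 * (acm.autonomy_level / 5.0)
    return score

def adjusted_following_distance(current_m: float, score: float, high=0.7, low=0.3) -> float:
    """Decrease the gap behind a highly capable vehicle, increase it behind a weak one (claims 2 and 4)."""
    if score > high:
        return current_m * 0.8
    if score < low:
        return current_m * 1.3
    return current_m

if __name__ == "__main__":
    lead = ACM(has_v2v=True, sensor_accuracy_m=0.2, brake_pad_mm=6.0, autonomy_level=4)
    print(adjusted_following_distance(30.0, acm_score(lead)))
```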
Please summarize the input | RESOURCE MANAGEMENT FOR COMMUNICATION AND SENSING SERVICES. Various aspects of the present disclosure generally relate to wireless communication. In some aspects, a first core network entity may receive a first request associated with initiation of a sensing service associated with a user equipment (UE). The first core network entity may receive, from at least one other core network entity and based at least in part on the first request, one or more sensing session parameters associated with the sensing service. The first core network entity may provide, to a second core network entity and based at least in part on the sensing session parameters, a second request to establish a virtual communication session with the UE, the request including one or more communication session parameters. Numerous other aspects are described. WHAT IS CLAIMED IS:
| 1. A first core network entity, comprising: one or more memories; and one or more processors, coupled to the one or more memories, configured to: receive a first request associated with initiation of a sensing service associated with a user equipment (UE); receive, from at least one other core network entity and based at least in part on the first request, one or more sensing session parameters associated with the sensing service; and provide, to a second core network entity and based at least in part on the sensing session parameters, a second request to establish a virtual communication session with the UE, the request including one or more communication session parameters.
| 2. The first core network entity of claim 1, wherein the one or more processors are further configured to: map the one or more sensing session parameters to the one or more communication session parameters.
| 3. The first core network entity of claim 1, wherein the one or more sensing session parameters include at least one of: a sensing type parameter, a range parameter, a range resolution parameter, a velocity parameter, a velocity resolution parameter, an azimuth field of view parameter, an angular resolution parameter, a maximum number of detected targets, a data rate parameter, or a latency parameter.
| 4. The first core network entity of claim 1, wherein the one or more communication session parameters include at least one of: a quality of service parameter, a signal-to-interference-plus-noise ratio (SINR) parameter, a data rate parameter, or a latency parameter.
| 5. The first core network entity of claim 1, wherein the one or more sensing session parameters are based at least in part on at least one of: subscription information associated with the UE and the sensing service, or policy information associated with the sensing service.
| 6. The first core network entity of claim 1, wherein the first core network entity comprises a non-communication session management function (N-SMF) entity and the second core network entity comprises a session management function (SMF) entity.
| 7. The first core network entity of claim 1, wherein the at least one other core network entity comprises at least one of: a non-communication policy control function (N-PCF) entity, or a unified data management (UDM) entity.
| 8. The first core network entity of claim 1, wherein the first request is received from an access and mobility management function (AMF) entity.
| 9. The first core network entity of claim 1, wherein the first request is associated with a sensing network slice that indicates the first request is for the sensing service.
| 10. The first core network entity of claim 1, wherein the first request is associated with a dynamic network name (DNN) or access point name (APN) that indicates the first request is for the sensing service.
| 11. The first core network entity of claim 1, wherein the one or more processors, to receive the one or more sensing session parameters, are configured to: receive information indicating one or more policies for managing the sensing service.
| 12. The first core network entity of claim 11, wherein the information indicating the one or more policies is based at least in part on information that identifies a location of the UE.
| 13. The first core network entity of claim 11, wherein the information indicating the one or more policies is received from a non-communication policy control function (N-PCF) entity.
| 14. The first core network entity of claim 1, wherein the one or more processors are further configured to: determine, based at least in part on the sensing session parameters, a communication service type for the virtual communication session; and indicate, to the second core network entity, that the virtual communication session is associated with the communication service type.
| 15. The first core network entity of claim 14, wherein the communication service type is associated with at least one of: vehicle-to-everything (V2X) communications, or unmanned autonomous vehicle (UAV) communications.
| 16. The first core network entity of claim 1, wherein the one or more processors are further configured to: provide, for a network node associated with the UE, embedded radio level operation configuration information in a sensing session specific container.
| 17. The first core network entity of claim 1, wherein the one or more processors are further configured to: provide, to a network node, a third request to establish the sensing service between the network node and the UE.
| 18. The first core network entity of claim 17, wherein the third request includes information indicating whether the sensing service is for a radar service or positioning service.
| 19. The first core network entity of claim 17, wherein the third request includes information indicating the one or more sensing session parameters.
| 20. The first core network entity of claim 17, wherein the third request includes information indicating whether the sensing service is for UE-based sensing or network node-based sensing.
| 21. The first core network entity of claim 17, wherein the third request includes information indicating a priority associated with the sensing service.
| 22. A first core network entity, comprising: one or more memories; and one or more processors, coupled to the one or more memories, configured to: receive a first request associated with a communication session associated with a user equipment (UE), wherein the first request is associated with one or more first communication session parameters; receive, from a second core network entity, a second request to establish a virtual communication session associated with the UE, wherein the second request is associated with one or more second communication session parameters, and wherein the virtual communication session corresponds to a sensing service; and provide, to a network node, information indicating the one or more first communication session parameters for the communication session and the one or more second communication session parameters for the virtual communication session.
| 23. The first core network entity of claim 22, wherein the one or more processors are further configured to: receive, from the second core network entity and embedded in a sensing session specific container, radio level operation configuration information for the network node; and provide, to the network node, the radio level operation configuration information.
| 24. The first core network entity of claim 22, wherein the first core network entity comprises a session management function (SMF) entity and the second core network entity comprises a non-communication session management function (N-SMF) entity.
| 25. The first core network entity of claim 22, wherein the one or more processors are further configured to: provide, to the network node, sensing information indicating that the virtual communication session is for the sensing service, to be established between the network node and the UE.
| 26. The first core network entity of claim 25, wherein the sensing information includes information indicating one or more sensing session parameters.
| 27. A network node, comprising: one or more memories; and one or more processors, coupled to the one or more memories, configured to: receive, from a first core network entity, a first request associated with initiation of a sensing service associated with a user equipment (UE), wherein the first request includes information identifying one or more sensing session parameters associated with the sensing service; determine, based at least in part on the one or more sensing session parameters and one or more communication session parameters associated with a communication session between the UE and the network node, one or more resources for the sensing service; and transmit, to the UE, information identifying the one or more resources for the sensing service.
| 28. The network node of claim 27, wherein the one or more processors are further configured to: receive, from a second core network entity, a second request including the one or more communication session parameters.
| 29. The network node of claim 27, wherein the one or more processors, to determine the one or more resources, are configured to: determine a first portion of joint communication and sensing resources for the sensing service; determine a second portion of the joint communication and sensing resources for the communication session; and determine, as the one or more resources, the first portion of the joint communication and sensing resources.
| 30. A method of wireless communication performed by a first core network entity, comprising: receiving a first request associated with initiation of a sensing service associated with a user equipment (UE); receiving, from at least one other core network entity and based at least in part on the first request, one or more sensing session parameters associated with the sensing service; and providing, to a second core network entity and based at least in part on the sensing session parameters, a second request to establish a virtual communication session with the UE, the second request including one or more communication session parameters. | The entity has a processor coupled to a memory configured to receive a first request associated with initiation of a sensing service associated with a user equipment (UE) (120) and sensing session parameters associated with the sensing service based on the first request, where the processor provides a second request including communication session parameters to establish a virtual communication session with the UE and the sensing session parameters include one of sensing type parameter, range parameter, range resolution parameter, velocity parameter, velocity resolution parameter, azimuth field of view parameter and angular resolution parameter and the communication session parameters include one of quality-of-service parameter, signal-to-interference-plus-noise ratio (SINR) parameter, data rate parameter or latency parameter. The processor maps the sensing session parameters to the communication session parameters. An INDEPENDENT CLAIM is also included for a method for performing wireless communication by a core network entity. Core network entity for performing wireless communication for facilitating resource management for communication and sensing services. Uses include but are not limited to telephony, video, data, messaging and broadcasts. The entity effectively supports mobile broadband internet access by improving spectral efficiency, lowering costs, improving services, making use of new spectrum and/or providing new-radio (NR) services. The drawing shows a block diagram of a core network for facilitating resource management for communication and sensing services. 100Wireless network 120UE 420Unified data repository 455Message bus 605Core network
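As an illustration of the parameter mapping in claims 2-4 above, the sketch below translates a few sensing session parameters into communication session parameters. The specific mapping rules, and the QoS/SINR numbers, are placeholders; the claims only require that such a mapping exists, and a real N-SMF would follow operator policy.

```python
def map_sensing_to_comm(sensing: dict) -> dict:
    """Map sensing-session parameters to communication-session parameters (claims 2-4).

    All rules and constants below are illustrative assumptions.
    """
    comm = {}
    # Tight range resolution implies wide bandwidth, modelled here as a higher data rate.
    if sensing.get("range_resolution_m", 10.0) <= 0.5:
        comm["data_rate_mbps"] = 100
    else:
        comm["data_rate_mbps"] = 20
    # The sensing latency budget carries straight over to the communication session.
    comm["latency_ms"] = sensing.get("latency_ms", 50)
    # A crude quality-of-service class and SINR choice based on the sensing load.
    comm["qos_class"] = "high" if sensing.get("max_targets", 1) > 10 else "normal"
    comm["min_sinr_db"] = 15 if sensing.get("sensing_type") == "radar" else 5
    return comm

if __name__ == "__main__":
    sensing_params = {"sensing_type": "radar", "range_resolution_m": 0.3,
                      "latency_ms": 20, "max_targets": 16}
    print(map_sensing_to_comm(sensing_params))   # would seed the second request of claim 1
```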
Please summarize the input | Location of a suspicious vehicle using V2X communication. Location of a suspicious vehicle (SV), implemented within a detection entity (DE), such as another vehicle, the detection entity being distinct from the suspect vehicle, from a V2X communication.|1. [Claim 1] Method for locating a suspect vehicle (SV), implemented within a detection entity (DE), the detection entity being distinct from the suspect vehicle, comprising the steps of:
- reception (3) from a remote server (SRV) of a request for the location of the suspect vehicle, the request including an identification number of the suspect vehicle;
- generation (5) of an identification request message;
- transmission (7) of the message on a V2X network, that is to say by a direct link between the entity and the suspect vehicle;
- upon receipt of a positive response from the suspect vehicle, transmission (17) to the remote server of geolocation information of the entity and/or the suspect vehicle.
| 2. [Claim 2] Method according to claim 1, in which the suspect vehicle is geolocated and in which the identification request message further comprises a request for geolocation of the suspect vehicle, so that the positive response from the suspect vehicle further comprises the geolocation of the suspect vehicle.
| 3. [Claim 3] Method according to one of the preceding claims, in which the detection entity is a motorized land vehicle and in which the suspect vehicle is geolocated, and further comprising a step of: - calculation (19) of a tracking route for the suspect vehicle by the detection vehicle, the tracking route being configured so that the detection vehicle remains within V2X range of the suspect vehicle while being out of visual range of the suspect vehicle.
| 4. [Claim 4] Method according to claim 3, in which the ability to be out of visual range of the suspect vehicle is determined from at least one of the following elements: - a first predetermined distance; - a plurality of second predetermined distances, each second predetermined distance corresponding to a type of geographical area; - data acquired by a sensor of the detection vehicle;
- data acquired by a sensor of the suspect vehicle.
| 5. [Claim 5] Method according to one of claims 3 or 4, further comprising a step of: - display of the tracking route on a vehicle navigation aid system.
| 6. [Claim 6] Method according to one of claims 3 to 5, further comprising a step of: - generation of an autonomous driving instruction configured for autonomous driving to follow the tracking route.
| 7. [Claim 7] Method according to one of claims 1 or 2, in which the detection entity is a road infrastructure element.
| 8. [Claim 8] Computer program comprising instructions for implementing the method according to any one of the preceding claims, when these instructions are executed by a processor (200).
| 9. [Claim 9] Device for locating a suspect vehicle, included in a detection entity, the detection entity being distinct from the suspect vehicle, and comprising at least one memory and at least one processor arranged to perform the operations of: - reception from a remote server of a request for the location of the suspect vehicle, the request including an identification number of the suspect vehicle; - generation of an identification request message; - transmission of the message on a V2X network, that is to say by a direct link between the entity and the suspect vehicle; - upon receipt of a positive response from the suspect vehicle, transmission to the remote server of geolocation information of the entity and/or the suspect vehicle.
| 10. [Claim 10] Motorized land vehicle, corresponding to the detection entity, and comprising the device according to claim 9.
| The method involves receiving request for location of a suspect vehicle from a remote server (3), where the request includes an identification number of the suspect vehicle. An identification request message is generated (5). The message is transmitted (7) on a vehicle-to-everything network by a direct link between a detection entity and the suspect vehicle. Geolocation information of the entity and the suspect vehicle are transmitted (17) to the remote server upon receipt of a positive response from the suspect vehicle. A tracking route for the suspect vehicle is calculated (19) by the detection vehicle, while being out of visual range of the suspect vehicle. INDEPENDENT CLAIMS are also included for:a computer program comprising a set of instructions for locating a suspect vehicle implemented within a detection entity;a device for locating a suspect vehicle implemented within a detection entity; anda motorized land vehicle. Method for locating a suspect vehicle implemented within a detection entity i.e. road infrastructure element, of a motorized land vehicle (all claimed). Uses include but are not limited to a motor vehicle, a moped, a motorcycle and a storage robot in a warehouse. The method enables locating the suspect vehicle within the detection entity, so that interactions of the location method with the suspect vehicle can be reduced to a strict minimum. The drawing shows a flowchart illustrating a method for locating a suspect vehicle implemented within a detection entity.3Step for receiving request for location of a suspect vehicle from a remote server5Step for generating identification request message7Step for transmitting message on a vehicle-to-everything network by a direct link between a detection entity and the suspect vehicle17Step for transmitting geolocation information of the entity and the suspect vehicle to the remote server upon receipt of a positive response from the suspect vehicle19Step for calculating tracking route for the suspect vehicle by the detection vehicle, while being out of visual range of the suspect vehicle
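The claims in the record above describe a simple request/response protocol: receive a location request carrying a vehicle identification number, broadcast an identification request over a direct V2X link, and report geolocation to the remote server only on a positive response. The sketch below illustrates that flow under stated assumptions; the callback names and message fields are invented for the example and do not come from the patent.

```python
def locate_suspect_vehicle(request, v2x_send, server_send, own_position):
    """Sketch of the detection-entity side of the claimed method.

    v2x_send(message_dict) is assumed to return the suspect vehicle's
    reply (a dict) or None; server_send(message_dict) is the uplink to
    the remote server. Both are stand-ins for real transport layers.
    """
    suspect_id = request["vehicle_id"]                          # reception (3)
    ident_msg = {"type": "ID_REQUEST", "target": suspect_id}    # generation (5)
    reply = v2x_send(ident_msg)                                 # transmission (7) over V2X
    if reply and reply.get("positive"):                         # positive response received
        server_send({"suspect_id": suspect_id,                  # transmission (17)
                     "detector_position": own_position,
                     "suspect_position": reply.get("position")})
        return True
    return False
```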
Please summarize the input | Immobilization of a suspicious vehicle using V2X communication Method and device for immobilizing a suspect vehicle (SV), implemented within a detection entity (DE), the detection entity being distinct from the suspect vehicle, based on V2X communication. FIG. 1|1. Claims
[Claim 1] Method for immobilizing a suspect vehicle (SV), implemented within a detection entity (DE), the detection entity being distinct from the suspect vehicle, comprising the steps of: - reception (3) from a remote server (SRV) of a request for immobilization of the suspect vehicle, the request including an identification number of the suspect vehicle; - transmission (7) to the suspect vehicle of an immobilization message, the immobilization message being configured so that the suspect vehicle is immobilized, the message being transmitted over a V2X network, that is to say by a direct link between the entity and the suspect vehicle.
| 2. [Claim 2] Method according to claim 1, further comprising, still at the level of the detection entity, the steps of: - receipt (17) of an acknowledgment of receipt of immobilization from the suspect vehicle; - upon receipt of the acknowledgment of receipt, transmission (17) to the remote server of immobilization information of the suspect vehicle.
| 3. [Claim 3] Method according to one of the preceding claims, further comprising, at the level of the suspect vehicle, the step of: - upon receipt of the immobilization message, activation of a mode of inhibiting at least one function of a powertrain, the inhibition of the function being configured so that the suspect vehicle is immobilized.
| 4. [Claim 4] Method according to one of the preceding claims, further comprising, at the level of the suspect vehicle, the step of: - upon receipt of the immobilization message, activation of a locking mode of an electronic trajectory corrector, the locking being configured so that the suspect vehicle is immobilized by activation of at least one brake of the vehicle.
| 5. [Claim 5] Method according to one of the preceding claims, further comprising, at the level of the suspect vehicle, the step of: - upon receipt of the immobilization message, activation of an immediate stopping mode of at least one autonomous driving function; - upon activation of the immediate stop mode, generation of an autonomous driving instruction configured to make the suspect vehicle stop.
| 6. [Claim 6] Computer program comprising instructions for implementing the method according to any one of claims 1 or 2, when these instructions are executed by a processor (200).
| 7. [Claim 7] Device for locating a suspect vehicle, included in a detection entity, the detection entity being distinct from the suspect vehicle, and comprising at least one memory and at least one processor arranged to perform the operations of: - reception from a remote server of a request for immobilization of the suspect vehicle, the request including an identification number of the suspect vehicle; - transmission to the suspect vehicle of an immobilization message, the immobilization message being configured so that the suspect vehicle is immobilized, the message being transmitted over a V2X network, that is to say by a direct link between the entity and the suspect vehicle.
| 8. [Claim 8] Motorized land vehicle, corresponding to the detection entity, and comprising the device according to claim 7. | The method involves performing reception (3) of request from a remote server (SRV) for immobilization of suspect vehicle. The request includes identification number of the suspect vehicle, and an immobilization message is transmitted (7) to the suspect vehicle in response to received request. The immobilization message is configured so that the suspect vehicle is immobilized, and the message is transmitted over V2X network via direct link between entity and suspect vehicle. INDEPENDENT CLAIMS are included for the following:a computer program comprising instructions for immobilizing suspect vehicle;a device for locating suspect vehicle; anda motorized land vehicle. Method for immobilizing suspect vehicle such as motorized land vehicle e.g. motor vehicle, moped, motorcycle. Can also be used in storage robot in warehouse. The stolen vehicle itself detects suspicious activity and transmits alert so that theft of vehicle is prevented. The interactions of immobilization method with the suspect vehicle are reduced to the strict minimum. The drawing shows a flow diagram illustrating the process for immobilizing suspect vehicle. (Drawing includes non-English language text) 1Step for generating request to immobilize suspect vehicle3Step for reception of request from remote server5Step for generating immobilization message7Step for transmitting immobilization message to suspect vehicle9Step for providing direct exchanges possible between entities |
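As a rough illustration of claims 2 to 5 of the record above, the sketch below shows how the suspect vehicle side might pick an immobilization action and acknowledge it. The message fields and the `vehicle` dictionary are assumptions made for the example, not details from the patent.

```python
def handle_immobilization_message(msg, vehicle):
    """Illustrative handling of an immobilization message on the suspect
    vehicle: trigger an autonomous stop, lock the brakes through the
    stability controller, or inhibit the powertrain, then acknowledge.
    """
    if msg.get("type") != "IMMOBILIZE":
        return None
    if vehicle.get("autonomous_driving_active"):
        action = "AUTONOMOUS_IMMEDIATE_STOP"        # claim 5
    elif vehicle.get("esc_available"):
        action = "ESC_BRAKE_LOCK"                   # claim 4
    else:
        action = "POWERTRAIN_INHIBIT"               # claim 3
    # claim 2: acknowledgment returned to the detection entity
    return {"type": "IMMOBILIZATION_ACK", "action": action}
```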
Please summarize the input | Communication method for reinforcing the safety of autonomous driving of a vehicle, vehicle provided with means for implementing this method Communication method for strengthening the safety of autonomous driving of a vehicle (1), this method comprising an exchange of perception data which is limited to what is necessary. In this process, the V2X communication network is only used when the uncertainty on an environmental element (object external to the vehicle (1)) is too great. On the other hand, if the V2X communication network is requested, the corresponding perception request calls for a response which is limited by attributes specifically correlated to the environmental element. Vehicle (1) provided with means (2, 3, 4, 5, 7) for implementing this method. Figure 1|1. Claims
[Claim 1] Communication method for strengthening the safety of autonomous driving of a vehicle (1), this method comprising a) a step of acquiring perception data on an environmental element of the vehicle (1), b) a step of determining an uncertainty on the environmental element, c) a step of evaluating the level of uncertainty by comparing this level of uncertainty to a safety threshold, d) a step of creating a perception request, in which the perception request calls for a response limited by attributes specifically correlated to the environment element, e) a step of transmitting this request to a V2X communication network, f) a step of receiving a response to the request, from the V2X communication network, or taking into account a lack of response to the request, from the V2X communication network, g) a step of merging, in the event of a response, the collective perception data obtained in response to the perception request with at least some of the perception data which had initially been obtained in step a), h) a step of returning to step b) for determining an uncertainty updated with the perception data obtained by the fusion step of step g).
| 2. [Claim 2] A method according to claim 1, wherein the attributes specifically correlated to the environment element are defined by a perception query identity, a relevance region and a type of the environment element.
| 3. [Claim 3] Method according to one of the preceding claims, in which step h) of returning to step b) is carried out as part of a fault detection, isolation and correction logic.
| 4. [Claim 4] Method according to one of the preceding claims, in which the security threshold is determined as a function of at least one of the following parameters: the severity of the situation, the recurrence of the situation and the controllability of the situation.
| 5. [Claim 5] Method according to one of the preceding claims, in which step e) of transmitting the perception request to the V2X communication network is carried out before activating an autonomous driving process in minimal risk maneuvering mode.
| 6. [Claim 6] Method according to one of the preceding claims, comprising, prior to step a), a step of filtering on the relevant environmental elements.
| 7. [Claim 7] Vehicle (1) equipped with a driving assistance system (2), sensors (3) configured to collect perception data, means for recording and storing perception data (4), means for calculating and processing perception data (5) configured to determine an uncertainty on an environmental element, and means for exchanging perception data (7) with a V2X network, in which at least some of the perception data are used to control the driving assistance system (2), characterized in that the means for exchanging perception data (7) are implemented in relation to the environmental element only if the uncertainty on this environmental element is greater than or equal to a predetermined security threshold.
| 8. [Claim 8] Vehicle according to claim 7, in which the means for calculating and processing perception data (5), as well as the means for exchanging perception data (7) are configured to transmit a perception request comprising specifically correlated attributes to the environmental element on which the uncertainty is greater than or equal to the predetermined safety threshold.
| 9. [Claim 9] Computer program comprising program code instructions for executing the method according to one of claims 1 to 6, when said program is executed on a computer, and for triggering an action of said vehicle (1) depending on the updated uncertainty on the environmental element.
| 10. [Claim 10] Distributed computer system comprising an on-board computer (5) for executing the program according to claim 9, as well as on-board data processing means (I) configured to process said perception request. | The method involves acquiring perception data on an environmental element of the vehicle (1). The uncertainty on the environmental element is determined. The level of uncertainty is evaluated by comparing level of uncertainty to a safety threshold. The collection request in which the perception request calls for a response limited by attributes specifically correlated to the environment element is created. The request is transmitted to a V2X communication network. The response to the request is received from the V2X communication network, or taking into account a lack of response to the request from the V2X communication network. The collective perception data is obtained in response to the perception request. The uncertainty updated with the perception data obtained by the fusion step is determined. INDEPENDENT CLAIMS are included for the following:a vehicle;a computer program for strengthening safety of autonomous driving of vehicle; anda distributed computer system for strengthening safety of autonomous driving of vehicle. Communication method for strengthening safety of autonomous driving of vehicle such as motor vehicle. The controllability is the ability of users to maintain control in the event of a failure and indicator is identified based on knowledge of user behavior in the situation. The drawing shows a schematic view of the vehicle. (Drawing includes non-English language text) 1Vehicle2Driving assistance system3Sensor4Perception data5Processing perception data |
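The claims above hinge on one decision: query the V2X network only when the uncertainty on an environmental element reaches the safety threshold, and bound the reply with attributes tied to that element. The sketch below illustrates steps b) to g) under assumptions; the field names and the naive fusion rule are invented for the example.

```python
def maybe_request_collective_perception(element, safety_threshold, v2x_request):
    """Only query the V2X network when the local uncertainty is too high,
    and limit the request with element-specific attributes.
    """
    if element["uncertainty"] < safety_threshold:        # step c): local data suffices
        return element
    request = {                                          # step d): bounded request
        "query_id": element["id"],
        "relevance_region": element["region"],
        "object_type": element["type"],
    }
    response = v2x_request(request)                      # steps e)-f)
    if response is None:                                 # no answer: keep local estimate
        return element
    fused = dict(element)                                # step g): naive fusion example,
    fused["uncertainty"] = min(element["uncertainty"],   # keeping the lower uncertainty
                               response["uncertainty"])
    fused["position"] = response.get("position", element.get("position"))
    return fused
```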
Please summarize the input | Method and device for parking a motor vehicle A method of parking a motor vehicle comprises the steps of: acquisition (31) of the environment of a parking space, comprising a sub-step of detection (35) of at least one connected autonomous vehicle parked at the edge of the parking space; sending (37) of a movement command to the connected autonomous vehicle, said movement command comprising movement instructions making it possible to widen the parking space; execution (45) of a parking maneuver after widening the parking space. A parking device and a motor vehicle comprising the device are also described. Figure to be published with the abstract: Fig2|1. Claims
[Claim 1] Method of parking a motor vehicle comprising the steps of: - acquisition (31) of the environment of a parking space, comprising a sub-step of detection (35) of at least one connected autonomous vehicle parked at the edge of the parking space; - sending (37) a movement command to the connected autonomous vehicle, said movement command comprising movement instructions making it possible to widen the parking space; - execution (45) of a parking maneuver after widening the parking space.
| 2. [Claim 2] Method according to claim 1, in which the movement command is included in a DENM type V2X message.
| 3. [Claim 3] Method according to claim 1 or 2, wherein the step of acquiring the environment comprises acquiring (39) the environment of the at least one parked autonomous vehicle.
| 4. [Claim 4] Method according to claim 3, wherein the step of acquiring the environment of the at least one parked vehicle comprises receiving a message from said parked vehicle containing possible movement information.
| 5. [Claim 5] Method according to claim 3 or 4, in which the movement command is only sent after determining that the environment of the at least one parked autonomous vehicle allows movement to enlarge the parking space sufficient to allow parking of the vehicle.
| 6. [Claim 6] Method according to one of claims 1 to 5, further comprising a step of sending an information message to the owner of the parked autonomous vehicle.
| 7. [Claim 7] Method according to any one of the preceding claims, comprising a preliminary step of determining the size of the parking space and comparing it with the minimum size necessary to allow parking of the vehicle.
| 8. [Claim 8] Device for parking a motor vehicle (1) comprising: ? means of acquiring (5) the environment of a parking space, comprising detection of at least one connected autonomous vehicle parked at the edge of the space parking ;
? a transmitter (7) of a movement command to the connected autonomous vehicle, said movement command comprising movement instructions making it possible to widen the parking space;
? a controller (3) adapted to control a parking maneuver after widening the parking space.
| 9. [Claim 9] Motor vehicle comprising a device according to claim 8.
| 10. [Claim 10] Computer program product downloadable from a communications network and/or recorded on a computer-readable medium and/or executable by a processor, characterized in that it comprises program code instructions for implementing the method according to at least one of claims 1 to 7. | The method involves acquiring (31) environment of parking space, and performing sub-step detection (35) of at least one connected autonomous vehicle parked at edge of the parking space. A movement command is send (37) to vehicle, which comprises movement instructions to widen the parking space, and a parking maneuver is executed (45) after widening the parking space. The movement command is included in DENM type V2X message, and the environment of at parked autonomous vehicle is acquired by receiving message from parked vehicle containing possible movement information. INDEPENDENT CLAIMS are included for the following:a device for parking motor vehicle; anda computer program product for parking motor vehicle. Method for parking motor vehicle (claimed) such as car and van. The available location calculation is carried out on sides of vehicles and the widening maneuver by parked vehicles is improved by automation of autonomous vehicles. The drawing shows a flow diagram illustrating the process for parking motor vehicle. 31Step for acquiring environment of parking space35Step for detecting connected autonomous vehicle parked at edge of parking space37Step for sending movement command to vehicle39Step for determining whether movement is possible45Step for executing parking maneuver |
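A small sketch of the parking sequence claimed above: check the space, ask the parked connected autonomous vehicle to shift if needed, then park. The transport callbacks, message fields and width check are assumptions for illustration only.

```python
def widen_and_park(space_width_m, required_width_m, parked_av, send_denm, park):
    """If the space is too narrow, ask the connected autonomous vehicle
    parked at its edge to shift, then execute the parking manoeuvre.
    send_denm(message_dict) and park() stand in for the V2X stack and
    the vehicle's parking planner.
    """
    if space_width_m >= required_width_m:
        return park()                                    # no widening needed
    shortfall = required_width_m - space_width_m
    if not parked_av.get("can_move") or parked_av["free_margin_m"] < shortfall:
        return False                                     # claim 5: environment check failed
    send_denm({"type": "MOVE_REQUEST",                   # movement command (37)
               "target": parked_av["id"],
               "shift_m": shortfall})
    return park()                                        # manoeuvre (45) after widening
```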
Please summarize the input | Method and device for securing an autonomous vehicle The invention relates to a method and a device for securing a vehicle (10) adapted to travel in an autonomous driving mode. To this end, a communication infrastructure control device (1) detects when the vehicle (10) is stationary on a road. The control device transmits to the vehicle (10) a first request awaiting a response from the driver, according to a vehicle-to-infrastructure communication mode, called V2I. In the absence of a response from the driver to the first request, the control device transmits in V2I one or more driving instructions to the vehicle (10) so that the latter reaches a safety position. Figure for abstract: Figure 1|1. Claims
[Claim 1] Method for securing a vehicle (10), said vehicle (10) being configured to travel in an autonomous driving mode, said method comprising the following steps: - determination (21), by a communication infrastructure control device (1), of a current state of said vehicle (10) representative of a stop of said vehicle on a road; - transmission (22), by said control device, of first data representative of a first request intended for said vehicle (10), said first request requiring a response from a driver of said vehicle (10), said transmission being implemented according to a vehicle-to-infrastructure communication mode, known as V2I; - transmission (23), by said control device according to said V2I communication mode, of second data representative of at least one driving instruction towards a safety position intended for said vehicle (10) in the event of non-response from said driver to said first request.
| 2. [Claim 2] Method according to claim 1, for which said second data correspond to: - data representative of coordinates of a geolocation system corresponding to said safety position to be reached; and/or - data representative of at least one image of said safety position to be reached; and/or - data representative of a voice command to be given in said vehicle (10) to control said vehicle (10) towards said safety position to be reached.
| 3. [Claim 3] Method according to claim 1 or 2, further comprising a step of transmission (36), by said control device according to said V2I communication mode, of a second request intended for said vehicle (10), said second request requesting a passage of said vehicle (10) from a current level of autonomy to a determined level of autonomy, said determined level of autonomy being greater than said current level of autonomy.
| 4. [Claim 4] Method according to any one of Claims 1 to 3, for which the said first data correspond to: - data representative of a voice message to be delivered in said vehicle (10) requiring an action from said driver in response to said voice message; and/or - data representing graphic content to be displayed on a display screen on board said vehicle (10) requiring an action from said driver in response to said displayed graphic content; and/or - data representative of an alarm to be returned by an alarm system on board said vehicle and requiring an action from said driver in response to said alarm.
| 5. [Claim 5] Method according to any one of claims 1 to 4, further comprising a step of transmission, by said control device according to said V2I communication mode, of at least one command intended for at least one control system of said vehicle (10) in the event of non-execution of said at least one driving instruction by said vehicle (10).
| 6. [Claim 6] Method according to claim 5, for which said at least one command belongs to a set of commands comprising: - a command to start said vehicle; - a stop command of said vehicle; - a speed control of said vehicle; - a steering control of said vehicle; - a braking control of said vehicle; and - a path control of said vehicle.
| 7. [Claim 7] Method according to one of claims 1 to 6, for which said current state of said vehicle (10) is determined from information representative of the environment of said vehicle (10).
| 8. [Claim 8] Method according to claim 7, for which said information representative of the environment is obtained by said control device from: - at least one sensor (112) of said communication infrastructure (1) configured to acquire data representative of said environment; and or - said vehicle according to said V2I communication mode; and or - at least one other vehicle (11) according to said V2I communication mode.
| 9. [Claim 9] Device (4) for securing a vehicle, said device (4) comprising a memory (41) associated with at least one processor (40) configured for the implementation of the steps of the method according to any one of the claims 1 to 8.
| 10. [Claim 10] System comprising the device (4) according to claim 9 and at least one vehicle (10) configured to travel in an autonomous driving mode. | The method involves configuring vehicle (10) to travel in an autonomous driving mode. The current state of vehicle representative of a stop of vehicle on a road is determined by a communication infrastructure control device (1). The first data representative of a first request intended for vehicle is transmitted by a control device, where the first request requires a response from a driver of vehicle and the transmission works according to a vehicle-to-infrastructure communication mode. The second data representative of driving instruction is transmitted towards a safety position intended for vehicle in the event of non-response from driver to first request, by the control device according to the vehicle-to-infrastructure communication mode. INDEPENDENT CLAIMS are included for the following:a device for securing vehicle; anda system comprising device for securing vehicle. Method for securing vehicle e.g. autonomous vehicle such as motor vehicle, and land vehicle such as truck, bus and motorcycle. The safety of vehicles on the roads is improved. The vehicle is secured, and the safety of vehicles and passengers is increased. The risk of collision with another vehicle is avoided, and the risk of accident or additional accident linked to the presence of stationary vehicle on the road is reduced. The drawing shows a schematic view of communication infrastructure and vehicle. 1Communication infrastructure control device10Vehicle100Cloud of network110,111Communication devices112Camera |
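The record above describes an infrastructure-side escalation: challenge the driver of a stopped vehicle over V2I, send driving instructions toward a safety position if there is no answer, and fall back to direct commands if the instructions are not executed. The sketch below illustrates that escalation; the `vehicle_link` object and its methods are assumptions, not an API from the patent.

```python
import time

def secure_stalled_vehicle(vehicle_link, safety_position, response_timeout_s=30):
    """Infrastructure-side logic sketch. vehicle_link is assumed to offer
    send(dict), wait_reply(timeout) -> bool and is_moving() -> bool over V2I.
    """
    vehicle_link.send({"type": "DRIVER_CHECK"})                  # first request
    if vehicle_link.wait_reply(timeout=response_timeout_s):      # driver answered
        return "driver_ok"
    vehicle_link.send({"type": "DRIVE_TO",                       # driving instruction
                       "target": safety_position})
    time.sleep(response_timeout_s)
    if not vehicle_link.is_moving():                             # claim 5: escalation to commands
        vehicle_link.send({"type": "COMMAND", "action": "start"})
        vehicle_link.send({"type": "COMMAND", "action": "steer_to",
                           "target": safety_position})
    return "secured"
```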
Please summarize the input | Method for managing a convoy comprising at least two motor vehicles in an autonomous driving mode The invention relates to a method for managing a convoy grouping together at least two motor vehicles (1, 4, 5) in an autonomous driving mode, the convoy traveling on a road (3) and being formed by at least one lead vehicle (1) and at least one follower vehicle (4, 5), the method comprising the following steps: - determining, according to a programmed route, if the leading vehicle (1) must change direction and if the time or the distance remaining before a planned change of direction is less than a time or distance threshold ; - send an information message to the vehicles (4, 5, 6) located near the lead vehicle (1) indicating that a change of direction of the lead vehicle (1) must occur soon. Figure for the abstract: Fig. 3|1. Claims
[Claim 1] Method for managing a convoy grouping together at least two motor vehicles (1, 4, 5) in an autonomous driving mode, the convoy traveling on a road (3) and being formed by at least one leading vehicle (1) and at least one follower vehicle (4, 5), the method comprising the following steps: - determining (40), according to a programmed route, if the leading vehicle (1) must change direction and if the time or the distance remaining before a planned change of direction is less than a time or distance threshold; - transmitting (42) an information message to vehicles (4, 5, 6) located close to the lead vehicle (1) indicating that a change of direction of the lead vehicle (1) is due soon.
| 2. [Claim 2] Method according to the preceding claim, in which, when the information message is received by a follower vehicle (4, 5) of the convoy, this vehicle determines (44) whether the change of direction of the lead vehicle (1) is compatible with a programmed route of the following vehicle (4, 5).
| 3. [Claim 3] Method according to the preceding claim, in which, if it is determined that the programmed route of a following vehicle is incompatible with the change of direction of the leading vehicle, the following vehicle performs (46) at least one of the following actions: - search for a compatible lead vehicle; - maintenance of autonomous driving mode; - stopping autonomous driving mode.
| 4. [Claim 4] Method according to the preceding claim, in which the stopping of the autonomous driving mode is preceded by an alert message sent to the attention of the driver of the following vehicle (4, 5) concerned, the message being for example of the visual type and/or sound.
| 5. [Claim 5] Method according to one of the preceding claims, in which the step of transmitting (42) an information message is implemented by means of a wireless communication module on board the lead vehicle and compatible with the standard V2X.
| 6. [Claim 6] Method according to the preceding claim, in which the wireless communication module on board the lead vehicle is compatible with one or more of the following wireless communication protocols: IEEE 802.11p, ETSI ITS-G5, Wifi, Bluetooth, GSM 3G-4G-5G, C, LTE.
| 7. [Claim 7] Method according to one of the preceding claims, in which the step of determining (40) whether the lead vehicle (1) must change direction is carried out by a computer on board the lead vehicle (1), as a function of information provided by a geolocation module (14) on board the lead vehicle (1).
| 8. [Claim 8] Method according to one of the preceding claims, in which the steps of determining (40) whether the lead vehicle (1) should change direction and of transmitting (42) an information message are implemented so that the information message can be transmitted before the leading vehicle (1) begins the change of direction maneuver.
| 9. [Claim 9] Method according to the preceding claim, in which the duration between the transmission of an information message and the start of the maneuver to change direction of the leading vehicle (1) is greater than or equal to a threshold duration, the threshold duration being at least equal to 10 seconds.
| 10. [Claim 10] Computer program product comprising instructions which, when the program is executed by one or more processor(s), cause the latter(s) to implement the steps of the method in accordance with one of Claims 1 to 9.
| The method involves determining if a leading vehicle (1) changes direction and if time or distance remaining before a planned change of direction is less than a time or distance threshold. An information message is transmitted to vehicles (4-6) located close to the lead vehicle indicating that change of direction of the lead vehicle is due soon, where the vehicle determines whether the change of direction of the lead vehicle is compatible with a programmed route of the following vehicle when the information message is received by the follower vehicle of a convoy. The leading vehicle is compatible with wireless communication protocols e.g. IEEE 802.11p protocols, ETSI ITS-G5 protocols, Wifi protocols, Bluetooth protocols, GSM 3G protocols-4G protocols-5G protocols and LTE protocols. An INDEPENDENT CLAIM is also included for a computer program product comprising a set of instructions for managing a convoy of motor vehicles. Method for managing a convoy of motor vehicles i.e. autonomous or partially autonomous motor vehicles, traveling on a road. The method enables managing the convoy to manage a situation in which the leading vehicle is brought to change direction in complete safety so as to improve the management of driving in the convoy of autonomous vehicles. The drawing shows a schematic view of a portion of a road.1Leading vehicle4-6Vehicles18Sensors30, 32Traffic lanes38Exit lane
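A compact sketch of steps (40) and (42) of the convoy method above: scan the programmed route of the lead vehicle for the next turn and broadcast an information message while there is still time (claim 9 asks for at least 10 seconds). The route representation and message fields are assumptions for the example.

```python
def check_direction_change(leader_route, current_index, speed_mps,
                           distance_threshold_m, broadcast):
    """leader_route is assumed to be a list of segment dicts with
    'segment_length_m' and an optional 'is_turn' flag on the segment
    that follows a turn point; broadcast(dict) is the V2X transmit hook.
    """
    next_turn = None
    travelled = 0.0
    for i in range(current_index, len(leader_route) - 1):
        travelled += leader_route[i]["segment_length_m"]
        if leader_route[i + 1].get("is_turn"):
            next_turn = travelled
            break
    if next_turn is None or next_turn > distance_threshold_m:
        return False                                     # step (40): no imminent turn
    time_to_turn_s = next_turn / max(speed_mps, 0.1)
    if time_to_turn_s >= 10.0:                           # leave followers time to react
        broadcast({"type": "LEADER_DIRECTION_CHANGE",    # step (42)
                   "distance_m": next_turn,
                   "eta_s": time_to_turn_s})
        return True
    return False
```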
Please summarize the input | Method and device for controlling an autonomous vehicle The invention relates to a method and a device for controlling an autonomous vehicle (10). To this end, first information representing an environment of the vehicle (10) is obtained, the environment comprising a set of elements (11, 12, 13, 14). At least a portion of the environment is subdivided into a plurality of cells (211-217, 221-225, 231-238). For each cell, a value representative of a level of nuisance weighing on the vehicle (10) is determined from the first information, second information representative of the vehicle (10) and third information representative of a level of nuisance associated with each element (11, 12) of at least part of the set of elements. The vehicle (10) is controlled according to the values representative of the level of nuisance. Figure for the abstract: Figure 2|1. Claims
[Claim 1] A method of controlling an autonomous vehicle (10), said method comprising the following steps: - obtaining (51) first information representative of an environment (1) of the vehicle (10), said environment (1) comprising a set of elements (11, 12, 13, 14) comprising at least one element, at least a part of the first information being representative of said at least one element; - subdivision (52) of at least part of said environment into a plurality of cells (211 to 217, 221 to 225, 231 to 238); - for each cell, determination (53) of a value representative of a level of nuisance weighing on said autonomous vehicle (10) from said first information and second information representative of said autonomous vehicle (10), said determination of a value representative of a level of nuisance weighing on said autonomous vehicle (10) further being a function of third information representative of a level of nuisance associated with each element (11, 12) of at least one part of said set of elements; - control of said autonomous vehicle (10) according to said values representative of the level of nuisance.
| 2. [Claim 2] A method according to claim 1, further comprising a step of representing a dynamic environment of said autonomous vehicle (10) based on said cells (211 to 217, 221 to 225, 231 to 238) and said values associated with said cells (211 to 217, 221 to 225, 231 to 238).
| 3. [Claim 3] Method according to claim 1 or 2, for which said set of elements comprises at least one element among the following elements: - static object; and/or - moving object; and/or - floor markings; and/or - traffic information; and/or - signaling device; and/or - hole in the road.
| 4. [Claim 4] Method according to claim 1, for which said nuisance level is a function of: - a kinetic energy resulting in the event of collision of said autonomous vehicle (10) with an element of said assembly; and/or - a braking force resulting from braking of said autonomous vehicle (10); and/or - a centrifugal force resulting from a change of direction of said autonomous vehicle (10); - a set of traffic rules; and/or - information representative of a determined path for said autonomous vehicle (10).
| 5. [Claim 5] Method according to claim 1 or 2, for which said first information belongs to a set of information comprising: - position representative information; - information representative of a type of element; - representative size information; - kinematic information of the associated element; - information representative of meteorological conditions; - information representative of traffic rules; - information representative of trajectory; - traffic information; - information representative of traffic conditions.
| 6. [Claim 6] Method according to any one of Claims 1 to 3, for which the said second information belongs to a set of information comprising: - information representative of position; - Kinematic information of said autonomous vehicle; - information representative of the trajectory of said autonomous vehicle.
| 7. [Claim 7] Method according to any one of claims 1 to 4, for which said first information is obtained from at least one sensor of a detection system on board said autonomous vehicle (10) and/or from at least one element of said assembly of elements (11, 12, 13, 14) according to a vehicle-to-everything type communication mode, called V2X.
| 8. [Claim 8] Method according to one of claims 1 to 7, for which said step of controlling the autonomous vehicle comprises determining information representative of the trajectory of said autonomous vehicle (10) as a function of said cells (211 to 217, 221 to 225, 231 to 238) and associated representative nuisance values.
| 9. [Claim 9] Device (4) configured to control an autonomous vehicle, said device (4) comprising a memory (41) associated with at least one processor (40) configured to implement the steps of the method according to any one of Claims 1 to 8.
| 10. [Claim 10] Autonomous vehicle (10) comprising the device (4) according to claim 9. | The method involves obtaining first information representative of an environment of an autonomous vehicle (10), the environment comprises a set of elements e.g. Hole (13) and road sign (14). A part of the environment is subdivided into a set of cells (211-217). A value representative of a level of nuisance weighing is determined on the autonomous vehicle from the information and second information representative of the autonomous vehicle. The value representative of the level of nuisance weighing is determined on the autonomous vehicle from a function of third information representative of a level of nuisance associated with each element of the set of elements. The autonomous vehicle is controlled according to the values representative of the level of nuisance. INDEPENDENT CLAIMS are also included for:a device for controlling an autonomous vehicle; andan autonomous vehicle. Method for controlling a kinematic parameter e.g. speed and acceleration, a trajectory, a braking system and a safety system of an autonomous vehicle (claimed) i.e. autonomous land motor vehicle. The method enables controlling the autonomous vehicle so as to improve representation of the environment of the vehicle and to improve the decision-making of the vehicle in the context of autonomous driving. The drawing shows a schematic view representing a spatial subdivision of an environment.10Autonomous vehicle13Hole14Road sign101-103Three lanes of traffic211-217Cells |
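The record above builds a cell-based nuisance map of the vehicle's surroundings. The sketch below shows one possible reading of that idea, combining a per-element nuisance level with a kinetic-energy term; the data layout and scoring formula are assumptions, not the patent's method.

```python
def nuisance_grid(cells, elements, vehicle, element_nuisance):
    """Give each cell a nuisance value: for every element falling inside
    the cell, add its nuisance weight scaled by the kinetic energy of a
    potential collision with the ego vehicle.
    """
    half_mv2 = 0.5 * vehicle["mass_kg"] * vehicle["speed_mps"] ** 2
    values = {}
    for cell_id, (cx, cy, half_size) in cells.items():
        score = 0.0
        for elem in elements:
            ex, ey = elem["position"]
            if abs(ex - cx) <= half_size and abs(ey - cy) <= half_size:
                score += element_nuisance.get(elem["type"], 1.0) * half_mv2
        values[cell_id] = score
    return values

# Example: two cells, a pedestrian inside the first one.
cells = {"c1": (0.0, 0.0, 2.0), "c2": (4.0, 0.0, 2.0)}
elements = [{"type": "pedestrian", "position": (1.0, 0.5)}]
vehicle = {"mass_kg": 1500.0, "speed_mps": 10.0}
print(nuisance_grid(cells, elements, vehicle, {"pedestrian": 10.0}))
```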
Please summarize the input | Method for updating road signs by an autonomous vehicle The invention relates to a method and device for updating road signs by an autonomous vehicle traveling in a road environment, comprising the steps of: - detection (42), by the autonomous vehicle, of a presence or of an absence of a road sign element from the road environment; - emission (43), by the autonomous vehicle, of information representing a result of the detection; - reception (44) of said information by a device remote from an infrastructure of a communication network; and - updating (45) of the road signs by the remote device according to said information received. Figure for abstract: Figure 4|1. Claims
[Claim 1] Method for updating road signs by an autonomous vehicle traveling in a road environment, comprising steps of: - detection (42), by the autonomous vehicle, of a presence or absence of a road sign element of the road environment; - emission (43), by the autonomous vehicle, of information representing a result of the detection; - reception (44) of said information by a device remote from an infrastructure of a communication network; and - updating (45) of the road signs by the remote device as a function of said information received.
| 2. [Claim 2] Method according to claim 1, which further comprises a step of transmitting said information by the autonomous vehicle to another autonomous vehicle.
| 3. [Claim 3] A method according to claim 2, wherein said information is transmitted from the autonomous vehicle to the other autonomous vehicle according to vehicle-to-vehicle communication.
| 4. [Claim 4] Method according to claim 2 or 3, which further comprises a step of transmitting said information by said other autonomous vehicle and intended for the remote device.
| 5. [Claim 5] Method according to one of claims 1 to 4, for which said information is transmitted from an autonomous vehicle to the remote device according to a vehicle-to-infrastructure communication.
| 6. [Claim 6] Method according to one of claims 1 to 5, which further comprises, prior to the steps of detecting and transmitting by the autonomous vehicle, a step of transmitting (41), by the remote device, of a request for obtaining an update of the road signs and a step of receiving said request by the autonomous vehicle.
| 7. [Claim 7] A method according to claim 6, wherein said request informs the autonomous vehicle that a road sign element is present at a particular location.
| 8. [Claim 8] Device for updating road signs by an autonomous vehicle traveling in a road environment, comprising a memory associated with at least one processor configured for implementing the steps of the method according to any one of the claims 1 to 7.
| 9. [Claim 9] Computer program product comprising instructions suitable for executing the steps of the method according to one of claims 1 to 7, when the computer program is executed by at least one processor.
| 10. [Claim 10] Computer-readable recording medium on which is recorded a computer program comprising instructions for carrying out the steps of the method according to one of claims 1 to 7. | The method (400) involves detection (42) of a presence or absence of a road sign element of the road environment by the autonomous vehicle. Information is emitted (43) that represents a result of the detection by the autonomous vehicle. An information is received (44) by a device remote from an infrastructure of a communication network. The road signs are updated (45) by the remote device as a function of information received. Information is transmitted by the autonomous vehicle to another autonomous vehicle. INDEPENDENT CLAIMS are included for the following:a device for updating road signs by an autonomous vehicle traveling in a road environment; anda computer program product for updating road signs by an autonomous vehicle traveling in a road environment. Method for updating road signs by an autonomous vehicle traveling in a road environment. Method avoids the congestion of the communication network and overloading of the remote device by avoiding the systematic sending of information representing detection results by autonomous vehicles as soon as they detect an element of road signs along their route. The drawing shows a flow chart of a method for updating road signs by an autonomous vehicle traveling in a road environment. (Drawing includes non-English language text). 42Detection of a presence or absence of a road sign element of the road environment by the autonomous vehicle43Emitting the information that represents a result of the detection by the autonomous vehicle44Receiving the information by a device remote from an infrastructure of a communication network45Updating the road signs by the remote device as a function of information received400Method
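The benefit stated in the summary above is that reports are only sent when something actually changed. The sketch below illustrates the detection and emission steps under that reading; the comparison against an expected sign list and the message fields are assumptions made for the example.

```python
def report_sign_observation(expected_signs, observed_signs, location, send_v2x):
    """Compare the signs actually observed at a location with those the
    map expects there, and emit a report only when something changed,
    which keeps the network and the remote device from being flooded.
    send_v2x(dict) stands in for the uplink to the remote device.
    """
    expected = set(expected_signs)
    observed = set(observed_signs)
    missing = expected - observed          # sign absent although mapped
    new = observed - expected              # sign present although unmapped
    if not missing and not new:
        return None                        # nothing to update, stay silent
    report = {"type": "SIGN_UPDATE",       # emission (43)
              "location": location,
              "missing": sorted(missing),
              "new": sorted(new)}
    send_v2x(report)                       # toward the remote device (44)-(45)
    return report
```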
Please summarize the input | Vehicle communication method and device The invention relates to a communication method and device for vehicles (10 and 11). To this end, information representative of the arrival of a second vehicle (11), traveling on a traffic lane, is transmitted to the first vehicle (10) via a wireless link of the vehicle-to-vehicle type. At least one guidance instruction is then determined so that the first vehicle (10) allows the second vehicle (11) to pass. Figure for abstract: Figure 1|1. Claims [Claim 1] Communication method for a vehicle, said method being implemented by a first vehicle (10), said method comprising the following steps: - reception (31) of information representative of an approach of a second vehicle (11) on a traffic lane of said first vehicle (10) according to a vehicle-to-vehicle type communication mode, V2V; - determination (32) of at least one instruction for guiding said first vehicle (10) in order to allow passage to said second vehicle (11).
| 2. [Claim 2] The method of claim 1, further comprising a step of rendering said at least one guidance instruction to rendering means associated with said first vehicle (10).
| 3. [Claim 3] A method according to claim 1 or 2, wherein said first vehicle (10) implements said at least one guidance instruction in autonomous driving level 3 or higher.
| 4. [Claim 4] Method according to any one of claims 1 to 3, for which said at least one guidance instruction comprises at least one command representative of a movement of said first vehicle (10) with respect to said traffic lane.
| 5. [Claim 5] Method according to any one of the preceding claims, for which said at least one guidance instruction is further dependent on information representative of traffic conditions and / or on information representative of the speed of said first vehicle.
| 6. [Claim 6] Method according to any one of the preceding claims, for which said second vehicle (11) corresponds to: - a priority vehicle; and or - a two-wheeled vehicle.
| 7. [Claim 7] Method according to any one of the preceding claims, for which said information representative of an approach of a second vehicle (11) is included in at least one message of CAM and/or DENM type.
| 8. [Claim 8] Device (2) comprising a memory (21) associated with at least one processor (20) configured for implementing the steps of the method according to any one of claims 1 to 7.
| 9. [Claim 9] Vehicle (10) comprising the device (3) according to claim 8.
| 10. [Claim 10] Computer program product comprising instructions suitable for executing the steps of the method according to one of claims 1 to 7, when the computer program is executed by at least one processor.
| The method involves receiving the information representative of an approach of second vehicle (11) on a traffic lane of first vehicle (10) according to a vehicle-to-vehicle (V2V) type communication mode. The instruction is determined for guiding first vehicle in order to allow passage to second vehicle. The guidance instruction is rendered to a rendering unit associated with first vehicle. The first vehicle implements the guidance instruction in autonomous driving level 3 or higher. The guidance instruction includes a command representative of a movement of first vehicle with respect to the traffic lane. The guidance instruction is dependent on information representative of traffic conditions and/or on information representative of the speed of first vehicle. INDEPENDENT CLAIMS are included for the following:a communication device; anda computer program product. Communication method for a vehicle (claimed), such as an ambulance, fire engine and police vehicle. The safety on the roads is improved. The free passage of priority vehicles is facilitated. The drawing shows a schematic view of a first vehicle traveling on a traffic lane of a road. 1Road environment10First vehicle11Second vehicle101,102Communication devices1000Road
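As an illustration of claims 1 to 4 above, the sketch below reacts to a CAM/DENM-style message announcing a priority or two-wheeled vehicle on the ego lane by producing a simple pull-over instruction. The message fields and the fixed lateral offset are assumptions, not values from the patent.

```python
def on_v2v_message(message, own_lane, rendering=print):
    """Derive and render a guidance instruction when a relevant V2V
    message arrives; returns the instruction dict or None if ignored.
    """
    if message.get("type") not in ("CAM", "DENM"):
        return None
    if message.get("lane") != own_lane:
        return None                                  # not on our traffic lane
    if message.get("vehicle_class") not in ("priority", "two_wheeler"):
        return None
    instruction = {"action": "shift_right",          # movement relative to the lane
                   "lateral_offset_m": 1.0,
                   "reason": message["vehicle_class"]}
    rendering(f"Give way: move {instruction['lateral_offset_m']} m to the right")
    return instruction
```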
Please summarize the input | System and method for projecting a trajectory of an autonomous vehicle onto a road surface The invention claims a system and a method for projecting a current trajectory of an autonomous vehicle onto a road surface. In some embodiments, an autonomous vehicle with a light projector is claimed, wherein the light projector is on the top surface of the autonomous vehicle. In addition, in some embodiments, the autonomous vehicle may include an electronic control unit for controlling the operation of the light projector, wherein the electronic control unit detects whether the autonomous vehicle is started. In other embodiments, the electronic control unit receives data of the environmental condition around the autonomous vehicle and receives the upcoming trajectory of the autonomous vehicle. The electronic control unit can further project light from the light projector onto the road so that the trajectory of the autonomous vehicle appears on the road surface.|1. A system for projecting a trajectory of an autonomous vehicle onto a road surface, the system comprising: a light projector on the top surface of the autonomous vehicle, and an electronic control unit for controlling the operation of the light projector, wherein the electronic control unit: detects whether the autonomous vehicle is activated; and, if the autonomous vehicle is detected to be activated: receives data of the environmental condition around the autonomous vehicle, wherein the environmental condition indicates the presence of an upcoming turn in the trajectory and of a pedestrian based on the shape of a part of an object in an image corresponding to the pedestrian, pixel intensity and lines; receives data of an upcoming trajectory of the autonomous vehicle; adjusts the upcoming trajectory based on the environmental condition; and instructs the light projector to project onto the surface of the road a light beam and text indicating a specified distance ahead of the autonomous vehicle and the direction of the upcoming turn, the projected light beam indicating the upcoming trajectory of the autonomous vehicle, and instructs the light projector to project a three-dimensional fence after the turn, the three-dimensional fence being perpendicular to the light beam and the text.
| 2. The system according to claim 1, wherein the data of the upcoming trajectory of the autonomous vehicle comprises at least one of GPS data and the received data of the environmental condition.
| 3. The system according to claim 2, wherein the data of the environmental condition comprises: traffic information, road sign information, object detection and road condition information.
| 4. The system according to claim 3, wherein the data of the environmental condition is collected by at least one of a camera, a sensor, a navigation system, a vehicle communication system, a vehicle-to-infrastructure communication system, and a laser scanner.
| 5. The system according to claim 2, further comprising a vehicle communication module, wherein the vehicle communication module is configured to transmit and receive GPS data and data of the environmental condition to different autonomous vehicles.
| 6. The system according to claim 1, wherein each of the projected beams comprises a projection of at least one of a straight arrow, a turning arrow, an inclined arrow, a character and a number.
| 7. The system according to claim 6, wherein each of the projected beams further comprises: The projection of the current speed of the autonomous vehicle.
| 8. The system according to claim 1, wherein at least one of the projected beams comprises a projection of a fence for indicating a parking area of the autonomous vehicle on the surface of the road.
| 9. The system according to claim 1, wherein the light projector comprises a light source comprising a light emitting diode or a laser diode.
| 10. The system according to claim 1, wherein, when the autonomous vehicle detects the presence of a pedestrian, the light beam is projected onto the surface of the road.
| 11. The system according to claim 10, wherein the system comprises a vehicle-to-vehicle communication system for alerting a nearby vehicle of the possibility of the presence of the pedestrian.
| 12. The system according to claim 1, wherein the electronic control unit provides an audible notification when the vehicle determines that the upcoming track will collide with an object or a pedestrian.
| 13. The system according to claim 12, wherein the audible notification is provided when the electronic control unit determines that a collision would occur with an object or pedestrian within a certain distance of the autonomous vehicle.
| 14. The system according to claim 1, wherein the light beam from the light projector is projected from 2 feet to 20 feet in front of the autonomous vehicle.
| 15. The system according to claim 1, wherein the light beam from the light projector is projected from 2 feet to 20 feet on the side of the autonomous vehicle.
| 16. A method for projecting a trajectory of an autonomous vehicle onto a road surface, the method comprising: detecting whether the autonomous vehicle is activated; and, if the autonomous vehicle is detected to be activated: receiving data of the environmental condition around the autonomous vehicle, wherein the environmental condition indicates the presence of an upcoming turn in the trajectory and of a pedestrian based on the shape of a part of an object in an image corresponding to the pedestrian, pixel intensity and lines; receiving data of an upcoming trajectory of the autonomous vehicle; adjusting the upcoming trajectory based on the environmental condition; and instructing the light projector to project onto the surface of the road a light beam and text indicating a specified distance ahead of the autonomous vehicle and the direction of the upcoming turn, the projected light beam indicating the upcoming trajectory of the autonomous vehicle, and instructing the light projector to project a three-dimensional fence after the turn, the three-dimensional fence being perpendicular to the light beam and the text.
| 17. The method according to claim 16, wherein the upcoming trajectory comprises an expected path of the autonomous vehicle.
| 18. The method according to claim 16, wherein the data of the upcoming trajectory of the autonomous vehicle is determined by at least one of the GPS data and the received data of the environmental condition, and the data of the environmental condition comprises: traffic information, road sign information, object detection and road condition information.
| 19. The method according to claim 16, wherein each of the projected beams comprises a projection of at least one of a straight arrow, a turning arrow, an inclined arrow, a character and a number.
| 20. The method according to claim 16, wherein at least one of the projected beams comprises a projection of a fence for indicating a parking area of the autonomous vehicle on the surface of the road.
| 21. The method according to claim 16, further comprising: displaying the upcoming track on the screen in front of the driver of the autonomous vehicle.
| 22. The method according to claim 16, wherein the light beam from the light projector is projected on the surface of the road, from 2 feet to 20 feet in front of the autonomous vehicle. | The system (100) has a light projector (120) arranged on a top surface of an autonomous vehicle. An electronic control unit (140) controls operation of the light projector, and detects whether the autonomous vehicle is turned on, receives data of an environmental condition surrounding the autonomous vehicle, receives an upcoming trajectory path of the autonomous vehicle and projects a light from the light projector onto a surface of a road indicating the upcoming trajectory path of the autonomous vehicle, where the upcoming trajectory path of the autonomous vehicle is determined by one of received global positioning system data and the received data of the environmental condition. An INDEPENDENT CLAIM is also included for a method for projecting current trajectory path of an autonomous vehicle on a surface of road. System for projecting current trajectory path of an autonomous vehicle on a surface of road. The system allows autonomous vehicles to decrease traffic collision caused by human errors. The system allows the autonomous vehicles with enhanced driving control systems and safety mechanisms to ensure reliability and safety of the autonomous vehicles. The drawing shows a block diagram of a system for projecting current trajectory path of an autonomous vehicle on a surface of road. 100System for projecting current trajectory path of autonomous vehicle on surface of road120Light projector140Electronic control unit160Switch162Camera |
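A toy sketch of the ECU decision described in the claims above: once the vehicle is active, assemble the symbols to project, an arrow for the next manoeuvre, the speed as text, and a fence behind an upcoming turn or when a pedestrian is detected. The symbol vocabulary and data shapes are assumptions for illustration.

```python
def projection_plan(ecu_on, trajectory, pedestrian_detected, speed_kmh):
    """Build a list of symbols for the projector from the upcoming
    trajectory (a list of waypoint dicts) and the pedestrian flag.
    """
    if not ecu_on:
        return []
    plan = []
    next_turn = next((wp for wp in trajectory if wp.get("turn")), None)
    if next_turn:
        plan.append({"symbol": "turn_arrow",
                     "direction": next_turn["turn"],
                     "distance_m": next_turn["distance_m"]})
        plan.append({"symbol": "fence", "after_turn": True})
    else:
        plan.append({"symbol": "straight_arrow", "distance_m": 5.0})
    plan.append({"symbol": "text", "value": f"{speed_kmh:.0f} km/h"})
    if pedestrian_detected:
        plan.append({"symbol": "fence", "reason": "pedestrian"})
    return plan

# Example: a right turn 12 m ahead with a pedestrian nearby.
print(projection_plan(True, [{"turn": "right", "distance_m": 12.0}], True, 30))
```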
Please summarize the input | AN IMPROVED PERFORMANCE AND COST GLOBAL NAVIGATION SATELLITE SYSTEM ARCHITECTURE. Significant, cost-effective improvement is introduced for Position, Navigation, and Timing (PNT) on a global basis, particularly enhancing the performance of Global Navigation Satellite Systems (GNSS), an example of which is the Global Positioning System (GPS). The solution significantly improves performance metrics including the accuracy, integrity, time to acquire, interference rejection, and spoofing protection. A constellation of small satellites employing a low-cost architecture combined with improved signal processing yields an affordable enabler for spectrum-efficient transportation mobility. As air traffic management modernization transitions to a greater dependence on satellite positioning, the solution provides aviation users new protections from both intentional and unintentional interference to navigation and surveillance. And in response to an era in which intelligent transportation is under development for automobiles, reliable where-in-lane positioning enables new applications in connected and autonomous vehicles. New military capability increases PNT availability. I claim:
| 1. A method for supporting resilient carrier phase positioning of user devices connected by respective communication links to at least one service data processor, measurements received from Global Navigation Satellite System (GNSS) satellites, and measurements received from low Earth orbit (LEO) satellites, said measurements including carrier phase pseudorange information, comprising the steps of: (a) the at least one service data processor accepting said measurements received from (i) at least one of said GNSS satellites by at least one LEO satellite, (ii) at least one of said GNSS satellites and the at least one LEO satellite by at least one ground reference station, and/or (iii) at least one other LEO satellite by the at least one LEO satellite via a LEO-to-LEO crosslink transmission; (b) the at least one service data processor generating precise orbit and clock predictions for the at least one LEO satellite from available said pseudorange information; and (c) the at least one service data processor disseminating said predictions over said communications links to the user devices to enable the user devices to take into account the precise orbit and clock predictions when computing respective positions of the user devices upon receiving measurements from GNSS and LEO satellites.
| 2. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein (a) the at least one service data processor accepts said measurements received from (i) at least one of said GNSS satellites by the at least one LEO satellite and (ii) at least one of said GNSS satellites and the at least one LEO satellite by the at least one ground reference station and (b) the at least one service data processor (i) generates the orbit predictions from said pseudorange information received from at least one of said GNSS satellites by the at least one LEO satellite and (ii) generates the clock predictions from said pseudorange information received from at least one of said GNSS satellites and the at least one LEO satellite by the at least one ground reference station.
| 3. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein said measurements received from the at least one other LEO satellite by the at least one ground reference station are from configurations wherein the at least one ground reference station is outside the footprint of the at least one LEO satellite.
| 4. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein measurements received from LEO satellites by ground reference stations are unavailable.
| 5. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein measurements received from GNSS satellites by LEO satellites are unavailable.
| 6. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein (i) said at least one LEO satellite includes an oscillator of known stability coupled coherently to a receiver for use in measuring carrier phase pseudorange information from said GNSS satellites or from other said LEO satellites and a transmitter for use in broadcasting carrier phase to be received by said ground reference stations and (ii) the at least one user device endures loss of one or more clock predictions due to disablement of satellites, ground reference stations, service data processors, or data dissemination means via which the clock predictions are received.
| 7. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein said at least one service data processor is integrated into a WAAS master station or a precise point positioning network operations center.
| 8. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein said disseminating step is accomplished using SBAS satellites, Inmarsat Narrowband, NDGPS data broadcast, VHF aviation radio, 4G LTE, DOT ITS V2I 5.9 GHz standard broadcast, or said LEO satellites.
| 9. A method for supporting resilient carrier phase positioning of user devices utilizing at least one service data processor connected to the user devices by respective communication links, measurements received from GNSS satellites, and measurements received from LEO satellites, said measurements including carrier phase pseudorange information, comprising the steps of: (a) the user devices accepting precise orbit and clock predictions disseminated by the at least one service data processor for at least one LEO satellite, said precise orbit and clock predictions being generated from available pseudorange information accepted by the at least one service data processor received from (i) at least one GNSS satellite by at least one LEO satellite, (ii) at least one GNSS satellite and the at least one LEO satellite by at least one ground reference station, and/or (iii) LEO-to-LEO crosslink transmissions between at least one other LEO satellite and the at least one LEO satellite; and (b) the user devices taking into account the precise orbit and clock predictions disseminated by the at least one service data processor when computing respective positions of the user devices upon receiving respective said measurements from GNSS and LEO satellites.
| 10. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein (i) the precise orbit predictions are generated from pseudorange information accepted by the at least one service data processor received from at least one GNSS satellite by the at least one LEO satellite and (ii) the precise clock predictions are generated from pseudorange information received from at least one GNSS satellite and the at least one LEO satellite by the at least one ground reference station.
| 11. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein the pseudorange information accepted from the at least one other LEO satellite by the at least one ground reference station is from configurations wherein the at least one ground reference station is outside the footprint of the at least one LEO satellite.
| 12. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein pseudorange information received from LEO satellites by ground reference stations is unavailable.
| 13. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein pseudorange information received from GNSS satellites by LEO satellites is unavailable.
| 14. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein (i) said at least one LEO satellite includes an oscillator of known stability coupled coherently to a receiver for use in measuring carrier phase pseudorange information from said GNSS satellites or from other said LEO satellites and a transmitter for use in broadcasting carrier phase to be received by said ground reference stations and (ii) the at least one user device endures loss of one or more clock predictions due to disablement of satellites, ground reference stations, service data processors, or data dissemination means via which the clock predictions are received.
| 15. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, further comprising the step of employing Receiver Autonomous Integrity Monitoring (RAIM) to weight a fusion of other sensors selected from at least one camera, lidar receiver, or radar receiver.
| 16. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, further comprising the step of forming coherent cross-correlations across at least one pair of satellites to combat potential interference.
| 17. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein said GNSS and LEO satellites have known oscillator stabilities, and further comprising the step of receiving precise clock predictions of the GNSS and LEO satellites from the at least one service data processor and enduring subsequent loss of one or more clock predictions due to disablement of ground reference stations, service data processors, or data dissemination means via which the clock predictions are received.
| 18. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein the method is carried out despite enduring subsequent loss of one or more clock predictions due to disablement of ground reference stations, service data processors, or data dissemination means therebetween or therefrom.
| 19. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein said at least one LEO satellite is included in a constellation of said LEO satellites that minimize the number of required PRN codes through PRN code re-use.
| 20. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, further comprising the steps of: (a) the user device, at such time as it is moving, receiving broadcasting signals from one or more terrestrial, free-running, pre-surveyed pseudolites of known oscillator stability and measuring carrier phase pseudorange information therefrom, and (b) incorporating the pre-surveyed locations and known oscillator stability of said pseudolites in said position computation.
| 21. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 20, wherein said pseudolites broadcast in the 5.9 GHz band.
| 22. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 20, wherein some or all of said pseudolites are mounted at street level.
| 23. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 20, wherein some or all of said pseudolites are mounted at an elevated position relative to said at least one user device.
| 24. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, further comprising the steps of: (a) receiving pseudorange information from multi-band LEO, single-band LEO, and GNSS satellites; (b) collecting service data processor precise orbit and clock predictions of both the LEO and GNSS satellites and road-specific ionosphere and troposphere estimates; (c) applying said road-specific estimates to correct said single-band LEO satellite pseudoranges.
| 25. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 24, wherein one or more of said single-band LEO satellite signals are broadcast in the band centered at 1,575,420,000 Hz.
| 26. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 24, wherein one or more of said single-band LEO satellite signals are broadcast in the band spanning 1,616,000,000 to 1,626,500,000 Hz.
| 27. A service data processor for supporting resilient carrier phase positioning of user devices utilizing at least one service data processor connected to the user devices by respective communication links, measurements received from GNSS satellites, and measurements received from LEO satellites, said measurements including carrier phase pseudorange information, comprising: (a) means for accepting said measurements from (i) at least one of said GNSS satellites by at least one LEO satellite, (ii) at least one of said GNSS satellites and said at least one LEO satellite by at least one ground reference station, and/or (iii) at least one other LEO satellite to the at least one LEO satellite via a LEO-to-LEO crosslink transmission; (b) means for generating precise orbit and clock predictions for the at least one LEO satellite from available said pseudorange information received by the at least one LEO satellite; and (c) means for disseminating said predictions to the user devices over the communications links to enable the user devices to take into account the precise orbit and clock predictions when computing respective positions of the user devices upon receiving respective said measurements from GNSS and LEO satellites.
| 28. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein (a) the accepted measurements are received from (i) at least one of said GNSS satellites by the at least one LEO satellite and (ii) at least one of said GNSS satellites and the at least one LEO satellite by the at least one ground reference station; and (b) the generated orbit predictions are from said pseudorange information received from at least one of said GNSS satellites by the at least one LEO satellite, and the generated clock predictions are from said pseudorange information received from at least one of said GNSS satellites and the at least one LEO satellite by the at least one ground reference station.
| 29. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein the measurements received from the at least one other LEO satellite by the at least one ground reference station are from configurations wherein the at least one ground reference station is outside the footprint of the at least one LEO satellite.
| 30. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein measurements received from LEO satellites by ground reference stations are unavailable.
| 31. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein measurements received from GNSS satellites by LEO satellites are unavailable.
| 32. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein (i) said at least one LEO satellite includes an oscillator of known stability coupled coherently to a receiver for use in measuring carrier phase pseudorange information from said GNSS satellites or from other LEO satellites and a transmitter for use in broadcasting carrier phase to be received by said ground reference stations and (ii) the at least one user device endures loss of one or more clock predictions due to disablement of satellites, ground reference stations, service data processors, or data dissemination means via which the clock predictions are channeled.
| 33. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein said service data processor is spaceborne.
| 34. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 33, further including coupled transmitters and receivers provided in an integrated circuit chipset hosted by said LEO satellite.
| 35. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein said at least one service data processor is integrated into a WAAS master station or a precise point positioning network operations center.
| 36. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein said disseminating means utilizes SBAS satellites, Inmarsat Narrowband, NDGPS data broadcast, VHF aviation radio, 4G LTE, DOT ITS V2I 5.9 GHz standard broadcast, or said LEO satellites.
| 37. A user device supported by at least one service data processor, the at least one service data processor connected to a plurality of user devices by respective communication links, to utilize measurements received from GNSS satellites and measurements received from LEO satellites in order to compute a position of the user device, said measurements including carrier phase pseudorange information, comprising: (a) accepting means for accepting precise orbit and clock predictions disseminated by the at least one service data processor for at least one LEO satellite, the precise orbit and clock predictions being generated from available pseudorange information accepted by the at least one service data processor received from (i) at least one GNSS satellite by the at least one LEO satellite, (ii) at least one GNSS satellite and the at least one LEO satellite by at least one ground reference station, and/or (iii) at least one other LEO satellite by the at least one LEO satellite as a LEO-to-LEO crosslink transmission; and (b) computing means for computing the position of the user device by taking into account the precise orbit and clock predictions when computing the position upon receiving said measurements from GNSS and LEO satellites.
| 38. A user device supported by at least one service data processor as claimed in claim 37, wherein (i) the precise orbit predictions are generated from pseudorange information accepted by the at least one service data processor received from at least one GNSS satellite by the at least one LEO satellite and (ii) the precise clock predictions are generated from pseudorange information accepted by the at least one service data processor received from the at least one LEO satellite by the at least one ground reference station.
| 39. A user device supported by at least one service data processor as claimed in claim 37, wherein the pseudorange information received from the at least one other LEO satellite by the at least one ground reference station is from configurations wherein the at least one ground reference station is outside the footprint of the at least one LEO satellite.
| 40. A user device supported by at least one service data processor as claimed in claim 37, wherein pseudorange information received from LEO satellites by ground reference stations is unavailable.
| 41. A user device supported by at least one service data processor as claimed in claim 37, wherein pseudorange information received from GNSS satellites by LEO satellites is unavailable.
| 42. A user device supported by at least one service data processor as claimed in claim 37, wherein (i) said at least one LEO satellite includes an oscillator of known stability coupled coherently to a receiver for use in measuring carrier phase pseudorange information from said GNSS satellites or from other LEO satellites and a transmitter for use in broadcasting carrier phase to be received by said ground reference stations and (ii) the at least one user device endures loss of one or more clock predictions due to disablement of satellites, ground reference stations, service data processors, or data dissemination means via which the clock predictions are received.
| 43. A user device supported by at least one service data processor as claimed in claim 37, wherein said computing means is coupled to a Receiver Autonomous Integrity Monitoring (RAIM) device.
| 44. A user device supported by at least one service data processor as claimed in claim 37, wherein said computing means is coupled to means for employing said RAIM device to weight the fusion of other sensors.
| 45. A user device supported by at least one service data processor as claimed in claim 44, wherein said other sensors include at least one of a camera and a lidar or radar receiver.
| 46. A user device supported by at least one service data processor as claimed in claim 37, wherein LEO signals broadcast from each said LEO satellite to each said ground reference station and said user device use frequency bands that are the same as those used by GNSS satellites.
| 47. A user device supported by at least one service data processor as claimed in claim 46, wherein said LEO signals are consistent with legacy or modern GNSS PRN codes.
| 48. A user device supported by at least one service data processor as claimed in claim 47, wherein said GNSS PRN codes are selected from the following GNSS PRN codes: GPS C/A, GPS P(Y), GPS M, GPS M', GPS L5, GPS L2C, GPS Ll C, Galileo El, Galileo E5a, Galileo E5b, Galileo E5, and Galileo E6.
| 49. A user device supported by at least one service data processor as claimed in claim 46, wherein said LEO satellite signals are codes generated by a 128-bit AES counter producing a chipping rate of an integer multiple of 1,023,000 chips per second.
| 50. A user device supported by at least one service data processor as claimed in claim 37, further comprising means for: (a) the user device in motion receiving signals broadcast by one or more terrestrial, free-running, pre-surveyed pseudolites of known oscillator stability, the signals from the pseudolites including carrier phase pseudorange information and (b) incorporating the pre-surveyed locations and oscillator stabilities of said pseudolites in said position calculation.
| 51. A user device supported by at least one service data processor as claimed in claim 50, wherein said pseudolites broadcast in the 5.9 GHz band.
| 52. A user device supported by at least one service data processor as claimed in claim 50, wherein some or all of said pseudolites are mounted at street level.
| 53. A user device supported by at least one service data processor as claimed in claim 50, wherein some or all of said pseudolites are mounted above where street vehicles operate. | The method involves providing a service data processor accepting measurements received from one of the global navigation satellite system (GNSS) satellites and one of the low earth orbit (LEO) satellites. The precise orbit and clock predictions for the LEO satellite are generated. The predictions are disseminated to the user device to enable the user device to take into account the precise orbit and clock predictions when computing the position upon receiving signals and measuring additional carrier phase pseudo-ranges from GNSS and LEO satellites. INDEPENDENT CLAIMS are included for the following: a service data processor for supporting resilient carrier phase positioning of a user device; a user device; a method for carrier phase positioning of a user device; a method for localizing an emitter; a method for GNSS signal authentication; a method for user position authentication; a method for fielding a positioning service for one or more users; a method for generating regional, high-power navigation signals; a system for thermal control of a high-power regional navigation satellite system; a method for providing agile, robust, and cost-effective services; and a method of beam forming a space-borne distributed aperture. Method for supporting resilient carrier phase positioning of a user device such as a subscriber vehicle (claimed), manned aircraft and unmanned aircraft. The performance of the GNSS is improved. The performance metrics including the accuracy, integrity, time to acquire, interference rejection, and spoofing protection for supporting resilient carrier phase positioning of a user device are improved. The disseminating process conforms to 4G LTE. The drawing shows a schematic view illustrating the process for supporting resilient carrier phase positioning of a user device.
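As a rough illustration of how a user device could use the disseminated precise orbit and clock predictions described in claims 1, 9, and 37 above, the sketch below applies predicted satellite clock biases to raw pseudoranges and runs a linearized least-squares position solve. The correction model and the four hypothetical satellites are assumptions drawn from standard GNSS practice, not text from the patent.

```python
# Illustrative sketch only: apply disseminated clock predictions to pseudoranges,
# then solve for position with Gauss-Newton least squares. The sign convention
# (adding c * predicted satellite clock bias) and the simulated geometry are
# assumptions for this example, not the patented processing chain.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def corrected_pseudoranges(raw_pr_m, sat_clock_bias_s):
    """Remove the predicted satellite clock bias from raw pseudoranges."""
    return np.asarray(raw_pr_m) + C * np.asarray(sat_clock_bias_s)

def solve_position(sat_pos_m, pr_m, x0=np.zeros(4), iters=8):
    """Iterative least-squares solve for (x, y, z, receiver clock bias * c)."""
    x = x0.astype(float)
    for _ in range(iters):
        rho = np.linalg.norm(sat_pos_m - x[:3], axis=1)       # geometric ranges
        pred = rho + x[3]                                      # plus receiver clock term
        H = np.hstack([(x[:3] - sat_pos_m) / rho[:, None], np.ones((len(pr_m), 1))])
        dx, *_ = np.linalg.lstsq(H, pr_m - pred, rcond=None)   # linearized update
        x += dx
    return x

if __name__ == "__main__":
    # Four hypothetical satellites (a mix of GNSS and LEO) with predicted positions.
    sats = np.array([[15e6, 10e6, 20e6], [-12e6, 14e6, 18e6],
                     [8e6, -16e6, 21e6], [-9e6, -11e6, 23e6]])
    truth = np.array([1.2e6, 4.5e6, 4.0e6, 30.0])              # receiver state
    pr = np.linalg.norm(sats - truth[:3], axis=1) + truth[3] - C * 1e-6
    pr = corrected_pseudoranges(pr, [1e-6] * 4)                # apply clock predictions
    print(solve_position(sats, pr))                            # recovers `truth`
```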
Please summarize the input | METHOD AND APPARATUS FOR IDENTIFYING AUTONOMOUS VEHICLES IN AUTONOMOUS DRIVING ENVIRONMENT. Disclosed is an autonomous vehicle identification method and apparatus for identifying autonomous vehicles using V2X communication data in an autonomous driving environment. An autonomous vehicle identification method performed by an autonomous vehicle identification device installed in a roadside device includes determining whether a data frame or a data element indicating an autonomous driving state or an autonomous driving level exists in a probe vehicle data (PVD) message received from a first vehicle that has entered the V2I communication area of the roadside device, and, if the data frame or data element does not exist, identifying the vehicle that has transmitted the PVD message as a general vehicle.|1. An autonomous vehicle identification method performed by an autonomous vehicle identification device installed in a roadside device, comprising: receiving a probe vehicle data (PVD) message from a vehicle that has entered a vehicle to infrastructure (V2I) communication area of the roadside device;
determining whether a data element indicating an autonomous driving level exists in the PVD message; and if the data element does not exist in the PVD message, identifying a vehicle that has transmitted the PVD message as a general vehicle, wherein the data element indicating the autonomous driving level is DE_AutonomousLevel, the data frame including the data element is a specific data frame representing an autonomous driving state, and the specific data frame further includes any one or more of a data element (DE_ODDinfo) defining operation design domain (ODD) information and a data element (DE_FallbackStatus) defining fallback information.
| 2. The method of claim 1, further comprising identifying a vehicle that has transmitted the PVD message as an autonomous vehicle when the data element indicates an autonomous driving level of one or more.
| 3. The autonomous vehicle identification method according to claim 2, further comprising identifying a vehicle that has transmitted the PVD message as a non-autonomous vehicle or a connected car without an autonomous driving function, if the data element indicates an autonomous driving level of less than 1.
| 4. The method of claim 1, wherein the specific data frame is an AutonomousStatus indicating an autonomous driving state.
| 5. An autonomous vehicle identification method performed by an autonomous vehicle identification device installed in a roadside device, comprising: receiving a probe vehicle data (PVD) message from a vehicle that has entered a vehicle to infrastructure (V2I) communication area of the roadside device;
determining whether a data element indicating an autonomous driving level exists in the PVD message;
determining whether the data element indicates one or more autonomous driving levels; and identifying a vehicle transmitting the PVD message as an autonomous vehicle when the data element indicates one or more autonomous driving levels,
wherein the data element indicating the autonomous driving level is DE_AutonomousLevel, the data frame including the data element is a specific data frame representing an autonomous driving state, and the specific data frame further includes any one or more of a data element (DE_ODDinfo) defining operation design domain (ODD) information and a data element (DE_FallbackStatus) defining fallback information.
| 6. The method according to claim 5, further comprising: identifying a vehicle transmitting the PVD message as a non-autonomous vehicle or a connected car without an autonomous driving function when the data element indicates an autonomous driving level of less than 1; and if the data element does not exist in the PVD message, identifying the vehicle transmitting the PVD message as a normal vehicle.
| 7. The method of claim 5, wherein the specific data frame is an AutonomousStatus indicating an autonomous driving state.
| 8. An autonomous vehicle identification device installed in a roadside device to identify an autonomous vehicle, comprising: a wireless communication module supporting vehicle to everything (V2X) communication; and at least one processor connected to the wireless communication module, wherein the at least one processor receives, through the wireless communication module, a probe vehicle data (PVD) message from a first vehicle that has entered a vehicle to infrastructure (V2I) communication area of the roadside device, determines whether a data element indicating an autonomous driving level exists in the PVD message, determines whether the data element indicates one or more autonomous driving levels, and identifies the vehicle having transmitted the PVD message as an autonomous vehicle when one or more autonomous driving levels are indicated, wherein the data element indicating the autonomous driving level is DE_AutonomousLevel, the data frame including the data element is a specific data frame representing the autonomous driving state, and the specific data frame further includes any one or more of a data element (DE_ODDinfo) defining operation design domain (ODD) information and a data element (DE_FallbackStatus) defining fallback information.
| 9. The apparatus according to claim 8, wherein the at least one processor identifies the vehicle transmitting the PVD message as a non-autonomous vehicle or a connected car not equipped with an autonomous driving function when the data element indicates an autonomous driving level of less than 1, and identifies the vehicle that has transmitted the PVD message as a general vehicle if the data element does not exist in the PVD message.
| 10. The apparatus of claim 8, wherein the specific data frame is an AutonomousStatus indicating an autonomous driving state. | The method involves receiving a probe vehicle data (PVD) message from a vehicle, and determining whether a data element indicating an autonomous driving level exists in the PVD message. A specific data frame is provided for representing the autonomous driving state, where the data frame includes a data element that defines operation design domain (ODD) information and a data element that defines fallback status. The data element is provided within a data frame that represents an autonomous status. An INDEPENDENT CLAIM is included for an autonomous vehicle identification device for identifying autonomous vehicles. Method for identifying autonomous vehicle i.e. car using autonomous vehicle identification device installed in roadside device in autonomous driving environment. The method enables effectively identifying whether a target vehicle within a communication area is an autonomous vehicle or a vehicle equipped with an autonomous driving function. The method enables defining an identification factor in a message frame in compliance with a V2X communication data standard defined in SAE J2735, so that scenarios for future driving negotiations can be effectively responded to. The drawing shows a block diagram of a method for identifying an autonomous vehicle in an autonomous driving environment. 100 Roadside device, 110 Processor, 120 Memory, 130 Transceiver, 150 Communication module
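The classification rule in the claims above reduces to a small decision: a PVD message whose autonomous-status data frame carries DE_AutonomousLevel of 1 or more identifies an autonomous vehicle, a level below 1 identifies a non-autonomous or connected vehicle, and a missing element identifies a general vehicle. The sketch below assumes a plain dictionary stands in for a decoded SAE J2735 PVD message; that representation is an illustration only.

```python
# Sketch of the vehicle-classification rule from claims 1-3 and 8-10.
# A real roadside unit would decode an ASN.1-encoded SAE J2735 PVD message;
# here a dict stands in for the decoded structure (an assumption).
def classify_vehicle(pvd_message: dict) -> str:
    status_frame = pvd_message.get("AutonomousStatus")  # specific data frame
    if not status_frame or "DE_AutonomousLevel" not in status_frame:
        return "general vehicle"
    level = status_frame["DE_AutonomousLevel"]
    if level >= 1:
        return "autonomous vehicle"
    return "non-autonomous or connected vehicle without autonomous driving function"

if __name__ == "__main__":
    print(classify_vehicle({"AutonomousStatus": {"DE_AutonomousLevel": 3,
                                                 "DE_ODDinfo": "highway",
                                                 "DE_FallbackStatus": "ready"}}))
    print(classify_vehicle({"AutonomousStatus": {"DE_AutonomousLevel": 0}}))
    print(classify_vehicle({}))  # no autonomous-status frame at all
```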
Please summarize the input | Driving negotiation method and apparatus. PROBLEM TO BE SOLVED: To provide a driving negotiation method and apparatus for supporting stability against blind spots, unexpected situations, etc., and rapid judgement and response of an autonomous vehicle in various driving environments.
SOLUTION: A driving negotiation apparatus includes a wireless communication module configured to support V2X (vehicle to everything) communication and at least one processor connected to the wireless communication module. The at least one processor receives a cooperative request message from a first vehicle, broadcasts a cooperative request message and additional information required for a negotiation to surrounding vehicles, receives a cooperative response message from at least one second vehicle among the surrounding vehicles and transmits a message indicating that the negotiation is possible or impossible to the first vehicle on the basis of the cooperative response message.
SELECTED DRAWING: Figure 3|1. A travel negotiation device comprising: a wireless communication module which supports V2X (vehicle to everything) communication; and at least one processor connected to the wireless communication module, wherein the at least one processor receives a cooperation request message (CooperativeRequestMsg) from a first vehicle, broadcasts to surrounding vehicles a cooperation relay message including a type code of additional information required for negotiation in the cooperation request message, receives a cooperation response message (CooperativeReponseMsg) corresponding to the cooperation relay message from at least one second vehicle among the surrounding vehicles, and transmits a message indicating whether negotiation is possible or impossible to the first vehicle based on the cooperation response message.
| 2. The travel negotiation device according to claim 1, wherein the cooperation request message or the cooperation response message includes data frames (DFs) for vehicle speed, lane change, and lane merging; a first data frame for the vehicle speed includes, as negotiation message information about a speed adjustment plan, a first data element for overtaking, deceleration, and stopping; a second data frame for the lane change includes, as negotiation message information about a lane change plan, a second data element for accident avoidance, interruption (cut-in), and a pedestrian; and a third data frame for the lane merging includes, as negotiation message information about a merging plan, a third data element for a merging path, an intersection, and a rotation intersection (roundabout).
| 3. The travel negotiation device according to claim 1, wherein the cooperation request message or the cooperation response message includes data elements for a time stamp (timestamp), a vehicle identifier (id or TemporaryID), a message identifier (UniqueMSG_ID, messangeid or MsgID), a previous message identifier (previousmessageid or preMsgID), and message information (infoMsg).
| 4. The travel negotiation device according to claim 3, wherein the message identifier that is a data element of the cooperation request message is an identification value of the present travel negotiation session and is used to define the process from the travel negotiation request to the response as one session.
| 5. The travel negotiation device according to claim 3, wherein the message information that is a data element of the cooperation request message is a sequence of message lists or message types required for the negotiation information (SequenceofMessageType), and is received from the first vehicle in a null state in the first travel negotiation.
| 6. The travel negotiation device according to claim 1, wherein the at least one processor, in relation to the additional information, generates a related message according to the message type of the cooperation request message before broadcasting the cooperation relay message to the surrounding vehicles, a preset type code of the related message being included in the cooperation request message, and the at least one processor, when broadcasting the cooperation relay message, broadcasts the related message together with the cooperation relay message.
| 7. The travel negotiation device according to claim 1, wherein the cooperation response message includes a response value to the negotiation of the second vehicle itself and a data field of a response type including information necessary for negotiation, and includes data elements for the message identifier of the first vehicle and the previous message identifier.
| 8. The travel negotiation device according to claim 7, wherein the response value for the negotiation includes a data element for consent (agree) or rejection (refuse), and the wireless communication module is installed in a roadside base station or a roadside unit (road side unit, RSU).
| 9. The travel negotiation device according to claim 1, wherein the message indicating whether negotiation is possible or impossible is transmitted to the first vehicle in the form of a broadcast.
| 10. The travel negotiation device according to claim 1, wherein the at least one processor, after transmitting the message indicating whether negotiation is possible or impossible to the first vehicle, further receives from the first vehicle a message for renegotiation that sets the identification value of the present travel negotiation session to the identification value of the previous travel negotiation session, or a further cooperation request message.
| 11. A travel negotiation method comprising: receiving a cooperation request message (CooperativeRequestMsg) from a first vehicle; generating a cooperation relay message including a type code of additional information required for negotiation in the cooperation request message; broadcasting the cooperation relay message to surrounding vehicles; receiving a cooperation response message (CooperativeReponseMsg) corresponding to the cooperation relay message from at least one second vehicle among the surrounding vehicles; and transmitting a message indicating whether negotiation is possible or impossible to the first vehicle based on the cooperation response message.
| 12. The travel negotiation method according to claim 11, wherein the cooperation request message or the cooperation response message includes data frames (DFs) for vehicle speed, lane change, and lane merging; a first data frame for the vehicle speed includes, as negotiation message information about a speed adjustment plan, a first data element for overtaking, deceleration, and stopping; a second data frame for the lane change includes, as negotiation message information about a lane change plan, a second data element for accident avoidance, interruption (cut-in), and a pedestrian; and a third data frame for the lane merging includes, as negotiation message information about a merging plan, a third data element for a merging path, an intersection, and a rotation intersection (roundabout).
| 13. The travel negotiation method according to claim 11, wherein the cooperation request message or the cooperation response message includes data elements for a time stamp (timestamp), a vehicle identifier (id or TemporaryID), a message identifier (UniqueMSG_ID, messangeid or MsgID), a previous message identifier (previousmessageid or preMsgID), and message information (infoMsg).
| 14. The travel negotiation method according to claim 13, wherein the message identifier that is a data element of the cooperation request message is an identification value of the present travel negotiation session and is used to define the process from the travel negotiation request to the response as one session.
| 15. The travel negotiation method according to claim 13, wherein the message information that is a data element of the cooperation request message is a sequence of message lists or message types required for the negotiation information (SequenceofMessageType), and is received from the first vehicle in a null state during the first travel negotiation.
| 16. The travel negotiation method according to claim 11, further comprising, in relation to the additional information, generating a related message according to the message type of the cooperation request message before the cooperation relay message is broadcasted to the surrounding vehicles, wherein, when the cooperation relay message is broadcasted to the surrounding vehicles, the related message is broadcasted together with the cooperation relay message.
| 17. The travel negotiation method according to claim 11, wherein the cooperation response message includes a response value to the negotiation of the second vehicle itself and a data field of a response type including information necessary for negotiation, and includes data elements for the message identifier of the first vehicle and the previous message identifier.
| 18. The travel negotiation method according to claim 17, wherein the response value for the negotiation includes a data element for consent (agree) or rejection (refuse).
| 19. The travel negotiation method according to claim 11, wherein the message indicating whether negotiation is possible or impossible is transmitted to the first vehicle in the form of a broadcast.
| 20. The travel negotiation method according to claim 11, further comprising, after transmitting the message indicating whether negotiation is possible or impossible to the first vehicle, receiving from the first vehicle a message for renegotiation that sets the identification value of the current travel negotiation session to the identification value of the previous travel negotiation session, or a further cooperation request message. | The device (100) has a wireless communication module supporting vehicle to everything (V2X) communication. A processor (110) is connected to the wireless communication module, receives a cooperation request message (CooperativeRequestMsg) from the first vehicle, broadcasts to surrounding vehicles a cooperation relay message including a type code of additional information required for negotiation in the cooperation request message, receives a cooperation response message (CooperativeReponseMsg) corresponding to the cooperation relay message from a second vehicle among the surrounding vehicles, and transmits a message indicating whether negotiation is possible to the first vehicle based on the cooperation response message. The cooperation request message or the cooperation response message includes a timestamp, a vehicle identifier (id or TemporaryID), and a message identifier (UniqueMSG-ID, messangeid, or MsgID). The message identifier is a data element of the cooperation request message. An INDEPENDENT CLAIM is included for a driving negotiation method. Driving negotiation device for supporting the stability of autonomous vehicles against blind spots or unexpected situations. The stability and ability of autonomous vehicles to cope with blind spots or unexpected situations are improved. The driving stability of vehicles such as autonomous vehicles is improved by providing a message set and negotiation process definition for V2I communication-based driving negotiation in a C-ITS environment. The drawing shows a block diagram for explaining the main configuration of a driving negotiation apparatus for executing the driving negotiation method. (Drawing includes non-English language text) 100 Driving negotiation device, 110 Processor, 120 Memory, 130 Transceiver, 150 Wireless communication module
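A hedged sketch of the RSU-side flow in the claims above: relay the CooperativeRequestMsg with the type code of the additional information needed, collect CooperativeReponseMsg replies from surrounding vehicles, and broadcast whether negotiation is possible. The field names mirror the data elements named in the claims (MsgID, preMsgID, infoMsg); the unanimity rule used to decide the outcome is an assumption, since the claims do not fix an aggregation policy.

```python
# Minimal sketch of the roadside-unit negotiation relay described in claims 1 and 11.
# The dict-based message layout and the "all responders must agree" decision rule
# are assumptions for illustration; the claims only name the data elements.
import uuid

def build_relay(request: dict, type_code: str) -> dict:
    """Wrap the first vehicle's request into a cooperation relay message."""
    return {"MsgID": str(uuid.uuid4()),          # identifies this negotiation session
            "preMsgID": request["MsgID"],
            "infoMsg": type_code,                # e.g. "SequenceofMessageType:laneMerge"
            "payload": request}

def decide(responses: list[dict]) -> str:
    """Aggregate agree/refuse replies into a negotiable / non-negotiable verdict."""
    if responses and all(r.get("response") == "agree" for r in responses):
        return "negotiable"
    return "non-negotiable"

if __name__ == "__main__":
    request = {"MsgID": "veh1-0001", "TemporaryID": "veh1", "infoMsg": None}
    relay = build_relay(request, "laneMerge")    # broadcast to surrounding vehicles
    responses = [{"TemporaryID": "veh2", "preMsgID": relay["MsgID"], "response": "agree"},
                 {"TemporaryID": "veh3", "preMsgID": relay["MsgID"], "response": "agree"}]
    print(decide(responses))                     # broadcast the result to the first vehicle
```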
Please summarize the input | Systems And Methods Using Artificial Intelligence For Routing Electric Vehicles. The present invention provides specific systems, methods and algorithms based on artificial intelligence expert system technology for determination of preferred routes of travel for electric vehicles (EVs). The systems, methods and algorithms provide such route guidance for battery-operated EVs en route to a desired destination, but lacking sufficient battery energy to reach the destination from the current location of the EV. The systems and methods of the present invention disclose use of one or more specifically programmed computer machines with artificial intelligence expert system battery energy management and navigation route control. Such specifically programmed computer machines may be located in the EV and/or cloud-based or remote computer/data processing systems for the determination of preferred routes of travel, including intermediate stops at designated battery charging or replenishing stations. Expert system algorithms operating on combinations of expert defined parameter subsets for route selection are disclosed. Specific fuzzy logic methods are also disclosed based on defined potential route parameters with fuzzy logic determination of crisp numerical values for multiple potential routes and comparison of those crisp numerical values for selection of a particular route. Application of the present invention systems and methods to autonomous or driver-less EVs is also disclosed. The embodiments of the invention in which an exclusive property or privilege is claimed are defined as follows:
| 1. An artificial intelligence (AI) Electric Vehicle (EV) route optimization method comprising:
an electronic, specifically programmed, communication computer AI system performing EV route optimization for travel of said EV from a designated origin location or EV present location to an EV designated destination location with intermediate stops at intervening battery charging stations to maintain battery charge levels;
storing in memory one or more EV attribute parameters comprising EV operational status parameters, EV location parameters, or EV battery status parameters;
derivation of EV potential route condition parameters for said EV based on information exchanges with at least two of: (1) communication network connections with application servers, (2) communication network connections with other motor vehicles, (3) communication network connections with pedestrians, and (4) communication network connections with roadside monitoring and control units;
storing in memory expert defined propositional logic inference rules specifying multiple multidimensional conditional relationships between two or more of said EV attribute parameters and EV potential route condition parameters, and with expert defined individual parameter degree of danger value ranges;
AI evaluation and assignment of expert defined value ranges to selected of said EV attribute parameters and selected of said EV potential route condition parameters and wherein said expert defined value ranges depend on individual parameter importance to EV route optimization;
storing in memory expert defined propositional logic inference rules defining multiple range dependent conditional relationships between two or more interrelated multidimensional parameters comprising selected said EV attribute parameters and selected said EV potential route condition parameters;
AI evaluation of EV potential routes of travel from said EV designated origin location or EV present location to said EV designated destination location based on said EV attribute parameters, said EV potential route condition parameters and said expert defined propositional logic inference rules, and further wherein EV potential routes of travel include visiting battery charging stations as necessary to maintain proper EV battery charge levels to reach said EV designated destination location; and,
AI expert system optimization of selection of a particular route of travel based on said AI evaluation of said EV potential routes of travel comprising expert system analysis of one or more multidimensional combinations of said two or more interrelated multidimensional parameters of said EV attribute parameters and said EV potential route condition parameters.
| 2. The AI EV route optimization method of claim 1 further comprising accessing said EV potential route condition parameters using Internet telecommunications technology.
| 3. The AI EV route optimization method of claim 1 further comprising accessing of said EV potential route condition parameters using cellular communication technology to receive or transmit information between said EV and said external information sources.
| 4. The AI EV route optimization method of claim 1 further comprising exchanging selected of said EV attribute parameters of said EV with other motor vehicles or via remote information source facilities.
| 5. The AI EV route optimization method of claim 1 wherein said EV route optimization is further based upon battery charging station usage and actual or probable requests for route guidance from other EVs traveling within a defined distance from said EV present location, and further wherein such information that is accessed from said other EVs affects the expected waiting times or queues encountered at battery charging stations on possible routes of travel.
| 6. The AI EV route optimization method of claim 1 wherein said EV present location information is derived from motor vehicle GPS (Global Positioning System) signal sensors or from determination of the distance of said EV from cellular telephone towers or other known fixed locations transmitting signals received by one of the EV receivers.
| 7. The AI EV route optimization method of claim 1 wherein said potential route condition parameters comprise dynamic roadway conditions further comprising one or more of traffic congestion, weather conditions, police reported concerns, or other dynamic roadway condition information received from external information source database or data processing units.
| 8. The AI EV route optimization method of claim 1 wherein EV route selection decisions comprise consideration of potential dynamically changing charging requirements from other vehicles within a defined radius or distance from said EV present location.
| 9. The AI EV route optimization method of claim 1 wherein communicating with said external information sources further comprises operating an RFID (radio frequency identification) tag device used to identify the EV and communicate information with RFID tag readers located along highways, tollways, or roadways along which the EV is traveling.
| 10. The AI EV route optimization method of claim 7 wherein said external information source database or data processing units are cloud based and are accessed through the Internet or cellular telephone communication networks.
| 11. The AI EV route optimization method of claim 1 further comprising Bluetooth, Wi-Fi or other voice or data telecommunication capabilities for communicating with charging stations or other nearby vehicles present in ongoing traffic or waiting for use of charging stations.
| 12. The AI EV route optimization method of claim 1 wherein said EV potential route condition parameters from external information sources comprise pedestrian or crowd information.
| 13. The AI EV route optimization method of claim 1 wherein said EV accesses information from communication network applications.
| 14. The AI EV route optimization method of claim 13 wherein EV access of said communication network applications further comprising one or more of a Navigation System Application, Traffic Database Application, EV Account Application, Battery Charger/Replacement Station Application, Weather Data Application, Police Report Application, Special Event Application, or Road Condition Application.
| 15. The AI EV route optimization method of claim 14 wherein said Traffic Database Application comprises EV vehicle traffic congestion or density data.
| 16. The AI EV route optimization method of claim 14 wherein said Special Event Application comprises traffic or crowd congestion arising from special events along potential routes of travel.
| 17. The AI EV route optimization method of claim 1 wherein said EV attribute parameters and said EV potential route condition parameters from external information sources are stored in a remote database and wherein said remote database may be accessed and updated via vehicle-to-network connections.
| 18. The AI EV route optimization method of claim 1 wherein said EV is a driverless or autonomous driving vehicle.
| 19. An artificial intelligence (AI) Electric Vehicle (EV) route optimization system comprising:
an electronic, specifically programmed, communication computer AI system performing EV route optimization for travel of said EV from an EV designated origin location or EV present location to an EV designated destination location with intermediate stops at intervening battery charging stations to maintain battery charge levels;
a memory for storing one or more EV attribute parameters comprising EV operational status parameters, EV location parameters, or EV battery status parameters;
derivation of EV potential route condition parameters for said EV based on information exchanges with at least two of: (1) communication network connections with application servers, (2) communication network connections with other motor vehicles, (3) communication network connections with pedestrians, and (4) communication network connections with roadside monitoring and control units;
evaluating and assigning AI expert defined value ranges to selected of said EV attribute parameters and selected of said EV potential route condition parameters and wherein said expert defined value ranges depend on individual parameter importance to EV route optimization;
a memory for storing expert defined propositional logic inference rules defining multiple range dependent conditional relationships between two or more interrelated multidimensional parameters comprising selected said EV attribute parameters and selected said EV potential route condition parameters;
AI evaluation of EV potential routes of travel from said EV designated origin location or EV present location to said EV designated destination location based on said EV attribute parameters, said EV potential route of travel parameters and said expert defined propositional logic inference rules, and further wherein said EV potential routes of travel include visiting battery charging stations as necessary to maintain proper EV battery charge levels to reach said EV designated destination location; and,
AI expert system optimization of selection of a particular route of travel based on said AI evaluation of said EV potential routes of travel comprising expert system analysis of one or more multidimensional combinations of said two or more interrelated multidimensional parameters of said EV attribute parameters and said EV potential route condition parameters.
| 20. The artificial intelligence (AI) Electric Vehicle (EV) route optimization system of claim 19 further comprising accessing said EV potential route condition parameters using cellular or Internet telecommunications technology. | The method involves performing electric vehicle (EV) route optimization for travel of the EV from a designated origin location or EV present location to an EV designated destination location with intermediate stops at intervening battery charging stations (104) to maintain battery charge levels. One or more EV attribute parameters comprising EV operational status parameters, EV location parameters, or EV battery status parameters are stored in memory. The EV potential route condition parameters for the EV are derived based on information exchanges with at least two of communication network connections with application servers. The AI expert system optimization of selection of a particular route of travel based on artificial intelligence (AI) evaluation of EV potential routes of travel is performed. An INDEPENDENT CLAIM is included for an AI EV route optimization system. AI EV route optimization method. The method enables providing efficient routing algorithms that can be employed in real time without excessive and complex computation and that consider multiple factors such as battery charging-replacement station locations, required time of travel, roadway conditions, traffic congestion, weather conditions and/or emergency traffic considerations, thus improving EV operational usefulness through determination of preferred routes of travel where the preferred routes include intermediate charging or replacement of EV batteries as required. The drawing shows a schematic diagram illustrating the configuration of a driving situation with recharging stations benefiting from a routing and control system, without limitation. 101 Driving area, 102 GPS satellite, 103 Destination, 104 Charging station, 105 Particular automotive vehicle
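The abstract above mentions fuzzy-logic determination of crisp numerical values for multiple potential routes and selection of the route with the best value. The sketch below shows one way such a scoring step could look; the membership ramps, weights, and parameter names are invented for illustration and are not the expert-defined rules or value ranges of the patent.

```python
# Hedged sketch of fuzzy route scoring: map a few route parameters onto [0, 1]
# memberships, combine them with illustrative weights into one crisp score per
# candidate route, and pick the route with the best score. All numbers here are
# assumptions for the example, not the patent's expert-defined rules.
def ramp(x, lo, hi):
    """Linear membership rising from 0 at `lo` to 1 at `hi`."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def route_score(route: dict) -> float:
    charge_ok = ramp(route["battery_margin_pct"], 0, 40)       # comfortable reserve
    low_wait = 1.0 - ramp(route["charger_wait_min"], 0, 60)    # penalize long queues
    low_congestion = 1.0 - route["congestion_index"]           # 0 (free) .. 1 (jammed)
    weather_ok = 1.0 - route["weather_risk"]                   # 0 .. 1
    return (0.4 * charge_ok + 0.25 * low_wait +
            0.2 * low_congestion + 0.15 * weather_ok)

if __name__ == "__main__":
    candidates = {
        "A": {"battery_margin_pct": 25, "charger_wait_min": 10,
              "congestion_index": 0.6, "weather_risk": 0.2},
        "B": {"battery_margin_pct": 45, "charger_wait_min": 35,
              "congestion_index": 0.3, "weather_risk": 0.1},
    }
    best = max(candidates, key=lambda name: route_score(candidates[name]))
    print(best, {name: round(route_score(r), 3) for name, r in candidates.items()})
```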
Please summarize the input | Method and apparatus for vehicle-mounted enhanced visualization of sensor range and field of view. Some embodiments of the methods disclosed herein may include: receiving the predicted driving route, the sensor range of the sensor on the autonomous vehicle (AV) and the sensor field of view (FOV) data; determining whether a minimum sensor visibility requirement is met along the predicted driving route; predicting a blind area along the predicted driving route, wherein the predicted blind area is determined to have potentially reduced sensor visibility; and AR visualization using augmented reality (AR) display devices to display blind areas.|1. A method, comprising: receiving sensor range and sensor field-of-view (FOV) data of the sensor on the first vehicle; receiving blind area information from the second vehicle; predicting a blind area along the predicted driving path, wherein the predicted blind area is determined to have potentially weakened sensor visibility; and using an augmented reality (AR) display device to display the AR visualization of the blind area, the method further comprising: receiving the predicted driving route; and determining whether the minimum sensor visibility requirement is satisfied along the predicted driving route, wherein determining whether the minimum sensor visibility requirement is satisfied comprises: determining a percentage of a minimum visibility region covered by the field of view (FOV) of the one or more sensors; and determining whether the percentage exceeds a visibility region threshold.
| 2. The method according to claim 1, wherein the first vehicle is a partially autonomous vehicle having at least one of a manual mode or a driver-assisted mode.
| 3. The method according to claim 1 or 2, wherein the first vehicle is a fully autonomous vehicle.
| 4. The method according to claim 1, further comprising: receiving map data; and updating the blind area, wherein updating the blind area comprises comparing the received map data with local dynamic map data.
| 5. The method according to claim 1 or 2, further comprising: determining, based on the blind area information, that the second vehicle is in a blind area within the predicted blind area.
| 6. The method according to claim 5, further comprising: in response to determining that the second vehicle is in the blind area within the predicted blind area, displaying an indication that the second vehicle is in the blind area.
| 7. The method according to claim 1 or 2, further comprising: identifying a blind area reduction technique; and moving the first vehicle from a first position to a second position in response to identifying the blind area reduction technique.
| 8. The method according to claim 7, wherein the blind area reduction technique includes at least one of repositioning the first vehicle or adjusting an orientation of one of the sensors.
| 9. The method according to claim 1 or 2, wherein predicting a blind area comprises determining a visibility region at a plurality of locations along the predicted driving route.
| 10. The method according to claim 9, wherein determining a visibility region comprises simulating sensor visibility at the plurality of locations along the predicted driving route using three-dimensional 3D map data.
| 11. The method according to claim 1 or 2, wherein predicting the blind area comprises continuously estimating the location of the blind area based on a plurality of sensor readings.
| 12. The method according to claim 1 or 2, further comprising: tracking the orientation of the first vehicle, wherein predicting the blind area is based on the orientation of the first vehicle.
| 13. The method according to claim 1 or 2, wherein displaying the AR visualization of the blind area comprises projecting the AR visualization using a vehicle-mounted augmented reality projection system to display the AR visualization.
| 14. The method according to claim 1 or 2, wherein displaying the AR visualization of the blind area comprises overlaying a highlighted display indicating the blind area on a map.
| 15. The method according to claim 1 or 2, wherein displaying the AR visualization of the blind area comprises displaying a contour of an area indicative of the blind area on a map.
| 16. A device comprising: a processor; and a non-transitory computer-readable medium storing instructions that, when executed by the processor, are operable to cause the device to perform the method according to any one of claims 1 to 15.
| 17. The apparatus according to claim 16, further comprising: a group of sensors; a blind area prediction module configured to identify a potential blind area; a driving mode selection module configured to select a driving mode; a communication module configured to receive a vehicle-to-vehicle V2V message; and an augmented reality AR display device.
| 18. A method, comprising: receiving the predicted driving route, the sensor range of the sensor on the vehicle, and the sensor field-of-view (FOV) data; determining whether the minimum sensor visibility requirement is satisfied along the driving route; predicting a blind area along the predicted driving route, wherein the predicted blind area is determined to have potentially weakened sensor visibility; and displaying an AR visualization of the blind area using an augmented reality (AR) display device, wherein determining whether the minimum sensor visibility requirement is met comprises: determining a percentage of a minimum visibility region covered by the field of view (FOV) of the one or more sensors; and determining whether the percentage exceeds a visibility region threshold.
| 19. The method according to claim 18, wherein predicting the blind area along the driving route comprises determining an area where a sensor visibility range along the driving route is less than a minimum sensor visibility range requirement, and wherein the minimum sensor visibility requirement includes the minimum sensor visibility range requirement.
| 20. The method according to claim 18, further comprising determining the minimum sensor visibility requirement along the driving route of the autonomous vehicle AV.
| 21. A device comprising: a processor; and a non-transitory computer-readable medium storing instructions that, when executed by the processor, are operable to cause the device to perform the method according to any one of claims 18 to 20.
| 22. A method, comprising: predicting a blind area along the predicted driving route of an autonomous vehicle (AV) based on the expected limit of the sensor; and when the AV travels along the driving route, using an augmented reality (AR) visualization to provide an indication of the predicted blind area, the method further comprising: receiving the predicted driving route; and determining whether the minimum sensor visibility requirement is satisfied along the predicted driving route, wherein determining whether the minimum sensor visibility requirement is satisfied comprises: determining a percentage of a minimum visibility region covered by the field of view (FOV) of the one or more sensors; and determining whether the percentage exceeds a visibility region threshold. | The method (1700) involves receiving (1702) a predicted driving route, sensor ranges of sensors on a vehicle, and sensor field-of-view data, determining (1704) whether minimum sensor visibility requirements are met along the predicted driving route, predicting (1706) blind areas along the predicted driving route, in which the predicted blind areas are determined to have potentially diminished sensor visibility, and displaying (1708) an augmented reality visualization of the blind areas using an augmented reality display device. An INDEPENDENT CLAIM is also included for an apparatus for in-vehicle augmented reality visualization of sensor range and field of view. Method for in-vehicle augmented reality visualization of sensor range and field of view. Provides blind area reduction techniques in which, responsive to identifying a blind area reduction technique, the vehicle is moved from a first position to a second position, which can help reduce traffic collisions. The drawing shows a flow diagram illustrating an example process for predicting blind areas and displaying a visualization corresponding to the predicted blind areas. 1700 Method, 1702 Receiving predicted route and sensor ranges of sensors, 1704 Determining whether minimum sensor visibility requirements are met along predicted driving route, 1706 Predicting blind areas along predicted driving route, 1708 Displaying augmented reality visualization of blind areas
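Claims 1, 18 and 22 above all turn on one computable test: the percentage of a minimum visibility region that falls inside the sensors' fields of view, compared against a threshold. The sketch below is a hedged illustration of that test only; the 2D grid of region cells, the single range/bearing sensor model, the 80% threshold, and the helper names (cell_visible, region_for, sensors_at) are assumptions, not details from the patent.

```python
import math

# Illustrative sketch of the "percentage of a minimum visibility region covered by
# sensor FOVs" check described in the claims; grid resolution, threshold, and the
# sensor model below are assumptions.

def cell_visible(cell, sensors):
    cx, cy = cell
    for sx, sy, heading, fov_deg, rng in sensors:
        dx, dy = cx - sx, cy - sy
        if math.hypot(dx, dy) > rng:
            continue
        bearing = math.degrees(math.atan2(dy, dx))
        if abs((bearing - heading + 180) % 360 - 180) <= fov_deg / 2:
            return True
    return False

def coverage_percentage(region_cells, sensors):
    visible = sum(cell_visible(c, sensors) for c in region_cells)
    return 100.0 * visible / len(region_cells)

def predict_blind_areas(route_points, region_for, sensors_at, threshold_pct=80.0):
    """Return route points whose minimum-visibility region is insufficiently covered."""
    return [p for p in route_points
            if coverage_percentage(region_for(p), sensors_at(p)) < threshold_pct]

# toy usage: one forward-facing sensor, rectangular minimum-visibility region ahead
route = [(0.0, 0.0), (10.0, 0.0)]
region_for = lambda p: [(p[0] + dx, p[1] + dy) for dx in range(1, 6) for dy in range(-2, 3)]
sensors_at = lambda p: [(p[0], p[1], 0.0, 90.0, 4.0)]  # x, y, heading deg, FOV deg, range m
print(predict_blind_areas(route, region_for, sensors_at))  # both points flagged: ~56% coverage
```

A real system would evaluate the region in 3D against map data and occlusions (as claim 10 suggests), but the thresholded coverage percentage is the same shape of decision.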
Please summarize the input | System and/or method for platooning. The system can include a dispatcher and a plurality of cars. However, the system 100 can additionally or alternatively include any other suitable set of components. The system 100 functions to enable platooning of the plurality of cars (e.g., by way of the method S100). We claim:
| 1. A method for coordinated braking within a rail platoon, comprising:
determining, at an autonomous rail vehicle within the rail platoon, a compressive force at a leading end of the autonomous rail vehicle in a direction of traversal;
determining a coordinated braking event; and
in response to determining the coordinated braking event, braking at the autonomous rail vehicle while maintaining compression at the leading end, comprising autonomously controlling an independent set of brakes of the autonomous vehicle based on the compressive force at the leading end.
| 2. The method of claim 1, wherein braking at the autonomous rail vehicle comprises: based on the compressive force, independently controlling a regenerative braking of a battery-electric powertrain of the autonomous vehicle.
| 3. The method of claim 1, wherein the leading end comprises an abutment surface of a bumper which is suspended relative to a chassis of the autonomous rail vehicle.
| 4. The method of claim 1, wherein determining the coordinated braking event comprises receiving a braking command via a vehicle-to-vehicle (V2V) communication from a rail vehicle within the rail platoon.
| 5. The method of claim 4, further comprising: in response to receiving the V2V communication, relaying the braking command to a second autonomous rail vehicle within the platoon.
| 6. The method of claim 1, wherein the coordinated braking event is determined based on the compressive force.
| 7. The method of claim 1, further comprising: separating from the platoon based on a track geometry or a location of a level crossing.
| 8. The method of claim 1, wherein the compressive force is determined with a load cell.
| 9. A method, comprising:
determining, at a rail vehicle within a rail platoon, a contact force at a leading end of the rail vehicle in a direction of traversal; and
autonomously controlling the rail vehicle within the rail platoon, comprising:
determining a platoon control target; and
based on the contact force, controlling an independent powertrain of the rail vehicle to achieve the platoon control target.
| 10. The method of claim 9, wherein the independent powertrain comprises a battery-electric powertrain.
| 11. The method of claim 10, wherein autonomously controlling the rail vehicle comprises regeneratively braking with the battery-electric powertrain.
| 12. The method of claim 9, wherein the leading end comprises an abutment surface of a bumper which is damped relative to a chassis of the autonomous rail vehicle.
| 13. The method of claim 9, wherein the platoon control target comprises a target contact force.
| 14. The method of claim 13, wherein the target contact force is based on a relative energy distribution of the rail platoon.
| 15. The method of claim 9, wherein the platoon control target comprises a speed setpoint.
| 16. The method of claim 9, further comprising:
receiving, at the autonomous rail vehicle, a set of dispatch instructions associated with a coordinated separation of the rail platoon; and
based on the set of dispatch instructions, controlling the independent powertrain of the rail vehicle to separate from a leading portion of the rail platoon at the leading end.
| 17. The method of claim 16, wherein the coordinated separation is based on a track geometry or a location of a level crossing.
| 18. The method of claim 16, wherein the set of dispatch instructions comprises a warrant for the autonomous rail vehicle, wherein the control target is determined based on the warrant.
| 19. The method of claim 9, wherein autonomously controlling the rail vehicle within the rail platoon comprises: at the leading end of the rail vehicle, pushing an adjacent rail vehicle.
| 20. The method of claim 19, wherein no components span between the rail vehicle and the adjacent rail vehicle. | The method involves determining a compressive force at a leading end of an autonomous rail vehicle within the rail platoon in a traversal direction. A coordinated braking event is determined. The autonomous rail vehicle braking process is performed while maintaining compression at the leading end in response to determining the coordinated braking event. An independent set of brakes of the autonomous vehicle is autonomously controlled based on the compressive force at the leading end. An abutment surface of a bumper is suspended relative to a chassis of the autonomous rail vehicle. Method for realizing coordinated braking of self-propelling rail cars within a rail platoon. The method enables reducing the risk of injury to the driver of the vehicle, preventing the vehicle from colliding with the other vehicles, and improving the operating reliability by eliminating components subject to failure and removing possibilities for human error. The drawing shows a schematic view of a structure for realizing coordinated braking of self-propelling rail cars within a rail platoon. 100 Self-propelling rail cars coordinated braking system, 110 Dispatcher, 120 Cars
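Claims 1 and 2 above brake each car based on a compressive force measured at its leading end, so the platoon decelerates while abutment is preserved. The sketch below is a minimal, hypothetical illustration of that idea as a proportional force-tracking controller; the target force, gain, deceleration limits, and the mapping to a (regenerative) braking command are assumptions, not values from the patent.

```python
# Minimal sketch of force-referenced braking during a coordinated stop: each car
# adjusts its own braking so the measured bumper compression tracks a target.
# Gains and limits are illustrative assumptions.

class CompressionBrakingController:
    def __init__(self, target_force_n=2000.0, kp=0.0004, base_decel=1.0, max_decel=3.0):
        self.target = target_force_n   # desired compressive force at the leading end [N]
        self.kp = kp                   # proportional gain [(m/s^2) / N]
        self.base = base_decel         # platoon-wide braking command [m/s^2]
        self.max = max_decel

    def command(self, measured_force_n):
        # Too little compression -> brake a bit harder than the car ahead;
        # too much compression -> ease off so we do not shove the leader.
        error = self.target - measured_force_n
        decel = self.base + self.kp * error
        return min(max(decel, 0.0), self.max)

ctrl = CompressionBrakingController()
for force in (500.0, 2000.0, 4000.0):          # example load-cell readings at the bumper
    print(force, "->", round(ctrl.command(force), 3), "m/s^2")
```

Feeding the load-cell reading back into the brake command is what lets a trailing car brake slightly harder when compression drops and ease off when it spikes, instead of relying on a mechanical coupler.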
Please summarize the input | SYSTEM AND/OR METHOD FOR PLATOONING. The system can include a dispatcher and a plurality of cars. However, the system 100 can additionally or alternatively include any other suitable set of components. The system 100 functions to enable platooning of the plurality of cars (e.g., by way of the method S100). We claim:
| 1. A method comprising:
positioning a first railway vehicle along a track;
providing a first set of instructions to a second railway vehicle; and
based on the first set of instructions, controlling traversal of the second railway vehicle until the second railway vehicle abuts the first railway vehicle;
providing a second set of instructions to the first railway vehicle;
based on the second set of instructions, controlling traversal of the first railway vehicle in a direction of transit; and
while controlling traversal of the first railway vehicle based on the second set of instruction, maintaining abutment between the second railway vehicle and the first railway vehicle by controlling the second railway vehicle to push the first railway vehicle in the direction of transit.
| 2. The method of claim 1, wherein the second railway vehicle is dynamically controlled to push the first railway vehicle in the direction of transit.
| 3. The method of claim 2, wherein the second railway vehicle is dynamically controlled based on a motion of the first railway vehicle.
| 4. The method of claim 2, wherein the second railway vehicle is controlled with a feedback controller based on a push force of the second railway vehicle applied on the first railway vehicle in the direction of transit.
| 5. The method of claim 1, wherein the second set of instructions is received from a remote dispatch system.
| 6. The method of claim 5, wherein controlling the second railway vehicle to push the first railway vehicle in the direction of transit comprises receiving, at the second railway vehicle, a third set of instructions from the remote dispatch system.
| 7. The method of claim 1, wherein the second set of instructions corresponds to both the first and second rail vehicles.
| 8. The method of claim 1, wherein positioning a first railway vehicle along a track comprises: controlling a powertrain of the first railway vehicle with an autonomous controller of the first railway vehicle based on a location of the railway vehicle.
| 9. The method of claim 1, wherein the second railway vehicle initially contacts the first railway vehicle while the first railway vehicle is substantially stationary.
| 10. The method of claim 1, wherein the first railway vehicle is an autonomous vehicle.
| 11. The method of claim 1, wherein the second railway vehicle comprises an autonomous electric bogie.
| 12. A method comprising:
forming a platoon of rail vehicles comprising: independently controlling each rail vehicle of the platoon to arrange the rail vehicles in series along a track, with abutment between each pair of adjacent rail vehicles of the platoon; and
controlling traversal of the platoon in a first direction, comprising: for each pair of adjacent rail vehicles, controlling a trailing rail vehicle of the pair to push against a leading vehicle of the pair in the first direction.
| 13. The method of claim 12, wherein the abutment between each pair of adjacent rail vehicles of the platoon is continuously maintained during traversal of the platoon.
| 14. The method of claim 12, wherein forming the platoon comprises simultaneously maneuvering a plurality of the rail vehicles within a rail yard.
| 15. The method of claim 12, wherein controlling traversal of the platoon comprises, at a forwardmost rail vehicle of the platoon relative to the first direction: controlling traversal of the forwardmost rail vehicle according to a set of commands, wherein the set of commands are propagated rearwardly through the rail vehicles in series based on the motion of the forwardmost rail vehicle.
| 16. The method of claim 15, wherein the forwardmost rail vehicle is controlled via a velocity controller or torque controller.
| 17. The method of claim 12, further comprising: while controlling traversal of the platoon in the first direction, executing a coordinated deceleration of the platoon.
| 18. The method of claim 17, wherein the coordinated deceleration is based on a plurality of wireless vehicle-to-vehicle (V2V) communications.
| 19. The method of claim 12, wherein each rail vehicle of the platoon comprises a pair of electric bogies.
| 20. The method of claim 19, wherein each electric bogie is autonomous and configured to be independently maneuverable. | The method involves positioning a first railway vehicle (120) along a track, and providing a first set of instructions to a second railway vehicle. Traversal of the second railway vehicle is controlled based on the first set of instructions until the second railway vehicle abuts the first railway vehicle. A second set of instructions is provided to the first railway vehicle. Traversal of the first railway vehicle is controlled in a direction of transit based on the second set of instructions. Abutment is maintained between the second railway vehicle and the first railway vehicle, while controlling traversal of the first railway vehicle based on the second set of instructions, by controlling the second railway vehicle to push the first railway vehicle in the direction of transit. The first railway vehicle is an autonomous vehicle. The second railway vehicle comprises an autonomous electric bogie. Method for platooning a vehicle, e.g. an autonomous vehicle such as a self-propelling railcar, and an autonomous electric bogie (all claimed), in the payload transportation field. The method enables reducing the risk of injury to the driver of the vehicle and preventing the vehicle from colliding with other vehicles. The method enables maintaining an energy source of the lead vehicle to facilitate continuous autonomous protection at the lead vehicle and maintain a continuous energy supply without power contributions from a powertrain of the lead vehicle. The drawing shows a schematic block diagram of a system for platooning a vehicle. 100 System for platooning vehicle, 110 Dispatcher, 120 Railway vehicle, V2I Vehicle-to-infrastructure, V2V Vehicle-to-vehicle communications
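Claim 4 above controls the trailing vehicle "with a feedback controller based on a push force" applied to the vehicle ahead. The sketch below illustrates one plausible form of such a loop, a PI controller on the measured push force that outputs a tractive-force command; the setpoint, gains, force limit, and the toy plant response are assumptions made for illustration, not the patented controller.

```python
# Hedged sketch of push-force feedback: a trailing bogie modulates tractive effort so
# the abutment force on the vehicle ahead stays near a setpoint while the platoon moves.

class PushForceController:
    def __init__(self, target_force_n=1500.0, kp=0.5, ki=0.05, max_force_n=20000.0):
        self.target = target_force_n
        self.kp, self.ki = kp, ki
        self.max = max_force_n
        self.integral = 0.0

    def tractive_force(self, measured_push_n, dt=0.1):
        error = self.target - measured_push_n
        self.integral += error * dt
        cmd = self.kp * error + self.ki * self.integral
        return min(max(cmd, 0.0), self.max)  # traction only; braking handled elsewhere

ctrl = PushForceController()
push = 0.0
for _ in range(5):                       # crude closed loop: more traction -> more push force
    force_cmd = ctrl.tractive_force(push)
    push += 0.5 * force_cmd * 0.1        # toy plant response, purely illustrative
    print(round(force_cmd, 1), round(push, 1))
```

Clamping the command to traction only reflects the split in the claims between propulsion (push-force tracking) and coordinated braking, which is handled by a separate mechanism.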
Please summarize the input | Driving auxiliary method, road photographic image collecting method and road side deviceWithout significantly increasing the load of ITS communication, the load of control for detecting an event of interest to perform a danger avoidance action can be reduced, thereby appropriately assisting the driving control of an automatic driving vehicle. a vehicle-mounted terminal (4) loaded on a vehicle (1) that has ended the passage in the object interval of the road as an information providing source, when an event of interest is detected during the passage in the object interval, sending a photographic image and additional information related to the event of interest to a roadside machine (6) on the end point side of the end point of the object section, the roadside machine (6) on the end point side sends the photographic image and additional information related to the event of interest to a roadside machine (6) on the start point side of the start point of the object section, The road side machine (6) on the starting point side sends the photographic image and additional information related to the attention event to the vehicle-mounted terminal which is carried as the information providing destination to start to pass in the object interval, The vehicle terminal as the information providing destination performs the processing related to the driving control of the vehicle based on the shooting image related to the attention event and the additional information.|1. A driving assistance method, wherein a vehicle-mounted device mounted on a vehicle passing in an object section of a road as an information providing source transmits vehicle-mounted information indicating that the vehicle is mounted with a driving recorder and an image issuing function to a road-side device. the vehicle-mounted device sends a photographic image and additional information related to the interest event to the road side device when the interest event is detected during the passage of the object interval, when the road side device does not receive the vehicle-mounted information within a predetermined time from the vehicle-mounted information received by the vehicle-mounted device, the detection of the attention event is started, when the attention event is detected, the photographic image and additional information related to the attention event are collected, the collected captured image and the additional information are transmitted directly or via other road side devices to a vehicle-mounted device mounted on a vehicle to be started to pass in the object interval as an information providing destination, the road side device receives the vehicle-mounted information from the vehicle-mounted device, stopping the detection of the attention event and the transmission of the photographic image and additional information collected by the device, sending the photographic image and additional information related to the attention event received from the vehicle-mounted device directly or via other road side device to the vehicle-mounted device as the information providing destination of the vehicle to be started to pass in the object interval, The vehicle-mounted device as the information providing destination performs processing related to the travel control of the vehicle based on a photographic image related to the event of interest and additional information.
| 2. The travel assistance method according to claim 1, wherein the additional information comprises the interval information of the interval where the vehicle is currently located, the information related to the travel direction of the vehicle, the information related to the content of the attention event, and the position information of the vehicle on the map when the attention event is detected. Position information of a place where an event of interest occurs on a photographic image and time information when the event of interest is detected.
| 3. The driving assistance method according to claim 1, wherein the vehicle-mounted device as the information providing destination makes a determination that the vehicle is approaching the place of occurrence of the event of interest, based on the similarity between the shot image related to the attention event and the shot image at the current time point.
| 4. The travel assistance method according to claim 3, wherein the vehicle-mounted device as the information providing destination outputs a risk avoidance operation instruction to the travel control device when it is determined that the vehicle is approaching the occurrence point of the event of interest.
| 5. The travel assistance method according to claim 1, wherein the on-vehicle device as the information supply source transmits the photographic image and additional information related to the event of interest to a road side device located at the end point side of the end point of the object section. the road side device at the end point side sends the photographic image and additional information related to the attention event to the road side device at the starting point side of the starting point of the object interval, The road side device at the starting point side sends the photographic image and additional information related to the attention event to the vehicle-mounted device as the information providing destination.
| 6. The travel assistance method according to claim 1, wherein the vehicle-mounted device as an information supply source is mounted on a vehicle passing in the first direction in the object section. The vehicle-mounted device as an information providing destination is mounted on a vehicle passing in a second direction opposite to the first direction in the object section, the road side device extracts the photographic image of the moving object at the front side of the shielding object from the photographic image obtained from the vehicle-mounted device as the information providing source passing to the first direction, calculating the position of the moving object on the photographic image in the second direction obtained by the vehicle-mounted device of the vehicle passing from the past to the second direction, generating a composite image obtained by superposing the photographic image of the moving object on the photographic image in the second direction based on the calculated position, The roadside device transmits the composite image as a photographic image related to the event of interest to the on-vehicle device as an information providing destination.
| 7. The driving assistance method according to claim 1, wherein the attention event is a traffic accident, the road side device accumulates the captured image and additional information related to the attention event obtained in the vehicle-mounted device or the device as the information providing source in the device. In addition to the latest photographic image and additional information related to the event of interest, the roadside device also transmits the photographic image and additional information related to the event of interest accumulated in the local device to the on-vehicle device as an information providing destination.
| 8. A road photographic image collecting method, wherein the vehicle-mounted device loaded on the vehicle passing in the object interval of the road sends the vehicle-mounted information indicating that the vehicle is loaded with the driving recorder and the image issuing function to the road side device, the vehicle-mounted device sends the shot image and additional information related to the interest event to the road side device when the interest event is detected during the passage of the object interval, when the road side device does not receive the vehicle-mounted information within a predetermined time from the vehicle-mounted information received by the vehicle-mounted device, the detection of the attention event is started, when the attention event is detected, the photographic image and additional information related to the attention event are collected, the collected photographic image and the additional information are transmitted to the server device, the road side device stops the detection of the attention event and the transmission of the photographic image and the additional information collected by the device under the condition that the vehicle carrying information is received from the vehicle-mounted device, sending the photographic image and additional information related to the event of interest received from the vehicle-mounted device to the server device, the server device accumulates the photographic image and additional information related to the event of interest, the server device according to the browsing request of the designated place from the user, A photographic image and additional information relating to the event of interest of the designated location is presented to the user.
| 9. A road side device, comprising: a road-to-vehicle communication part for communicating with the vehicle-mounted device; and a processor, wherein the processor receives vehicle-mounted information indicating that the vehicle is mounted with a driving recorder and an image publishing function from a vehicle-mounted device which is used as an information providing source and is mounted on a vehicle which has passed in an object interval of the road, when the processor does not receive the vehicle loading information within a predetermined time from the last receiving of the vehicle loading information, the detection of the attention event is started, when the attention event is detected, the photographic image and additional information related to the attention event are collected, sending the collected photographic image and the additional information directly or via other road side device to the vehicle-mounted device loaded on the vehicle to be started to pass in the object interval as the information providing destination, the processor receives the vehicle loading information, stopping the detection of the attention event and the transmission of the photographic image and additional information collected by the device, receiving a photographic image and additional information related to an event of interest detected by the vehicle-mounted device as the information providing source during the passage of the object interval from the vehicle-mounted device as the information providing source through the road-to-vehicle communication part, The processor transmits the received photographic image and additional information related to the attention event to the vehicle-mounted device as the information providing destination of the vehicle to be started to pass in the object interval directly or via other road-side devices via the road-room communication part.
| 10. The road side device according to claim 9, wherein when the device is located at the end point of the object area, the road side communication part receives the shot image and additional information related to the attention event from the vehicle-mounted device as the information providing source. when the device is located at the starting point of the object interval, the road-to-vehicle communication part sends the shot image and additional information related to the attention event to the vehicle-mounted device as the information providing destination.
| 11. The roadside device according to claim 9, wherein the road-to-vehicle communication unit receives a photographic image and additional information related to the event of interest from the on-vehicle device as an information providing source mounted on a vehicle passing in the object section in the first direction. the road-to-vehicle communication part sends the shot image and additional information related to the attention event to the vehicle-to-vehicle device as the information providing destination of the vehicle passing in the second direction opposite to the first direction in the object interval, the processor extracts the photographic image of the moving object at the front side of the current shielding object from the photographic image obtained from the vehicle-mounted device as the information providing source passing to the first direction, calculating the position of the moving object on the photographic image in the second direction obtained by the vehicle-mounted device of the vehicle passing from the past to the second direction, generating a composite image obtained by superposing the photographic image of the moving object on the photographic image in the second direction based on the calculated position, The processor transmits the composite image as a photographic image related to the event of interest to the on-vehicle device as an information providing destination.
| 12. The roadside device according to claim 9, wherein the attention event is a traffic accident, and the processor accumulates the captured image and additional information related to the attention event obtained from the on-vehicle device as the information providing source in the storage unit of the device. In addition to the latest photographic image and additional information related to the event of interest, the processor also transmits the photographic image and additional information related to the event of interest accumulated in the storage unit to the on-vehicle device as the information providing destination.
| 13. A road side device, comprising: a road-to-vehicle communication part for communicating with the vehicle-mounted device; and a processor, wherein the processor receives vehicle-mounted information indicating that the vehicle is mounted with a driving recorder and an image publishing function from a vehicle-mounted device which is used as an information providing source and is mounted on a vehicle which has passed in an object interval of the road, when the processor does not receive the vehicle loading information within a predetermined time from the last receiving of the vehicle loading information, the detection of the attention event is started, when the attention event is detected, the photographic image and additional information related to the attention event are collected, sending the collected photographic image and the additional information to the server device, the processor stops the detection of the attention event and the sending of the photographic image and the additional information collected by the device, receiving a photographic image and additional information related to an event of interest detected by the vehicle-mounted device as the information providing source during the passage of the object interval from the vehicle-mounted device as the information providing source through the road-to-vehicle communication part, and the processor transmits the received photographic image and additional information relating to the event of interest to a server device. | The method involves a vehicle-mounted terminal transmitting a photographed image and additional information regarding a noteworthy event to a roadside device when the noteworthy event is detected while passing through the target section. The captured image and additional information related to the attention event are transmitted, directly or through another roadside device, to the in-vehicle device serving as the information providing destination, mounted on a vehicle (1) that starts passage of the target section. The process related to travel control of the own vehicle is performed based on the photographed image and additional information related to the attention event. INDEPENDENT CLAIMS are included for the following: a road photographic image collecting method; and a roadside apparatus. Driving assistance method for assisting the traveling control of a vehicle. The safe driving of an automatic drive vehicle can be assisted appropriately. The load of the control for performing danger avoidance actions can be reduced. The drawing shows a schematic view of the driving assistance system. (Drawing includes non-English language text.) 1 Vehicle, 2 Camera, 4 Vehicle-mounted terminal, 6 Traveling control apparatus
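Claim 3 above has the receiving vehicle decide that it is approaching the reported location by comparing the provided event image with its current camera frame. The patent does not specify a similarity measure, so the sketch below uses a grayscale-histogram correlation purely as an illustrative stand-in; the bin count, the 0.8 threshold, and the synthetic gradient images are assumptions.

```python
import numpy as np

# Sketch of the claim-3 idea: compare the provided event image with the current frame
# to decide the vehicle is nearing the reported spot. The histogram-correlation
# measure and threshold below are assumptions, not the patent's method.

def gray_histogram(img, bins=32):
    h, _ = np.histogram(img.ravel(), bins=bins, range=(0, 255), density=True)
    return h

def similarity(img_a, img_b):
    ha, hb = gray_histogram(img_a), gray_histogram(img_b)
    ha -= ha.mean(); hb -= hb.mean()
    denom = np.linalg.norm(ha) * np.linalg.norm(hb)
    return float(ha @ hb / denom) if denom else 0.0

def approaching_event_site(event_img, current_img, threshold=0.8):
    return similarity(event_img, current_img) >= threshold

# deterministic toy images: a gradient scene and a slightly brighter view of it
event_img = np.tile(np.arange(160, dtype=np.uint8), (120, 1))
current_img = np.clip(event_img.astype(int) + 3, 0, 255).astype(np.uint8)
print(round(similarity(event_img, current_img), 3),
      approaching_event_site(event_img, current_img))  # similarity near 1.0, so True
```

Any comparable measure (feature matching, embedding distance) could play the same role; the point is that crossing the similarity threshold is what triggers the danger-avoidance instruction of claim 4.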
Please summarize the input | METHOD FOR DETERMINING TRAFFIC VIOLATION VEHICLE WHICH EXISTS ON DRIVING ROUTE OF AUTONOMOUS VEHICLE AND THE DEVICE USING THE SAMEAccording to the present invention, there is provided a method for determining whether a vehicle violating traffic laws is present on a driving path of an autonomous vehicle in motion, comprising (a) at least one of (i) a camera, RADAR, and LIDAR in the autonomous vehicle A state in which a first information acquisition module, (ii) a second information acquisition module including a V2X (Vehicle to Everything) communication module, and (iii) a third information acquisition module including a GPS (Global Positioning System) module is mounted In the above, the offending vehicle determination device corresponding to the autonomous vehicle (i) analyzes the first data acquired by the first information acquisition module or causes the first information acquisition module to analyze, A process of acquiring detection information about at least one other vehicle driving in the vicinity, (ii) analyzing the second data acquired by the second information acquisition module or causing the second information acquisition module to analyze, a process of acquiring signal information of at least one traffic light existing in the vicinity of the autonomous vehicle, and (iii) analyzing the third data acquired by the third information acquisition module or causing the third information acquisition module to analyze performing a process of acquiring first location information of the autonomous vehicle; (b) the violating vehicle judging device determines whether the traffic law violating vehicle exists on the driving path of the autonomous vehicle by referring to at least a part of the detection information, the signal information, and the first location information to do; and (c) when there is a vehicle violating traffic laws, the violating vehicle judging device, among the detection information, the signal information, and the first location information, specific detection information related to the traffic law violation vehicle, specific signal information and allowing the traffic law violation evidence collection module to store and manage traffic law violation evidence information including at least specific first location information; There is provided a method comprising a.|1. 
A method for determining whether a vehicle violating traffic laws is present on a driving route of an autonomous driving vehicle in driving, comprising: (a) a first information acquisition module including (i) a camera module, a RADAR module, and a LIDAR module in the autonomous driving vehicle , (ii) a second information acquisition module including a V2X (Vehicle to Everything) communication module, and (iii) a third information acquisition module including a GPS (Global Positioning System) module are mounted, the autonomous driving The offending vehicle judging device corresponding to the vehicle (i) analyzes the first data acquired by the first information acquisition module or causes the first information acquisition module to analyze at least while driving in the vicinity of the autonomous vehicle A process of acquiring detection information for one other vehicle, (ii) analyzing the second data acquired by the second information acquisition module or causing the second information acquisition module to analyze, a process of obtaining signal information of at least one existing traffic light; and (iii) analyzing the third data acquired by the third information acquisition module or performing a process of acquiring the first location information of the autonomous vehicle by causing the third information acquisition module to analyze;
(b) the violating vehicle judging device determines whether the traffic law violating vehicle exists on the driving path of the autonomous vehicle by referring to at least a part of the detection information, the signal information, and the first location information to do; and (c) when there is a vehicle violating traffic laws, the violating vehicle judging device, among the detection information, the signal information, and the first location information, specific detection information related to the traffic law violation vehicle, specific signal information and allowing the traffic law violation evidence collection module to store and manage traffic law violation evidence information including at least specific first location information;
and, when image data photographing the surrounding conditions of the autonomous vehicle is obtained from the camera module, the offending vehicle determination device causes the deep learning module interlocked with the offending vehicle determination device with respect to the image data to perform a predetermined to detect the other vehicle using an image object detection algorithm of If present, support to use a Single Shot Multibox Detector (SSD) as the image object detection algorithm, and (ii) the specific unit area area in which the other vehicle exists over the first threshold in the entire area of the image data It is characterized in that it supports YOLO (You Only Look Once) to be used as the image object detection algorithm when it does not exist, in the step (a), The offending vehicle determination device fuses each of the detection information of the other vehicle obtained from each of the first information acquisition module using a Kalman filter-based sensor fusion algorithm, (i) an environment factor applying the sensor fusion algorithm In a situation where it is determined that the weight of the time factor is high, the EKF (Extended Kalman Filter) algorithm is supported to be used, and (ii) it is determined that the weight of the accuracy factor is high among the environmental factors to which the sensor fusion algorithm is applied. is characterized in that it supports the UKF (Unscented Kalman Filter) algorithm to be used, and in step (b), the offending vehicle determination device includes (i) the detection information, the signal information, and the first location information a process of generating a first LDM (Local Dynamic Map) in which dynamic driving information of each of the autonomous vehicle and the other vehicle is linked with predetermined map data with reference to at least a part of the information; (ii) a process of obtaining each of the second LDMs generated by each of the other vehicles using the V2X communication module; and (iii) combining each of the first LDM and the second LDM to generate one expanded map; characterized in that to perform, characterized in that the violating vehicle determination device, with additional reference to the expansion map, to determine whether the traffic law violation vehicle exists, in the step (a), the violating vehicle determination The device allows the V2X communication module to transmit each SPAT (Signal Phase and Timing) message from each RSU (Rode Side Unit) interlocked with each of the at least one traffic light located in the vicinity of the autonomous vehicle as 2-1 data. It is characterized in that the signal information is obtained by receiving and analyzing or directly analyzing, and the offending vehicle determination device causes the V2X communication module to obtain a BSM (Basic Safety Message) of each of the other vehicles from each of the other vehicles. 
It is characterized in that the driving information of each of the other vehicles is additionally acquired by receiving and analyzing as the 2-2 data, or directly analyzing it, and in the step (b), the offending vehicle determining apparatus By additionally referring to each of the driving information of each of the other vehicles, it is determined whether there is a specific collision-anticipated other vehicle expected to collide on the driving path of the autonomous vehicle among the other vehicles, and the specific collision-anticipated other vehicle is determined to be the vehicle violating traffic laws when there is, and in step (c), the traffic law violation evidence information corresponds to the traffic law violating vehicle among the driving information of each of the other vehicles. When it is possible to receive each of the BSMs of each of the other vehicles using the V2X communication module, the offending vehicle determination device may further include specific driving information that is By predicting the trajectory of each of the other vehicles using a constant turn rate and acceleration (CTRA) model assuming that the (yaw rate) value is constant, It is characterized in that it is determined whether each of the other vehicles has a risk of colliding with the autonomous vehicle, and after step (c), (d) the offending vehicle determination device causes the traffic law violation evidence collection module to The traffic law violation evidence information corresponding to the traffic law violation vehicle is transmitted to a report module, so that the report module sends the report module to a specific enforcement agency corresponding to the traffic law violation type of the traffic law violation vehicle among a plurality of enforcement agencies. Supporting to report at least a part of the traffic law violation evidence information;
It characterized in that it further comprises, the report module, characterized in that the information on each traffic law violation report interface provided from each of the enforcement agencies is stored in advance, the violation vehicle judging device, the report Support the module to automatically input and report specific traffic law violation evidence information that can be entered into the specific traffic law violation report interface among the traffic law violation evidence information through the specific traffic law violation report interface provided by the specific enforcement agency A method characterized in that
| 2. delete
| 3. The method according to claim 1, wherein when image data photographing the surrounding conditions of the autonomous vehicle is obtained from the camera module, the offending vehicle determination device uses a deep learning module interlocked with the offending vehicle determination device for the image data. to detect the other vehicle using a predetermined image object detection algorithm, but before the step (a), the offending vehicle determining device performs learning of the deep learning module using predetermined learning data characterized in that, the learning of the deep learning module causes the weight to be increased for at least one specific first class from which the correct rate of the class prediction value is less than the second threshold is derived, and the class prediction value is the correct rate above the second threshold For at least one specific second class from which this is derived, a loss value is calculated using a focal loss that causes the weight to be reduced, The process of optimizing a plurality of parameters included in the deep learning module is repeated by performing backpropagation so that the loss value is minimized, and the class prediction value is the object included in the image data of the corresponding class. A method characterized in that it is a probability value predicted whether it is an object.
| 4. delete
| 5. delete
| 6. The method of claim 1, wherein, when the offending vehicle determination device combines each of the first LDM and the second LDM, if there is a specific other vehicle having different positional coordinates among the other vehicles, the coordinates are corrected into one absolute position coordinate through a Maximum Likelihood Estimation (MLE) technique.
| 7. delete
| 8. delete
| 9. delete
| 10. delete
| 11. A vehicle judging device for judging whether a vehicle violating traffic laws exists on a driving route of an autonomous vehicle in motion, comprising: at least one memory for storing instructions; and at least one processor configured to execute the instructions. including, wherein the processor includes (I) a first information acquisition module including (i) a camera module, a RADAR module, and a LIDAR module in the autonomous vehicle, (ii) a V2X (Vehicle to Everything) communication module. In a state in which a third information acquisition module including a second information acquisition module and (iii) a global positioning system (GPS) module is mounted, (i) analyzes the first data acquired by the first information acquisition module or a sub-process of obtaining detection information for at least one other vehicle driving in the vicinity of the autonomous vehicle by causing the first information acquisition module to analyze; (ii) the second information acquisition module acquired by the second information acquisition module. 2 A sub-process of analyzing data or causing the second information acquisition module to analyze to acquire signal information of at least one traffic light existing in the vicinity of the autonomous vehicle; and (iii) analyzing the third data acquired by the third information acquisition module or having the third information acquisition module analyze to perform a sub-process of acquiring the first location information of the autonomous vehicle; (II) a process of determining whether the traffic law violation vehicle exists on the driving path of the autonomous vehicle by referring to at least a part of the detection information, the signal information, and the first location information; and (III) when the traffic law violation vehicle exists, specific detection information related to the traffic law violation vehicle among the detection information, the signal information, and the first location information, the specific signal information, and the specific first location information a process for allowing the traffic law violation evidence collection module to store and manage traffic law violation evidence information including at least; characterized in that, when the image data photographing the surrounding situation of the autonomous vehicle is obtained from the camera module, the processor causes the deep learning module interlocked with the offending vehicle determination device with respect to the image data The other vehicle is detected using a predetermined image object detection algorithm, and the processor is configured to: (i) the other vehicle corresponding to the number equal to or greater than the first threshold within a specific unit area area among the entire area of the image data. In this case, a Single Shot Multibox Detector (SSD) can be used as the image object detection algorithm, and (ii) the specific unit area area in which the other vehicle exists above the first threshold exists in the entire area of the image data. 
If not, it is characterized in that it supports YOLO (You Only Look Once) to be used as the image object detection algorithm, and in the (I) process, The processor fuses each of the detection information of the other vehicle obtained from each of the first information acquisition module using a Kalman filter-based sensor fusion algorithm, (i) a time factor among environmental factors to which the sensor fusion algorithm is applied In a situation where the weight of is determined to be high, the EKF (Extended Kalman Filter) algorithm is supported to be used, and (ii) in a situation where the weight of the accuracy factor is determined to be high among the environmental factors to which the sensor fusion algorithm is applied, the UKF ( Unscented Kalman Filter) algorithm is supported to be used, and in the process (II), the processor, (i) referring to at least a part of the detection information, the signal information, and the first location information , a process of generating a first LDM (Local Dynamic Map) in which dynamic driving information of each of the autonomous vehicle and the other vehicle is linked with predetermined map data; (ii) a process of obtaining each of the second LDMs generated by each of the other vehicles using the V2X communication module; and (iii) combining each of the first LDM and the second LDM to generate one expanded map; characterized in that to perform, characterized in that the processor, with further reference to the extension map, characterized in that the determination of whether the traffic violation vehicle exists, in the (I) process, the processor, the V2X communication Let the module receive and analyze each of the Signal Phase and Timing (SPAT) messages as 2-1 data from each of the RSUs (Rode Side Units) interlocked with each of the at least one traffic light located in the vicinity of the autonomous vehicle, or Direct analysis, characterized in that the signal information is obtained, and the processor causes the V2X communication module to receive the BSM (Basic Safety Message) of each of the other vehicles from each of the other vehicles as 2-2 data. It is characterized in that each of the driving information of each of the other vehicles is additionally obtained by analyzing it or directly analyzing it, and in the process (II), the processor, By additionally referring to each of the driving information of each of the other vehicles, it is determined whether there is a specific collision-anticipated other vehicle expected to collide on the driving path of the autonomous vehicle among the other vehicles, and the specific collision-anticipated other vehicle is determined as the traffic law-violating vehicle when there is, and in the process (III), the traffic law violation evidence information corresponds to the traffic law-violating vehicle among the driving information of each of the other vehicles. 
It is characterized in that it further includes specific driving information to be used, and when each of the BSMs of each of the other vehicles can be received using the V2X communication module, the processor, the yaw rate included in the BSM ) by predicting the trajectory of each of the other vehicles using a constant turn rate and acceleration (CTRA) model assuming that the value is constant, It is characterized in that it is determined whether each of the other vehicles has a risk of colliding with the autonomous vehicle, and after the (III) process, (IV) the processor causes the traffic law violation evidence collection module to cause the traffic law The traffic law violation evidence information corresponding to the violating vehicle is transmitted to the report module, and the report module causes the report module to transmit the traffic law violation to a specific enforcement agency corresponding to the traffic law violation type of the traffic law violating vehicle among a plurality of enforcement agencies. a process of supporting to report at least some of the evidence information; characterized in that it further performs, the report module, characterized in that the information on each traffic law violation report interface provided from each of the enforcement agencies is stored in advance, and the processor causes the report module to Through the specific traffic law violation report interface provided by the specific enforcement agency, the specific traffic law violation evidence information that can be inputted into the specific traffic law violation report interface among the traffic law violation evidence information is automatically input to support reporting Violation vehicle judgment system.
| 12. delete
| 12. The method of claim 11, wherein when the image data photographing the surrounding conditions of the autonomous vehicle is obtained from the camera module, the processor causes the deep learning module interlocked with the offending vehicle determination device to set a predetermined value on the image data. Detect the other vehicle using an image object detection algorithm, characterized in that before the (I) process, the processor performs learning of the deep learning module using predetermined learning data, The learning of the learning module is such that the weight is increased for at least one specific first class from which the correct rate of the class prediction value is less than the second threshold value is derived, and the class prediction value is at least one from which the correct rate higher than the second threshold value is derived For a specific second class, the loss value is calculated using the focal loss that causes the weight to be reduced, The process of optimizing a plurality of parameters included in the deep learning module is repeated by performing backpropagation so that the loss value is minimized, and the class prediction value is the object included in the image data of the corresponding class. Violation vehicle determination device, characterized in that the probability value predicted whether the object is.
| 14. delete
| 15. delete
| 16. The violation-vehicle determination device of claim 11, wherein, when the processor combines the first LDM and the second LDMs, if a specific other vehicle among the other vehicles has differing position coordinates between them, the processor corrects them to one absolute position coordinate through a Maximum Likelihood Estimation (MLE) technique.
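Claim 16 resolves conflicting position coordinates for the same vehicle across LDMs by maximum likelihood estimation. Assuming independent Gaussian position noise (an assumption not stated in the claim), the MLE reduces to an inverse-variance weighted mean, sketched below; the field layout and example numbers are purely illustrative.

    import numpy as np

    def mle_fuse_positions(positions, variances):
        """Fuse conflicting position reports for the same vehicle into one estimate.

        positions: list of (x, y) observations of the same vehicle from different LDMs.
        variances: per-observation position variance (isotropic, for simplicity).
        Under independent Gaussian noise the maximum-likelihood estimate is the
        inverse-variance weighted mean of the observations.
        """
        positions = np.asarray(positions, dtype=float)
        weights = 1.0 / np.asarray(variances, dtype=float)
        fused = (weights[:, None] * positions).sum(axis=0) / weights.sum()
        return tuple(fused)

    # e.g. the ego LDM and a neighbour's LDM disagree about one vehicle:
    # mle_fuse_positions([(12.0, 3.1), (12.4, 2.9)], [0.25, 1.0]) -> (12.08, 3.06)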
| 17. delete
| 18. delete
| 19. delete
| 20. delete | The method involves analyzing data acquired by an information acquisition module by an offending vehicle judging device (100) corresponding to a vehicle. Signal information of an existing traffic light is acquired, and location information of the autonomous vehicle is acquired. A determination is made whether a traffic-law-violating vehicle exists on the driving path of the autonomous vehicle by referring to a part of the detection information, the signal information, and the location information. A traffic law violation evidence collection module (150) is utilized to store and manage traffic law violation evidence information including specific first location information. An INDEPENDENT CLAIM is also included for a vehicle judging device for judging whether a vehicle violating traffic laws exists on the driving route of an autonomous vehicle in motion. The method is useful for determining whether a vehicle violating traffic laws exists on the driving path of an autonomous vehicle. Information on a vehicle violating traffic laws on the driving path of an autonomous vehicle is automatically reported to an enforcement agency. The drawing shows a diagram schematically illustrating the overall system in which an offending vehicle determination device operates to determine whether a vehicle violating traffic laws exists on the driving path of an autonomous vehicle (Drawing includes non-English language text).100Offending vehicle judging device150Traffic law violation evidence collection module
Please summarize the input | Method and monitoring server for verifying the operation of autonomous vehicles using the Quality Control verification appIn a method of verifying the operation of an autonomous vehicle using a QC (Quality Control) verification app, a specific verification scenario for a specific road section is divided into a plurality of verification sections, at least one reference PVD (Probe Vehicle Data) for each operation event in which the autonomous vehicle must operate is stored in a database for each verification section, and the driving PVD of the autonomous vehicle is being transmitted to a control server. The method comprises: (a) the control server acquiring a specific driving PVD transmitted from the autonomous vehicle, the specific driving PVD being the driving PVD of the autonomous vehicle corresponding to the performance of a specific operation event included in a specific verification section, which is one of the plurality of verification sections; (b) the control server performing (i) a process of obtaining first verification result information transmitted by a user riding in the autonomous vehicle, the first verification result information being information, input by the user through the QC verification app, on whether the operation performed by the autonomous vehicle for the specific operation event was successful, and (ii) a process of obtaining second verification result information, which is the result of determining whether the specific driving PVD matches the specific reference PVD; and (c) the control server determining whether the first verification result information and the second verification result information match, and obtaining third verification result information, which is the final verification result information for the operation performed by the autonomous vehicle with respect to the specific operation event. A method and a control server for verifying the operation of an autonomous vehicle are disclosed.|1. A method of verifying the operation of an autonomous vehicle using a QC (Quality Control) verification app, wherein a specific verification scenario for a specific road section is divided into a plurality of verification sections, at least one reference PVD (Probe Vehicle Data) for each operation event in which the autonomous vehicle must operate is stored in a database for each verification section, and the driving PVD of the autonomous vehicle is being transmitted to a control server, the method comprising:
(a) acquiring, by the control server, a specific driving PVD transmitted from the autonomous vehicle, the specific driving PVD being the driving PVD of the autonomous vehicle corresponding to the performance of a specific operation event included in a specific verification section, which is one of the plurality of verification sections;
(b) performing, by the control server, (i) a process of obtaining first verification result information transmitted by a user riding in the autonomous vehicle, the first verification result information being information, input by the user through the QC verification app, on whether the operation performed by the autonomous vehicle for the specific operation event was successful, and (ii) a process of obtaining second verification result information, which is the result of determining whether the specific driving PVD matches the specific reference PVD; and
(c) determining, by the control server, whether the first verification result information and the second verification result information match, and obtaining third verification result information, which is the final verification result information for the operation performed by the autonomous vehicle with respect to the specific operation event.
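Claim 1 combines a human judgement entered through the QC app with a PVD-to-reference comparison into a final verification result. A minimal sketch of that three-way check, assuming the PVDs can be compared field by field within a numeric tolerance, might look as follows; the dictionary layout and tolerance are placeholders, not the patent's data format.

    def verify_operation_event(driving_pvd, reference_pvd, user_result, tolerance=0.1):
        """Minimal sketch of the three-way verification described in claim 1.

        driving_pvd / reference_pvd: dicts of field -> numeric value for one
        operation event (field names are placeholders, not the PVD standard).
        user_result: True/False success judgement entered through the QC app.
        Returns (second_result, third_result).
        """
        # Second verification result: does the recorded driving PVD match the
        # stored reference PVD for this operation event (within a tolerance)?
        second_result = all(
            abs(driving_pvd.get(field, float("inf")) - ref_value) <= tolerance
            for field, ref_value in reference_pvd.items()
        )
        # Third (final) verification result: do the human judgement and the
        # PVD-based judgement agree?
        third_result = (user_result == second_result)
        return second_result, third_result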
| 2. The method according to claim 1, wherein each of the reference PVD and the driving PVD is used for verification of (i) a standard field indicating a data item of the autonomous vehicle standardized for V2X communication and (ii) each of the plurality of operation events A method for verifying the operation of an autonomous vehicle, characterized in that it includes a non-standard field indicating a data item of the autonomous vehicle.
| 3. The method of claim 1, wherein the control server additionally acquires traffic signal information transmitted from a plurality of traffic signal controllers installed in the specific road section, the traffic signal information being acquired by at least one of (i) an indirect acquisition method in which the traffic signal information is acquired through V2X communication between the autonomous vehicle and the plurality of traffic signal controllers, and (ii) a direct acquisition method in which the traffic signal information is acquired through direct communication with a traffic signal controller.
| 4. According to claim 3, wherein the control server, with reference to the specific driving PVD to display or support to display on the control display of the control system a screen on which the location of the autonomous vehicle is displayed on a map including the specific road section However, the second connection information allowing the manager of the control system to access (i) the vehicle status information of the autonomous driving vehicle and (ii) the verification status information of the autonomous driving vehicle A method of verifying the operation of an autonomous vehicle, comprising at least one of connection information, and displaying or supporting display on the control display.
| 5. According to claim 4, When the first connection information is selected by the manager, the control server, (i) information related to the state of the internal device of the autonomous vehicle included in the vehicle state information and the autonomous driving at least one of information related to the driving state of the vehicle is displayed or supported on the control display; The method of verifying the operation of an autonomous vehicle, characterized in that the display or support to display the traffic signal information including at least one of specific speed limit information and specific traffic sign information on the control display.
| 6. The method of claim 4, wherein, when the second connection information is selected by the manager, the control server displays, or supports displaying, on the control display verification state information including at least one of the information on the specific verification section, the information on the specific operation event, the first verification result information, the second verification result information, and the third verification result information.
| 7. According to claim 1, wherein the first verification result information is specific information input by the user - The specific information is detailed information about the failure (fail) with respect to the specific operation event of the autonomous vehicle Information corresponding to - A method of verifying the operation of an autonomous vehicle comprising:
| 8. According to claim 1, wherein the control server, (i) When obtaining the third verification result information corresponding to the case where the first verification result information and the second verification result information match, the control server responds to the specific operation event the verification is completed, and (ii) when the third verification result information corresponding to the case where the first verification result information and the second verification result information do not match is obtained, the first verification result information, the A method of verifying the operation of the autonomous vehicle, characterized in that the second verification result information and the third verification result information are reported and stored.
| 9. The method of claim 1, wherein the control server repeats steps (a) to (c) for each of the plurality of verification sections.
| 10. The method of claim 9, wherein the verification of the operation of the autonomous vehicle is repeated for each of the plurality of verification sections to count the cycle order for the specific road section, and the control server sets as a mission for the specific road section With reference to the mission order data, it is determined whether the circulating order satisfies the mission order data, and if the order does not satisfy the mission order data, verification of the operation of the autonomous vehicle is performed on the plurality of A method of verifying the operation of an autonomous vehicle, characterized in that the number of cycles is increased for the specific road section by repeating for each verification section.
| 11. A control server that verifies the operation of an autonomous vehicle using a QC (Quality Control) verification app, in a state in which the driving PVD (Probe Vehicle Data) of the autonomous vehicle is being transmitted to the control server, the control server comprising: a database in which a specific verification scenario for a specific road section is divided into a plurality of verification sections and a reference PVD for each operation event in which the autonomous vehicle must operate is stored for each verification section; at least one memory storing instructions; and at least one processor configured to execute the instructions,
wherein the processor performs: (1) a process of acquiring a specific driving PVD transmitted from the autonomous vehicle, the specific driving PVD being the driving PVD of the autonomous vehicle corresponding to the performance of a specific operation event included in a specific verification section, which is one of the plurality of verification sections; (2) (i) a process of obtaining first verification result information transmitted by a user riding in the autonomous vehicle, the first verification result information being information, input by the user through the QC verification app, on whether the operation performed by the autonomous vehicle for the specific operation event was successful, and (ii) a process of obtaining second verification result information, which is the result of determining whether the specific driving PVD matches the specific reference PVD; and (3) a process of determining whether the first verification result information and the second verification result information match, and obtaining third verification result information, which is the final verification result information for the operation performed by the autonomous vehicle with respect to the specific operation event.
| 12. The method of claim 11 , wherein each of the reference PVD and the driving PVD is used for (i) a standard field indicating a data item of the autonomous vehicle standardized for V2X communication and (ii) verifying each of the plurality of operation events A control server for verifying the operation of the autonomous vehicle, characterized in that it includes a non-standard field indicating the data item of the autonomous vehicle.
| 13. The method of claim 11, wherein the processor additionally acquires traffic signal information transmitted from a plurality of traffic signal controllers installed in the specific road section, (i) the autonomous vehicle performs V2X communication with the plurality of traffic signal controllers. an indirect acquisition method of acquiring the traffic signal information through the A control server for verifying the operation of an autonomous vehicle, characterized in that the traffic signal information is acquired by at least one of direct acquisition methods for acquiring the traffic signal information through direct communication with a traffic signal controller.
| 14. The method according to claim 13, wherein the processor supports displaying or displaying a screen on which the location of the autonomous vehicle is displayed on a map including the specific road section with reference to the specific driving PVD on the control display of the control system. , (i) first connection information allowing the manager of the control system to access vehicle status information of the autonomous driving vehicle, and (ii) second connection allowing access to verification status information of the autonomous driving vehicle A control server for verifying the operation of an autonomous vehicle, comprising at least one of information and displaying or supporting display on the control display.
| 15. The method of claim 14, wherein the processor, when the first connection information is selected by the manager, (i) information related to the state of the internal device of the autonomous vehicle included in the vehicle state information and the autonomous vehicle at least one of information related to the driving state of the vehicle is displayed or supported on the control display; The control server for verifying the operation of the autonomous vehicle, characterized in that the display or support to display the traffic signal information including at least one of speed limit information and specific traffic sign information on the control display.
| 16. The method of claim 14, wherein the processor, when the second connection information is selected by the manager, information on the specific verification section, information on the specific operation event, the first verification result information, and the second verification result The control server for verifying the operation of the autonomous vehicle, characterized in that the display or support to display the verification state information including at least one of information and the third verification result information on the control display.
| 17. The control server of claim 11, wherein the first verification result information includes specific information input by the user, the specific information corresponding to detailed information about a failure of the autonomous vehicle with respect to the specific operation event.
| 18. The method of claim 11 , wherein the processor (i) obtains the third verification result information corresponding to a case in which the first verification result information and the second verification result information are identical to each other, When the verification is processed as completed, and (ii) the third verification result information corresponding to the case where the first verification result information and the second verification result information do not match is obtained, the first verification result information, the second verification result information A control server that verifies the operation of the autonomous vehicle, characterized in that the second verification result information and the third verification result information are reported and stored.
| 19. The control server of claim 11 , wherein the processor repeats steps (1) to (3) for each of the plurality of verification sections.
| 20. The control server of claim 19, wherein the processor repeats the verification of the operation of the autonomous vehicle for each of the plurality of verification sections, counts the circulation order for the specific road section, determines, with reference to mission order data set as a mission for the specific road section, whether the counted circulation order satisfies the mission order data, and, if the order does not satisfy the mission order data, increases the circulation count for the specific road section by repeating the verification of the operation of the autonomous vehicle for each of the plurality of verification sections. | The method involves transmitting the specific driving probe vehicle data (PVD) from an autonomous driving vehicle (600) to a control server (100). First verification result information, which indicates whether the operation performed by the autonomous driving vehicle for the event is successful, is obtained, and second verification result information, which is the result of determining whether the specific driving PVD matches the specific reference PVD, is obtained. A determination is made as to whether the first verification result information and the second verification result information match, and third verification result information, which is the final verification result information for the operation performed by the autonomous vehicle with respect to the specific operation event, is obtained. An INDEPENDENT CLAIM is included for a control server for verifying the operation of an autonomous vehicle using a quality control verification app. Method for verifying the operation of an autonomous vehicle using a quality control verification app. The method enables verifying the operation of the autonomous driving vehicle by using data directly confirmed by a user riding in the autonomous vehicle, so that information related to verifying the operation of an autonomous vehicle can be monitored through a control display of a control system. The drawing shows a schematic view of a control server that verifies the operation of an autonomous vehicle. 100Control server110Memory600Autonomous driving vehicle700Terminal900Database
Please summarize the input | METHOD FOR PREVENTING POSSIBLE MALFUNCTIONS OF DCU OCCURRING DURING AUTONOMOUS DRIVING BY REFERRING TO ADS USING OUTPUTS OF HETEROGENEOUS DCUS AND METHOD USING THE SAMEDisclosed is a method of supporting the analysis of DCUs in order to prevent a misjudgment situation of the DCUs that may occur during autonomous driving, by using an Anomaly Detection System (ADS) for heterogeneous Domain Control Units (DCUs). The method comprises: (a) a computing device operating in conjunction with a target vehicle performing autonomous driving causing a sensor module mounted on the target vehicle to acquire, for each of predetermined first to Nth time points corresponding to the autonomous driving state (N being an integer greater than or equal to 1), first to Nth situation information about the situation around the target vehicle; (b) the computing device causing at least some of first to Mth DCUs operating in conjunction with the computing device (M being an integer greater than or equal to 2) to generate at least part of K_1th to K_Mth determination information with reference to Kth situation information, which is one of the first to Nth situation information (K being an integer greater than or equal to 1 and less than or equal to N); and (c) the computing device causing an ADS operating in conjunction with it to calculate, with reference to at least part of the K_1th to K_Mth determination information, a Kth determination-match degree for at least some of the first to Mth DCUs, and causing an edge logger to tag and store the Kth situation information with reference to the Kth determination-match degree so that at least some of the first to Mth DCUs can be analyzed.|1. A method comprising: (a) causing, by a computing device operating in conjunction with a target vehicle performing autonomous driving, a sensor module mounted on the target vehicle to acquire, for each of predetermined first to Nth time points corresponding to the autonomous driving state (N being an integer greater than or equal to 1), first to Nth situation information about the situation around the target vehicle;
(b) causing, by the computing device, at least some of first to Mth DCUs operating in conjunction with the computing device (M being an integer greater than or equal to 2) to generate at least part of K_1th to K_Mth determination information with reference to Kth situation information, which is one of the first to Nth situation information (K being an integer greater than or equal to 1 and less than or equal to N); and (c) causing, by the computing device, an ADS operating in conjunction with it to calculate, with reference to at least part of the K_1th to K_Mth determination information, a Kth determination-match degree for at least some of the first to Mth DCUs, and causing an edge logger to tag and store the Kth situation information with reference to the Kth determination-match degree, so that at least some of the first to Mth DCUs can be analyzed.
| 2. The method of claim 1, wherein in step (c), the computing device causes the ADS to calculate the Kth determination-match degree for at least some of the first to Mth DCUs by applying a Dynamic Time Warping algorithm to at least part of the K_1th to K_Mth determination information, the algorithm operating on a first specific time series vector including information on at least one of the K_1th to K_Mth determination information and a second specific time series vector including information on another one of the K_1th to K_Mth determination information.
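Claim 2 refers to a Dynamic Time Warping equation whose symbols did not survive in this text. As a stand-in, the sketch below is the textbook DTW distance between two decision time series; it illustrates the general technique only and is not the patent's specific formula.

    def dtw_distance(series_a, series_b):
        """Textbook dynamic-time-warping distance between two 1-D time series.

        series_a and series_b play the role of two DCUs' per-timestep decision
        values; a small DTW distance is read as a high degree of agreement.
        """
        n, m = len(series_a), len(series_b)
        inf = float("inf")
        cost = [[inf] * (m + 1) for _ in range(n + 1)]
        cost[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(series_a[i - 1] - series_b[j - 1])
                cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                     cost[i][j - 1],      # deletion
                                     cost[i - 1][j - 1])  # match
        return cost[n][m]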
| 3. The method of claim 1, wherein (d) the computing device causes the edge logger to analyze a predetermined log when tag information of the K-th context information indicates that at least a part of the matching degree of the K-th determination is less than a threshold value. The Kth situation information and the K_1th determination information to the K_Mth determination information are transmitted to the system, and the log analysis system causes the Kth situation information and the K_1th determination information to the Kth determination information through a predetermined display device. The step of supporting the manager to analyze at least some of the process processes of the first DCU to the M-th DCU at the K-th point in time corresponding to the K-th situation information by transmitting K_M determination information to the manager. How to characterize.
| 4. The method of claim 3, wherein (e) when the computing device obtains analysis information about a problem in a specific DCU among the first DCU to the Mth DCU from the log analysis system, the analysis information is referred to And modifying the algorithm of the specific DCU.
| 5. The method of claim 1, wherein in the step (a), the computing device causes the sensor module including at least some of a camera, a radar, a lidar, a GPS, and a V2X communication module to cause the first situation information to the Nth situation. A method, characterized in that to obtain information.
| 6. The method of claim 1, wherein at least some of the first DCU to the M-th DCU are implemented in the form of a neural network consisting of a plurality of layers each including a plurality of virtual neurons, and the other part is in the form of a rule-based algorithm. Implemented, wherein each of the first DCU to the Mth DCU outputs results according to different logics.
| 7. The method of claim 1, wherein the step (b) comprises: at least one of K_1th determination information to K_Mth determination information generated from at least one preset main DCU among the first DCU to the Mth DCU. And transmitting one main determination information to an actuator of the target vehicle to support the target vehicle to perform the autonomous driving according to the main determination information.
| 8. In a computing device that supports analysis of the DCUs to prevent a misjudgment situation of the DCUs that may occur in autonomous driving by using an Anomaly Detection System (ADS) for heterogeneous Domain Control Units (DCUs), instructions are provided. One or more memories to store; And one or more processors configured to perform the instructions, wherein the processor includes: (I) a predetermined first to Nth time point corresponding to the autonomous driving state-N is an integer greater than or equal to 1-each mounted on the target vehicle A process of causing the sensor module to acquire first situation information to Nth situation information about a situation around the target vehicle; (II) The first DCU to the M-th DCU operating in conjunction with the computing device-M is an integer greater than or equal to 2-allowing at least some of the K-th context information, which is one of the first context information to the N-th context information-K Is an integer equal to or greater than 1 and equal to or less than N; a process of generating at least some of the K_1th determination information to the K_Mth determination information with reference to; And (III) the ADS operating in conjunction therewith, with reference to at least a part of the K_1-th determination information to the K_M-th determination information, and the K-th determination match degree with respect to at least a portion of the first DCU to the M-th DCU. A process for supporting at least some of the first DCU to the M DCU to be analyzed by calculating and storing the K-th context information by tagging and storing the K-th situation information with reference to the K-th determination matching degree by an edge logger. Device characterized in that performing.
| 9. The device of claim 8, wherein in the (III) process, the processor causes the ADS to calculate the Kth determination-match degree for at least some of the first to Mth DCUs by applying a Dynamic Time Warping algorithm to at least part of the K_1th to K_Mth determination information, the algorithm operating on a first specific time series vector including information on at least one of the K_1th to K_Mth determination information and a second specific time series vector including information on another one of the K_1th to K_Mth determination information.
| 10. The system of claim 8, wherein (IV) the processor, when the edge logger causes the tag information of the K-th context information to indicate that at least a part of the matching degree of the K-th determination is less than or equal to a threshold value, The K-th situation information and the K_1-th determination information to the K_M-th determination information are transmitted to the K-th situation information and the K-th determination information to the K_Mth determination information through a predetermined display device to the log analysis system. By transmitting the determination information to the manager, further performing a process of supporting the manager to analyze at least some of the process processes of the first DCU to the M-th DCU at the K-th time point corresponding to the K-th situation information. Device.
| 11. The method of claim 10, wherein, when the processor (V) obtains analysis information about a problem in a specific DCU among the first DCU to the Mth DCU from the log analysis system, the analysis information is referred to An apparatus, characterized in that further performing a process of modifying an algorithm of a specific DCU.
| 12. The method of claim 8, wherein in the (I) process, the processor causes the sensor module including at least some of a camera, a radar, a lidar, a GPS, and a V2X communication module to cause the first situation information to the Nth situation information. Device, characterized in that to obtain.
| 13. The method of claim 8, wherein at least some of the first DCU to the M-th DCU are implemented in the form of a neural network consisting of a plurality of layers each including a plurality of virtual neurons, and the other part is in the form of a rule-based algorithm. Embodied, wherein each of the first DCU to the Mth DCU outputs results according to different logics.
| 14. The device of claim 8, wherein the (II) process comprises transmitting at least one piece of main determination information, among the K_1th to K_Mth determination information generated from at least one preset main DCU among the first to Mth DCUs, to an actuator of the target vehicle to support the target vehicle in performing the autonomous driving according to the main determination information. | The method involves allowing a sensor module (210) mounted on the target vehicle to obtain first to N-th situation information about the situation around the target vehicle, where N is an integer. At least a portion of the first domain control unit (DCU) to the M-th DCU (220-1-220-M) operating in conjunction with the computing device is caused to generate a portion of the K-1 to K-M determination information, where K is an integer of 1 or more and M is an integer of 2 or more. The anomaly detection system (ADS) (130) is allowed to calculate a K-th decision-match degree for a portion of the first DCU to the M-th DCU with reference to a portion of the K-1 to K-M determination information. The edge logger (140) is supported to tag the K-th situation information with reference to the K-th determination-match degree so that a portion of the first DCU to the M-th DCU can be analyzed. An INDEPENDENT CLAIM is included for a device for preventing a misjudgment situation of a DCU occurring during autonomous driving of a vehicle. Method for preventing a misjudgment situation of a DCU occurring during autonomous driving of a vehicle on a road. The misjudgment situation of a DCU occurring during autonomous driving is prevented effectively by using an ADS for heterogeneous DCUs. The problem of a specific DCU is checked in order to correct the algorithm of that DCU. The drawing shows a schematic diagram of the computing device. (Drawing includes non-English language text) 130ADS140Edge logger210Sensor module220-1-220-MDCU230Actuator
Please summarize the input | SYSTEMS AND METHODOLOGY FOR VOICE AND/OR GESTURE COMMUNICATION WITH DEVICE HAVING V2X CAPABILITYA system includes a first communication module for receiving a user message, a processing unit for converting the user message to a vehicle-to-everything (V2X) message, and a second communication module. The first communication module, the processing unit, and the second communication modules are implemented in a first vehicle. The second communication module is configured to transmit the V2X message from the first vehicle via a wireless communication link. The first vehicle may be a drone configured to communicate with a user device positioned on or near a user, and the user message may be an audible message or user gestures. Alternatively, the first vehicle may be inhabited by the user, with the user message being an audible message. The system may enable communication with an autonomous vehicle or another device equipped with V2X capability.|1. A system comprising:
* a first communication module configured to receive a user message;
* a processing unit configured to convert the user message to a vehicle-to-everything (V2X) message; and
* a second communication module, wherein the first communication module, the processing unit, and the second communication module are implemented in a first vehicle, and the second communication module is configured to transmit the V2X message from the first vehicle via a wireless communication link.
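Claim 1's core flow is: receive a user message, convert it to a V2X message, transmit it. The sketch below shows one hypothetical shape for the conversion step in Python; the V2XMessage container and keyword mapping are invented for illustration and do not follow any actual SAE/ETSI message standard.

    from dataclasses import dataclass
    import time

    @dataclass
    class V2XMessage:
        """Illustrative container only; not an actual standardized message format."""
        msg_type: str
        payload: dict
        timestamp: float

    def user_message_to_v2x(user_text: str) -> V2XMessage:
        """Convert a recognised user utterance into a V2X-style message.

        The keyword mapping below is a made-up example of the 'processing unit'
        step in claim 1; a real system would use proper intent parsing and a
        standardised message set.
        """
        text = user_text.lower()
        if "stop" in text:
            payload = {"command": "stop"}
        elif "follow" in text:
            payload = {"command": "follow_user"}
        else:
            payload = {"command": "unknown", "raw": user_text}
        return V2XMessage("user_command", payload, time.time())

    # msg = user_message_to_v2x("please stop here")
    # the resulting message is handed to the second communication module for transmission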
| 2. The system of claim 1 wherein the wireless communication link is a first wireless communication link, and the system further comprises an electronic device configured to be positioned proximate a user, the electronic device including a third communication module, wherein the first and third communication modules are configured to enable a second wireless communication link between the first vehicle and the electronic device for communication of the user message from the user to the first vehicle.
| 3. The system of claim 2 wherein the first vehicle is an unmanned vehicle, and:
* the electronic device comprises:
* a first wearable structure configured to be positioned on the user, the first wearable structure including the third communication module, wherein the first and third communication modules are configured to enable the second wireless communication link between the unmanned vehicle and the first wearable structure; and
* a second wearable structure configured to be positioned on the user, the second wearable structure being physically displaced away from the first wearable structure, the second wearable structure including a fourth communication module, wherein the first and fourth communication modules are configured to enable a third wireless communication link between the unmanned vehicle and the second wearable structure;
* the processing unit implemented in the unmanned vehicle is further configured to determine a current location of the unmanned vehicle relative to the user in response to the second and third wireless communication links; and
* the system further comprises a drive control unit in communication with the processing unit and configured to adjust a speed and a position of the unmanned vehicle to move the unmanned vehicle from the current location to a predefined location relative to the user.
| 4. The system of claim 3 wherein the predefined location is included in the user message, the user message is an audible message from the user, and at least one of the first and second wearable structures comprises a microphone configured to capture the audible message from the user and at least one of the third and fourth communication modules is configured to communicate the audible message with the predefined location via at least one of the second and third communication links.
| 5. The system of any preceding claim further comprising a camera implemented in the first vehicle and configured to capture motion of a user and provide visual information of the user to the processing unit, wherein the processing unit is further configured to determine the user message from the visual information.
| 6. The system of any preceding claim wherein the first vehicle is an unmanned vehicle, and the system further comprises a camera implemented in the unmanned vehicle and configured to capture an ambient environment visible from the camera and provide visual information of the ambient environment to the user, and the user message is an audible message from the user responsive to the visual information.
| 7. The system of any preceding claim wherein the user message is a first user message, the V2X message is a first V2X message, and:
* the second communication module is further configured to receive a second V2X message via the first wireless communication link;
* the processing unit is further configured to convert the second V2X message to a second user message for communication of the second user message from the first vehicle to the electronic device; and
* the electronic device further comprises a speaker configured to output the second user message as an audible message to the user.
| 8. The system of any preceding claim wherein a user is positioned in the first vehicle and the first communication system comprises a microphone for capturing the user message as an audible message from the user, wherein the processing unit is configured to convert the user message to the V2X message for transmission via the wireless communication link.
| 9. The system of claim 8 wherein the V2X message is configured for transmission to a second vehicle having at least semi-autonomous motion capability, the user message includes a voice command from the user configured to influence navigation of the second vehicle, and the V2X message includes the voice command for commanding navigation of the second vehicle.
| 10. The system of claim 8 or 9 wherein the user message is a first user message, the V2X message is a first V2X message, and:
* the second communication module is further configured to receive a second V2X message via the wireless communication link;
* the processing unit is further configured to convert the second V2X message to a second user message; and
* the system further comprises a speaker implemented in the first vehicle configured to output the second user message as an audible message to the user.
| 11. The system of any preceding claim wherein:
* the first communication module is configured to implement a first wireless communication technology to enable receipt of the user message; and
* the second communication module is configured to implement a second wireless communication technology to enable transmission of the V2X message, the second wireless communication technology differing from the first wireless communication technology.
| 12. A method comprising:
* receiving a user message at a first vehicle;
* converting the user message to a vehicle-to-everything (V2X) message at the first vehicle; and
* transmitting the V2X message from the first vehicle via a wireless communication link.
| 13. The method of claim 12 wherein the wireless communication link is a first wireless communication link, and the method further comprises:
* enabling a second wireless communication link between the first vehicle and an electronic device positioned proximate a user for communication of the user message from the user to the first vehicle.
| 14. The method of claim 13 wherein the first vehicle is an unmanned vehicle, and the method further comprises:
* positioning first and second wearable structures of the electronic device on the user, the first and second wearable structures being physically displaced away from one another;
* enabling a second wireless communication link between the first wearable structure and the unmanned vehicle;
* enabling a third wireless communication link between the second wearable structure and the unmanned vehicle;
* determining a current location of the unmanned vehicle relative to the target in response to the second and third wireless communication links; and
* adjusting a speed and a position of the unmanned vehicle to move the unmanned vehicle from the current location to a predefined location relative to the user.
| 15. The method of claim 14 further comprising:
* capturing an audible message from the user at the electronic device, the predefined location being included in the audible message; and
* communicating the audible message with the predefined location via at least one of the second and third communication links. | The system comprises a first communication module to receive a user message. A processing unit is provided to convert the user message to a vehicle-to-everything message. The first communication module, the processing unit, and the second communication module are implemented in a first vehicle. The second communication module is provided for transmitting the V2X message from the first vehicle through a wireless communication link. A camera implemented in the vehicle captures motion of the user and provides visual information of the user to the processing unit. An INDEPENDENT CLAIM is included for a method for enabling communication between human users and vehicles. System for enabling communication between human users and vehicles. System for enabling communication between human users and vehicles having semi-autonomous motion capability and other devices equipped with V2X capability by conversion of user messages e.g. voice and gesture, to vehicle-to-everything messages and vice versa. The drawing shows a schematic view of a system for enabling communication between human users and vehicles.22Electronic device 24Appropriate user 26On-board drone 30First wearable structure 41First location
Please summarize the input | IDENTIFYING A STOPPING PLACE FOR AN AUTONOMOUS VEHICLEAmong other things, a vehicle is caused to drive autonomously through a road network toward a defined goal position. Current information is analyzed about potential stopping places in the vicinity of the goal position, to make a choice of a currently selected stopping place that is acceptable and feasible. The vehicle is caused to drive autonomously toward the currently selected stopping place. The activities are repeated until the vehicle stops at a currently selected stopping place. | The method involves causing a vehicle (10) to drive autonomously through a road network toward a defined goal position (102). Current information about potential stopping places in vicinity of the goal position is analyzed to make choice of a currently selected stopping place that is acceptable and feasible by applying a predefined strategy for choosing the currently selected stopping place. The vehicle is caused to drive autonomously toward the currently selected stopping place until the vehicle stops at the currently selected stopping place. An INDEPENDENT CLAIM is also included for an autonomous vehicle. Method for identifying stopping places for an autonomous vehicle (claimed). The method enables providing the passenger with option of switching the autonomous vehicle from an autonomous mode to a partially or fully manual mode, so that the passenger can locate an acceptable feasible stopping place. The method enables providing autonomous driving capability to safely and reliably drive through a road environment to the goal position while avoiding vehicles, pedestrians, cyclists, and other obstacles and obeying rules of the road. The drawing shows a schematic view of a map. 10Vehicle100Selected stopping point102Defined goal position104Passenger132Obstacle |
Please summarize the input | Identifying a stopping place for an autonomous vehicleAmong other things, stored data is maintained indicative of potential stopping places that are currently feasible stopping places for a vehicle within a region. The potential stopping places are identified as part of static map data for the region. Current signals are received from sensors or one or more other sources current signals representing perceptions of actual conditions at one or more of the potential stopping places. The stored data is updated based on changes in the perceptions of actual conditions. The updated stored data is exposed to a process that selects a stopping place for the vehicle from among the currently feasible stopping places.The invention claimed is:
| 1. A computer-implemented method comprising:
receiving, by one or more processors, static map data for a region, wherein the static map data identifies one or more potential stopping places for a vehicle within the region;
maintaining, by the one or more processors, stored data indicative of one or more currently feasible stopping places for the vehicle within the region, wherein the one or more currently feasible stopping places are a subset of the one or more potential stopping places, and wherein at least one potential stopping place of the one or more potential stopping places is determined to be a currently feasible stopping place based on:
an amount of time elapsed since the potential stopping place was determined to be infeasible for stopping exceeding a first threshold value,
a reason for the determination that the potential stopping place is infeasible for stopping, and
at least one of a historical level of demand for parking in a vicinity of the potential stopping place being less than a second threshold value or traffic volume in the vicinity of the potential stopping place being less than a third threshold value;
receiving from one or more sensors or one or more other sources current signals representing perceptions of actual conditions at the one or more currently feasible stopping places;
updating, by the one or more processors, the stored data based on the perceptions of actual conditions to include one or more updated currently feasible stopping places; and
exposing, by the one or more processors, the updated stored data to a process that selects a stopping place for the vehicle from among the one or more updated currently feasible stopping places.
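Claim 1 re-admits a previously infeasible stopping place based on elapsed time, the reason for infeasibility, and demand or traffic thresholds. A minimal sketch of that filter, with placeholder field names and threshold values that are not taken from the patent, could look like this:

    def is_currently_feasible(place, now,
                              min_elapsed_s=600.0,
                              demand_threshold=0.5,
                              traffic_threshold=0.5):
        """Re-admit a potential stopping place using the criteria in claim 1.

        `place` is assumed to carry: last_infeasible_time, infeasible_reason,
        historical_demand, traffic_volume (all illustrative field names).
        """
        elapsed = now - place["last_infeasible_time"]
        if elapsed <= min_elapsed_s:
            return False
        # Transient reasons (e.g. a temporarily parked car) may clear on their
        # own; permanent ones (e.g. a construction closure) should not.
        if place["infeasible_reason"] in {"construction", "permanent_closure"}:
            return False
        return (place["historical_demand"] < demand_threshold
                or place["traffic_volume"] < traffic_threshold)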
| 2. The method of claim 1 comprising:
discretizing, by the one or more processors, the one or more potential stopping places as a finite number of points within the region.
| 3. The method of claim 2 comprising: defining, by the one or more processors, the potential stopping place as a shape containing at least one of the points, the potential stopping place configured to accommodate a footprint of the vehicle.
| 4. The method of claim 3 comprising: attributing, by the one or more processors, an orientation to the shape, the orientation corresponding to a direction of traffic flow at the potential stopping place.
| 5. The method of claim 2 comprising:
initializing, by the one or more processors, the one or more potential stopping places as one or more stopping places expected to be feasible based on prior signals from the one or more sensors, the prior signals representing past perceptions of past actual conditions at some of the one or more potential stopping places.
| 6. The method of claim 1 in which the one or more sensors comprise at least one sensor that is physically located on the vehicle.
| 7. The method of claim 1 in which the one or more sensors comprise at least one sensor that is physically remote from the vehicle,
wherein the at least one sensor is located inside a parking garage.
| 8. The method of claim 1 in which the current signals received from the one or more sensors are received through vehicle-to-vehicle or vehicle-to-infrastructure communication.
| 9. The method of claim 1 in which the one or more other sources comprise crowd-sourced data sources.
| 10. The method of claim 1 in which the vehicle is part of a fleet of vehicles managed from a central server and the method comprises the server distributing information received from sensors at one of the vehicles to other vehicles of the fleet.
| 11. An autonomous vehicle, comprising:
one or more processors;
one or more sensors; and
one or more data storage devices including instructions that when executed by the one or more processors, cause the autonomous vehicle to perform functions comprising:
receiving static map data for a region, wherein the static map data identifies one or more potential stopping places for a vehicle within the region;
maintaining stored data indicative of one or more currently feasible stopping places for the vehicle within the region, wherein the one or more currently feasible stopping places are a subset of the one or more potential stopping places, and wherein at least one potential stopping place of the one or more potential stopping places is determined to be a currently feasible stopping place based on:
an amount of time elapsed since the potential stopping place was determined to be infeasible for stopping exceeding a first threshold value,
a reason for the determination that the potential stopping place is infeasible for stopping, and
at least one of a historical level of demand for parking in a vicinity of the potential stopping place being less than a second threshold value or traffic volume in the vicinity of the potential stopping place being less than a third threshold value;
receiving from the one or more sensors or one or more other sources current signals representing perceptions of actual conditions at the one or more currently feasible stopping places;
updating the stored data based on the perceptions of actual conditions to include one or more updated currently feasible stopping places; and exposing the updated stored data to a process that selects a stopping place for the vehicle from among the one or more updated currently feasible stopping places.
| 12. The autonomous vehicle of claim 11, wherein the functions comprise:
initializing the potential stopping places as all of the potential stopping places identified as part of the static map data for the region.
| 13. The autonomous vehicle of claim 11, wherein the functions comprise:
discretizing the potential stopping places as a finite number of points within the region corresponding to potential stopping places.
| 14. The autonomous vehicle of claim 13, wherein the functions comprise:
defining a potential stopping place as a shape containing one of the points, the shape corresponding to a footprint of the vehicle.
| 15. The autonomous vehicle of claim 14, wherein the functions comprise:
attributing an orientation to the shape, the orientation corresponding to a direction of traffic flow.
| 16. The autonomous vehicle of claim 13, wherein the functions comprise:
initializing the potential stopping places as potential stopping places expected to be feasible based on prior signals from the one or more sensors representing perceptions of actual conditions at one or more of the potential stopping places.
| 17. The autonomous vehicle of claim 11 in which the current signals received from the one or more sensors are received through vehicle-to-vehicle or vehicle-to-infrastructure communication.
| 18. A non-transitory computer readable medium storing instructions thereon that, when executed by one or more processors, cause the one or more processors to perform functions comprising:
receiving static map data for a region, wherein the static map data identifies one or more potential stopping places for a vehicle within the region;
maintaining stored data indicative of one or more currently feasible stopping places for the vehicle within the region, wherein the one or more currently feasible stopping places are a subset of the one or more potential stopping places, and wherein at least one potential stopping place of the one or more potential stopping places is determined to be a currently feasible stopping place based on:
an amount of time elapsed since the potential stopping place was determined to be infeasible for stopping exceeding a first threshold value,
a reason for the determination that the potential stopping place is infeasible for stopping, and
at least one of a historical level of demand for parking in a vicinity of the potential stopping place being less than a second threshold value or traffic volume in the vicinity of the potential stopping place being less than a third threshold value;
receiving from one or more sensors or one or more other sources current signals representing perceptions of actual conditions at the one or more currently feasible stopping places;
updating the stored data based on the perceptions of actual conditions to include one or more updated currently feasible stopping places; and
exposing the updated stored data to a process that selects a stopping place for the vehicle from among the one or more updated currently feasible stopping places. | The computer-based method involves maintaining stored data indicative of potential stopping places that are currently feasible stopping places for a vehicle within a region. Current signals representing perceptions of actual conditions at several potential stopping places are received from the sensors (24) or several other sources. The stored data is updated based on changes in the perceptions of actual conditions. The updated stored data is exposed to a process that selects a stopping place for the vehicle from among the currently feasible stopping places. Method for identifying stopping places for an autonomous vehicle. The information received from the device of the passenger includes an indication that the time spent searching for an acceptable stopping place is acceptable to the passenger. The AV system is the set of elements or components, located on the autonomous vehicle or at other locations, that enables the autonomous vehicle to operate. Stopping places that are closer to curbs are generally preferred, as they allow the passenger to access the AV more easily. The drawing shows a block diagram of the method of identifying stopping places for an autonomous vehicle. 10Autonomous vehicle12Road environment14Global position24Sensors34Data base
Please summarize the input | INTERVENTION IN OPERATION OF A VEHICLE HAVING AUTONOMOUS DRIVING CAPABILITIESAmong other things, a determination is made that intervention in an operation of one or more autonomous driving capabilities of a vehicle is appropriate. Based on the determination, a person is enabled to provide information for an intervention. The intervention is caused in the operation of the one or more autonomous driving capabilities of the vehicle.|1. A vehicle comprising:
at least one processor; and
a non-transitory computer-readable storage medium storing instructions which when executed by the at least one processor cause the at least one processor to:
operate the vehicle in an autonomous mode;
receive a command using a vehicle-to-infrastructure (V2I) communication device of the vehicle, the command instructing the vehicle to maneuver to a goal location;
determine that the vehicle is unable to convert the command into machine instructions to operate the vehicle to maneuver to the goal location; and
responsive to determining that the vehicle is unable to convert the command into machine instructions, transmit a teleoperation request to a teleoperation server.
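Claim 1 describes a fallback: if a V2I command cannot be converted into machine instructions, the vehicle sends a teleoperation request, which per claims 2-3 may carry the current location and trajectory sampling points. The sketch below is one hypothetical shape for that logic; all method names on the `vehicle` object are assumptions made for illustration.

    def handle_v2i_command(command, vehicle):
        """Fallback flow from claim 1: try to execute, else ask for teleoperation.

        `vehicle` is assumed to expose try_plan_to(goal), current_location(),
        sample_trajectory_points() and send_teleop_request(); these names are
        illustrative, not from the patent.
        """
        plan = vehicle.try_plan_to(command["goal_location"])
        if plan is not None:
            return plan  # machine instructions the AV can execute directly
        # Unable to convert the command: escalate to the teleoperation server
        # with the context the dependent claims say the request may carry.
        request = {
            "reason": "command_not_convertible",
            "goal_location": command["goal_location"],
            "current_location": vehicle.current_location(),
            "trajectory_sampling_points": vehicle.sample_trajectory_points(),
        }
        vehicle.send_teleop_request(request)
        return None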
| 2. The vehicle of claim 1, wherein the teleoperation request comprises a current location of the vehicle.
| 3. The vehicle of claim 1, wherein the teleoperation request comprises one or more trajectory sampling points for the vehicle.
| 4. The vehicle of claim 1, wherein maneuvering to the goal location comprises:
treating a current location of the vehicle as prior knowledge; and
using an inference algorithm to update the current location of the vehicle based on the command.
| 5. The vehicle of claim 1, wherein maneuvering to the goal location comprises inferring a speed profile from the command.
| 6. The vehicle of claim 1, wherein maneuvering to the goal location comprises inferring a steering angle from the command using a learning algorithm.
| 7. The vehicle of claim 1, wherein converting the command into machine instructions comprises enabling, editing or disabling a hardware component or a software process.
| 8. The vehicle of claim 1, wherein converting the command into machine instructions comprises overwriting a travel preference or a travel rule.
| 9. The vehicle of claim 1, wherein converting the command into machine instructions comprises editing data comprising one or more of a map, sensor data in the vehicle or a related AV system, trajectory data in the vehicle or a related AV system, vision data in the vehicle or a related AV system, or any past data in the vehicle or a related AV system.
| 10. A non-transitory computer-readable storage medium storing instructions, which when executed by one or more processors cause the one or more processors to:
operate a vehicle in an autonomous mode;
receive a command using a vehicle-to-infrastructure (V2I) communication device of the vehicle, the command instructing the vehicle to maneuver to a goal location;
determine that the vehicle is unable to convert the command into machine instructions to operate the vehicle to maneuver to the goal location; and
responsive to determining that the vehicle is unable to convert the command into machine instructions, transmit a teleoperation request to a teleoperation server.
| 11. The non-transitory computer-readable storage medium of claim 10, wherein the teleoperation request comprises a current location of the vehicle.
| 12. The non-transitory computer-readable storage medium of claim 10, wherein the teleoperation request comprises one or more trajectory sampling points for the vehicle.
| 13. The non-transitory computer-readable storage medium of claim 10, wherein maneuvering to the goal location comprises:
treating a current location of the vehicle as prior knowledge; and
using an inference algorithm to update the current location of the vehicle based on the command.
| 14. The non-transitory computer-readable storage medium of claim 10, wherein maneuvering to the goal location comprises inferring a speed profile from the command.
| 15. The non-transitory computer-readable storage medium of claim 10, wherein maneuvering to the goal location comprises inferring a steering angle from the command using a learning algorithm.
| 16. The non-transitory computer-readable storage medium of claim 10, wherein converting the command into machine instructions comprises enabling, editing or disabling a hardware component or a software process.
| 17. The non-transitory computer-readable storage medium of claim 10, wherein converting the command into machine instructions comprises overwriting a travel preference or a travel rule.
| 18. The non-transitory computer-readable storage medium of claim 10, wherein converting the command into machine instructions comprises editing data comprising one or more of a map, sensor data in the vehicle or a related AV system, trajectory data in the vehicle or a related AV system, vision data in the vehicle or a related AV system, or any past data in the vehicle or a related AV system.
| 19. A method comprising:
operating, by one or more processors, a vehicle in an autonomous mode;
receiving, by the one or more processors, a command using a vehicle-to-infrastructure (V2I) communication device of the vehicle, the command instructing the vehicle to maneuver to a goal location;
determining, by the one or more processors, that the vehicle is unable to convert the command into machine instructions to operate the vehicle to maneuver to the goal location; and
responsive to determining that the vehicle is unable to convert the command into machine instructions, transmitting, by the one or more processors, a teleoperation request to a teleoperation server.
| 20. The method of claim 19, wherein the teleoperation request comprises a current location of the vehicle. | The vehicle (10) comprises at least one processor. A non-transitory computer-readable storage medium stores instructions which are executed by the at least one processor. A command is received using a vehicle-to-infrastructure (V2I) communication device of the vehicle. The command instructs the vehicle to maneuver to a goal location. The vehicle determines that it is unable to convert the command into machine instructions to operate the vehicle to maneuver to the goal location. A teleoperation request is transmitted to a teleoperation server. The teleoperation request provides a current location of the vehicle and multiple trajectory sampling points for the vehicle. INDEPENDENT CLAIMS are included for the following: a non-transitory computer-readable storage medium storing instructions; and a method involving operating a vehicle in an autonomous mode. Vehicle. The vehicle ensures the resulting transition exhibits smooth and gradual changes in driving orientations. The drawing shows a block diagram of the AV system. 10Vehicle24Sensor28Communication devices40Computing device42Processor44Interface devices
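A minimal sketch of the fallback flow in the claims above: the vehicle tries to convert a V2I command into machine instructions and, when it cannot, transmits a teleoperation request carrying its current location and trajectory sampling points (claims 1-3). The planner stub, the field names, and the reasons a conversion might fail are assumptions made for the example, not the patented implementation.

```python
from dataclasses import dataclass, field

@dataclass
class TeleoperationRequest:
    current_location: tuple                                           # claim 2
    trajectory_sampling_points: list = field(default_factory=list)   # claim 3

def try_convert_to_machine_instructions(command):
    """Stand-in for the on-board planner: returns primitive instructions,
    or None when the command cannot be converted (hypothetical failure reasons)."""
    if command.get("goal") is None or command.get("blocked_by_unknown_obstacle"):
        return None
    return [("steer_to", command["goal"]), ("set_speed", command.get("speed", 5.0))]

def handle_v2i_command(command, current_location, send_to_teleop_server):
    instructions = try_convert_to_machine_instructions(command)
    if instructions is not None:
        return instructions                                   # execute autonomously
    # Fallback: request remote help, including location and sampling points.
    request = TeleoperationRequest(
        current_location=current_location,
        trajectory_sampling_points=[current_location, command.get("goal")],
    )
    send_to_teleop_server(request)
    return []

if __name__ == "__main__":
    sent = []
    cmd = {"goal": (47.63, -122.33), "blocked_by_unknown_obstacle": True}
    handle_v2i_command(cmd, (47.62, -122.33), sent.append)
    print(len(sent))  # -> 1: a teleoperation request was issued
```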
Please summarize the input | V2V latency measurement reporting to traffic server for optimizing the inter vehicle distance for self-driving carsMethods and apparatus, including computer program products, are provided for autonomous vehicles. In one aspect there is provided a method. The method may include detecting, at an autonomous vehicle, at least one vehicle within a certain range of the autonomous vehicle; measuring a latency representative of a time to communicate via a wireless link to the at least one detected vehicle; reporting the measured latency to the network; and receiving, by the autonomous vehicle, information to enable the autonomous vehicle to determine an intervehicle distance for configuration at the autonomous vehicle. Related apparatus, systems, methods, and articles are also described.What is claimed:
| 1. A method, comprising:
detecting, at an autonomous vehicle, at least one vehicle within a certain range of the autonomous vehicle;
measuring a latency representative of a time for the at least one detected vehicle to respond to a message sent by the autonomous vehicle via a wireless link;
reporting, to a network, the measured latency; and
receiving, at the autonomous vehicle, information from the network, the information including an intervehicle distance for configuration at the autonomous vehicle, the intervehicle distance being determined at the network, and the intervehicle distance being determined based at least on the measured latency reported to the network.
| 2. The method of claim 1, wherein the measuring of the latency includes sending, by the autonomous vehicle, the message to the at least one detected vehicle, and wherein the latency is determined based at least on a first time when the autonomous vehicle sent the message and a second time when the autonomous vehicle receives, from the at least one detected vehicle, a response to the message.
| 3. The method of claim 2, further comprising:
in response to the at least one detected vehicle failing to respond to the message within a threshold quantity of time, reporting, to the network, an indication that the at least one detected vehicle is non-autonomous, wherein the intervehicle distance being determined further based on the reported indication.
| 4. The method of claim 1, wherein the information includes a value representative of the intervehicle distance for configuration at the autonomous vehicle.
| 5. The method of claim 1, further comprising:
configuring, by the autonomous vehicle, operation based on the intervehicle distance.
| 6. The method of claim 1, wherein the intervehicle distance represents a minimum and/or an optimum distance between the autonomous vehicle and the at least one vehicle.
| 7. An apparatus, comprising:
at least one processor; and
at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following:
detect, at the apparatus, at least one vehicle within a certain range of the apparatus;
measure a latency representative of a time for the at least one detected vehicle to respond to a message sent by the apparatus via a wireless link;
report, to a network, the measured latency; and
receive, at the apparatus, information from the network, the information including an intervehicle distance for configuration at the apparatus, the intervehicle distance being determined at the network, and the intervehicle distance being determined based at least on the measured latency reported to the network.
| 8. The apparatus of claim 7, wherein the apparatus measures the latency by at least sending, to the at least one detected vehicle, the message.
| 9. The apparatus of claim 8, wherein the latency is determined based at least on a first time when the apparatus sent the message and a second time when the apparatus receives, from the at least one detected vehicle, a response to the message.
| 10. The apparatus of claim 9, wherein the apparatus is further configured to at least:
in response to the at least one detected vehicle failing to respond to the message within a threshold quantity of time, report, to the network, an indication that the at least one detected vehicle is non-autonomous, the intervehicle distance being determined further based on the reported indication.
| 11. The apparatus of claim 8, wherein the apparatus reports, to the network, the measured latency in response to receiving the response from the at least one detected vehicle.
| 12. The apparatus of claim 7, wherein the received information includes a value representative of the intervehicle distance for configuration at the apparatus.
| 13. The apparatus of claim 7, wherein the apparatus is further configured to at least configure, based on the intervehicle distance, an operation of the apparatus.
| 14. The apparatus of claim 7, wherein the intervehicle distance represents a minimum and/or an optimum distance between the apparatus and the at least one detected vehicle.
| 15. An apparatus, comprising:
at least one processor; and
at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following:
receive, at the apparatus, a latency measurement representative of a time for at least one vehicle to respond to a message sent by an autonomous vehicle, via a wireless link, the at least one vehicle detected at the autonomous vehicle to be within a certain range of the autonomous vehicle, the latency measurement being determined by the autonomous vehicle;
determine, based at least on the received latency measurement, an intervehicle distance; and
send, to the autonomous vehicle, information including the intervehicle distance for configuration at the autonomous vehicle.
| 16. The apparatus of claim 15, wherein the intervehicle distance is determined further based on an indication that the at least one detected vehicle is non-autonomous.
| 17. The apparatus of claim 16, wherein the apparatus is further configured to at least receive, from the autonomous vehicle, the indication that the at least one detected vehicle is non-autonomous, and wherein the autonomous vehicle sends the indication in response to the at least one detected vehicle failing to respond to a message from the autonomous vehicle within a threshold quantity of time.
| 18. The apparatus of claim 15, wherein the intervehicle distance is determined further based on a road condition, a weather condition, a characteristic of the autonomous vehicle, and/or a characteristic of the at least one detected vehicle.
| 19. The apparatus of claim 15, wherein the information includes a value representative of the intervehicle distance for configuration at the autonomous vehicle. | The method (400) involves detecting (405) a vehicle within a certain range of an autonomous vehicle and measuring (415) a latency representative of a time to communicate through a wireless link to the detected vehicle. The measured latency is reported (420) to the network. The information is received (425) to enable the autonomous vehicle to determine an inter-vehicle distance for configuration at the autonomous vehicle. INDEPENDENT CLAIMS are included for the following: an apparatus for controlling autonomous vehicles; and a non-transitory computer-readable storage medium with program code for controlling autonomous vehicles. Method for controlling autonomous vehicles. The information is received to enable the autonomous vehicle to determine an inter-vehicle distance for configuration at the autonomous vehicle, thus traffic congestion is alleviated while improving road safety. The drawing shows a flowchart of a process for latency measurement reporting. 400Autonomous vehicle controlling method405Detecting a vehicle415Measuring a latency420Reporting the measured latency425Receiving the information
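The claims above describe measuring a round-trip V2V latency, reporting it to the network, and receiving a network-determined intervehicle distance. The sketch below shows one way those two pieces could look; the distance formula (speed times latency plus a reaction margin) and the fixed gap for non-responding, presumably non-autonomous neighbours are illustrative assumptions rather than the rule prescribed by the patent.

```python
import time

def measure_v2v_latency(send_message, wait_for_response, timeout_s=0.5):
    """Round-trip latency to a detected neighbour. A non-response within the
    timeout marks the neighbour as presumably non-autonomous."""
    t_sent = time.monotonic()
    send_message("ping")
    responded = wait_for_response(timeout_s)
    latency = time.monotonic() - t_sent
    return (latency if responded else None), responded

def network_intervehicle_distance(latency_s, speed_mps, responded,
                                  reaction_margin_s=0.5, non_auto_gap_m=60.0):
    """Server-side rule of thumb (illustrative): cover the distance travelled
    during the communication latency plus a margin; non-responders get a
    fixed conservative gap."""
    if not responded or latency_s is None:
        return non_auto_gap_m
    return speed_mps * (latency_s + reaction_margin_s)

if __name__ == "__main__":
    # Simulated link: the neighbour answers after roughly 20 ms.
    latency, ok = measure_v2v_latency(lambda msg: None,
                                      lambda timeout: time.sleep(0.02) or True)
    print(round(network_intervehicle_distance(latency, speed_mps=25.0, responded=ok), 1))
```

In this toy run the ~20 ms round trip at 25 m/s yields a gap of about 13 m; a neighbour that never answers would instead get the conservative 60 m default.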
Please summarize the input | Positioning system based on geofencing frameworkThis provides methods and systems for the global navigation satellite system (GNSS) combined with the dead-reckoning (DR) technique, which is expected to provide a vehicle positioning solution, but it may contain an unacceptable amount of error due to multiple causes, e.g., atmospheric effects, clock timing, and multipath effect. Particularly, the multipath effect is a major issue in urban canyons. This invention overcomes these and other issues in the DR solution by a geofencing framework based on road geometry information and multiple supplemental kinematic filters. It guarantees road-level accuracy and enables certain V2X applications which do not require sub-meter accuracy, e.g., signal phase timing, intersection movement assist, curve speed warning, reduced speed zone warning, and red-light violation warning. Automated vehicles are another use case. This is used for autonomous cars and vehicle safety, shown with various examples/variations.The invention claimed is:
| 1. A method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle, said method implemented by one or more processors, said method comprising:
receiving vehicle states for said vehicle, said vehicle states including at least data of a position, a speed, a heading angle and a yaw rate of said vehicle, said yaw rate having a yaw rate bias;
removing said yaw rate bias of said yaw rate;
responsive to removing said yaw rate bias of said yaw rate, determining whether a reference road exists, said reference road providing data of at least a road heading angle and a road curvature;
in case said reference road exists, determining whether said existing reference road is valid;
in case said reference road does not exist or said existing reference road is invalid, searching for said reference road;
in case said reference road is found based on said search or said existing reference road is valid, determining whether a lane change is detected for said vehicle;
in case said lane change is detected for said vehicle, performing retrospective integrations of said speed and yaw rate for said vehicle;
determining a reference yaw rate based on said road curvature and said speed of said vehicle;
determining whether a yaw rate error between said yaw rate and said reference yaw rate is less than a yaw rate threshold;
in case said yaw rate error is less than said yaw rate threshold, forcing said heading angle of said vehicle to said road heading angle;
updating said vehicle states;
determining geofencing conditions of said position, speed, heading angle and yaw rate of said vehicle;
determining whether said geofencing conditions are met;
in case said geofencing conditions are met, applying geofencing to limit said position of said vehicle between road boundaries of said reference road;
updating said data of said vehicle's position; and
outputting said data of said vehicle's position to an upper layer of said road vehicle navigation system for said vehicle.
| 2. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said road vehicle navigation system works with or communicates with a global navigation satellite system.
| 3. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said vehicle is interior to said reference road.
| 4. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein a distance from said vehicle to a next intersection is greater than a first threshold.
| 5. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said vehicle's speed is greater than a second threshold.
| 6. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein applying geofencing comprises: timely geofencing.
| 7. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein applying geofencing comprises: predicted geofencing.
| 8. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining an incorrect position of said vehicle and correcting the determined position for said vehicle based on reducing a lateral error.
| 9. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein in case said reference road does not exist or said existing reference road is invalid, said search for said reference road comprises:
determining candidate reference roads where said vehicle's position is interior to end points of said candidate reference roads;
determining, from said candidate reference roads, a candidate reference road satisfying a heading error below a threshold; and
adjusting an order of said end points to be consistent with a travel direction of said vehicle.
| 10. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining a lateral error for said vehicle.
| 11. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: in case said reference road is not found based on said search, outputting said data of said vehicle's position to said upper layer of said road vehicle navigation system for said vehicle.
| 12. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining a longitudinal error for said vehicle.
| 13. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining a predicted position for said vehicle based on at least one of lateral correction and longitudinal correction.
| 14. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining an average value of said yaw rate bias within a moving time window, and correcting said yaw rate bias based on said determined average value.
| 15. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining a sensor temperature, and determining said yaw rate bias that varies with said sensor temperature.
| 16. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining vibration or noise for removing said yaw rate bias.
| 17. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: applying a security layer for said road vehicle navigation system for said vehicle.
| 18. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: applying an application layer for said road vehicle navigation system for said vehicle.
| 19. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: applying a network layer for said road vehicle navigation system for said vehicle.
| 20. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: applying a physical layer for said road vehicle navigation system for said vehicle. | The method involves determining, by a processor, whether a yaw rate error is small. The vehicle's heading angle is forced to the road heading angle when the yaw rate error is small. Vehicle states are updated by the processor. Geo-fencing conditions are evaluated by the processor, and a determination is made whether the geo-fencing conditions are met. Geo-fencing for the vehicle is applied in case the geo-fencing conditions are met. The vehicle's position data is updated and outputted to an upper layer of a road vehicle navigation system for the vehicle. Method for positioning a vehicle, i.e. an autonomous car, based on a geo-fencing framework for a road vehicle navigation system for safety. The method enables identifying vehicle movements accurately and generating better results in a shorter time period. The method guarantees road-level accuracy and enables vehicle-to-everything (V2X) applications that do not require sub-meter accuracy, such as signal phase timing, intersection movement assist, curve speed warning, reduced speed zone warning and red-light violation warning. The method provides a reference road, i.e. a line or curve connecting two adjacent intersections, to correct the vehicle position and heading, with the necessary information including the coordinates of the end points of the reference road, the road heading angle, the curvature and the road width. The method enables performing a weighted-averaging process based on redundancies between the coverage of different units, giving more weight to more reliable units or sources, or higher weights to results that are closer to the center of the curve representing the distribution of values, thus eliminating or reducing fringe results or erroneous data. The drawing shows a flow diagram illustrating a development of fully automated vehicles.
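To make the geofencing step concrete, the sketch below clamps a dead-reckoned position to the boundaries of a straight reference-road segment and forces the heading to the road heading when the yaw-rate error against the curvature-derived reference yaw rate is small. Planar x/y coordinates, a straight segment, and the numeric thresholds are simplifying assumptions; the patent's reference roads may be curves and its geofencing conditions include further checks (distance to the next intersection, minimum speed).

```python
import math

def geofence_position(pos_xy, road_start, road_end, road_width_m):
    """Project a dead-reckoned position onto the reference road segment and
    clamp its lateral offset to the road boundaries (illustrative sketch;
    coordinates are planar x/y in metres)."""
    (x, y), (x1, y1), (x2, y2) = pos_xy, road_start, road_end
    dx, dy = x2 - x1, y2 - y1
    seg_len = math.hypot(dx, dy)
    ux, uy = dx / seg_len, dy / seg_len          # unit vector along the road
    s = (x - x1) * ux + (y - y1) * uy            # longitudinal coordinate
    lateral = (x - x1) * -uy + (y - y1) * ux     # signed lateral offset
    half_width = road_width_m / 2.0
    lateral = max(-half_width, min(half_width, lateral))   # geofencing step
    return (x1 + s * ux - lateral * uy, y1 + s * uy + lateral * ux)

def corrected_heading(vehicle_heading, road_heading, yaw_rate, speed, curvature,
                      yaw_rate_threshold=0.02):
    """Force the heading to the road heading when the yaw-rate error is small."""
    reference_yaw_rate = speed * curvature
    if abs(yaw_rate - reference_yaw_rate) < yaw_rate_threshold:
        return road_heading
    return vehicle_heading

if __name__ == "__main__":
    print(geofence_position((5.0, 4.0), (0.0, 0.0), (100.0, 0.0), road_width_m=3.6))
    # -> (5.0, 1.8): the 4 m lateral drift is clamped to the 1.8 m half-width
```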
Please summarize the input | DRIVE CONTROL METHOD AND DRIVE CONTROL DEVICEA drive control method uses a drive control device to control the operation of a host vehicle using at least two autonomous driving modes that have different levels of driving assistance. The drive control method includes shifting the autonomous driving mode from a first mode to a second mode in which the driving assistance level of the second mode is higher than the driving assistance level of the first mode upon detecting a preceding vehicle in front of the host vehicle while traveling in the first mode. In this drive control method, a detectable distance to the preceding vehicle for shifting to the second mode is greater than a followable distance to the preceding vehicle when following travel is permitted in the first mode.|1. A drive control method having at least two autonomous driving modes having different driving assistance levels, the drive control method comprising:
shifting the autonomous driving mode from a first mode to a second mode in which the driving assistance level of the second mode is higher than the driving assistance level of the first mode upon detecting a preceding vehicle traveling in front of the host vehicle while traveling in the first mode, wherein
a detectable distance to the preceding vehicle for shifting to the second mode is greater than a followable distance to the preceding vehicle when following travel is permitted in the first mode.
| 2. The drive control method according to claim 1, further comprising
calculating a reliability of the preceding vehicle using the drive control device based on a behavior of the preceding vehicle upon detecting the preceding vehicle while the operation of the host vehicle is controlled using the first mode, and
the drive control device not shifting the autonomous driving mode to the second mode upon determining the reliability of the preceding vehicle is less than a predetermined defined value.
| 3. The drive control method according to claim 2, wherein
the drive control device maintains the autonomous driving mode in the first mode upon determining the reliability of the preceding vehicle is less than the defined value.
| 4. The drive control method according to claim 2, wherein
the calculating of the reliability of the preceding vehicle using the drive control device is based on at least one of a lateral displacement amount of the preceding vehicle, a frequency of acceleration or deceleration, and a frequency of an illumination of brake lights.
| 5. (canceled)
| 6. The drive control method according to claim 1, wherein
an upper limit distance of the followable distance is a distance at which the host vehicle and the preceding vehicle can carry out vehicle-to-vehicle communication.
| 7. The drive control method according to claim 1, wherein
the drive control device detects another vehicle as a preceding vehicle traveling in front of the host vehicle when travel history information about the other vehicle is received and the travel history information that is received includes information indicating that the other vehicle was traveling at a point in front of the host vehicle within a prescribed period of time.
| 8. The drive control method according to claim 1, wherein
the drive control device does not shift the autonomous driving mode to the second mode when the host vehicle is traveling in the first mode and a vehicle speed of the host vehicle is greater than or equal to a prescribed speed.
| 9. The drive control method according to claim 1, wherein
the first mode is an autonomous driving mode that requires a driver to visually monitor the surrounding conditions of the host vehicle, and
the second mode is an autonomous driving mode in which the drive control device executes monitoring of the surrounding conditions of the host vehicle.
| 10. The drive control method according to claim 1, wherein
the first mode is a hands-on mode in which steering control by the control device does not operate when the driver is not holding the steering wheel, and
the second mode is a hands-off mode in which steering control by the drive control device operates even if the driver's hands leave the steering wheel.
| 11. The drive control method according to claim 1, wherein
another vehicle is excluded as a preceding vehicle when a ride height of the other vehicle traveling in front of the host vehicle is greater than a ride height of the host vehicle.
| 12. The drive control method according to claim 1, wherein
another vehicle is excluded as a preceding vehicle when the other vehicle traveling in front of the host vehicle is a two-wheeled vehicle.
| 13. The drive control method according to claim 1, wherein
the drive control device is configured to
shift the autonomous driving mode from the first mode to the second mode when a first preceding vehicle is present as the preceding vehicle in a first lane in which the host vehicle travels while the operation of the host vehicle is controlled using the first mode,
determine whether or not the first preceding vehicle and a second preceding vehicle are traveling in the first lane upon detecting a second preceding vehicle traveling in front of the first preceding vehicle is also present in the first lane,
cause the host vehicle to travel behind the second preceding vehicle upon determining that the first preceding vehicle has changed lanes to another lane that is different from the first lane and that the second preceding vehicle continues to travel in the first lane, and
cause the vehicle to travel behind the first preceding vehicle and to change lanes to the other lane upon determining that the first preceding vehicle and the second preceding vehicle changed lanes to the other lane.
| 14. A drive control device comprising:
a control unit configured to control an operation of a host vehicle using at least two autonomous driving modes including a first mode and a second mode that has a driving assistance level that is higher than that of the first mode; and
a preceding vehicle detection unit configured to detect a preceding vehicle traveling in front of the host vehicle,
the control unit being configured to shift the autonomous driving mode from the first mode to the second mode when the operation of the host vehicle is controlled using the first mode and the preceding vehicle detection unit detects the preceding vehicle, wherein
the control unit is configured to use a detectable distance to the preceding vehicle for shifting to the second mode that is greater than a followable distance to the preceding vehicle when following travel is permitted in the first mode. | The method involves controlling the driving of the own vehicle with an operation-control apparatus using automatic driving modes that differ in driving assistance level. The automatic driving modes include a first mode and a second mode whose driving assistance level is higher than that of the first mode. While the operation-control apparatus is controlling the driving of the own vehicle in the first mode, the automatic driving mode is changed from the first mode to the second mode when a preceding vehicle driving in front of the own vehicle is detected. An INDEPENDENT CLAIM is included for an operation-control apparatus. Operation-control method of vehicle. The apparatus increases the number of environments in which the own vehicle can be driven in the automatic driving mode with the relatively high driving assistance level. The operation-control apparatus can lower the automatic driving mode from the second mode to the first mode when the reliability of the preceding vehicle is less than a regulation value. The drawing shows a flowchart illustrating the operation-control process. (Drawing includes non-English language text) S1Step for determining whether automatic driving mode of own vehicle is first modeS2Step for determining whether vehicle speed of own vehicle is more than prescribed speedS3Step for determining whether preceding vehicle detection unit detects preceding vehicle driving in front of own vehicleS4Step for calculating reliability of preceding vehicleS5Step for determining regulation value by which reliability of preceding vehicle is defined previously
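A toy sketch of the mode-shift decision in the record above: the first mode may shift to the second (hands-off) mode only when a preceding vehicle is detected within the detectable distance, the own speed is below a prescribed speed, and the preceding vehicle's reliability clears a threshold; the detectable distance is kept larger than the followable distance, as claim 1 requires. The reliability weights, distances, and speed limit are invented for illustration.

```python
def preceding_vehicle_reliability(lateral_displacements, accel_events, brake_light_events,
                                  window_s=30.0):
    """Toy reliability score: penalise lateral wobble and frequent acceleration or
    brake-light activity observed over a time window (weights are illustrative)."""
    wobble = sum(abs(d) for d in lateral_displacements) / max(len(lateral_displacements), 1)
    score = 1.0 - 0.5 * wobble - 0.05 * (accel_events + brake_light_events) / window_s
    return max(0.0, min(1.0, score))

def next_driving_mode(current_mode, own_speed_kph, preceding_distance_m, reliability,
                      detectable_distance_m=150.0, followable_distance_m=100.0,
                      max_shift_speed_kph=120.0, reliability_threshold=0.6):
    """Return 'first' (driver monitors surroundings) or 'second' (hands-off)."""
    # Claim 1: the detectable distance used for the shift exceeds the followable distance.
    assert detectable_distance_m > followable_distance_m
    if current_mode != "first":
        return current_mode
    if own_speed_kph >= max_shift_speed_kph:            # claim 8: no shift above a prescribed speed
        return "first"
    if preceding_distance_m is None or preceding_distance_m > detectable_distance_m:
        return "first"                                  # no usable preceding vehicle detected
    if reliability < reliability_threshold:             # claims 2-3: stay in the first mode
        return "first"
    return "second"

if __name__ == "__main__":
    rel = preceding_vehicle_reliability([0.1, 0.05, 0.08], accel_events=2, brake_light_events=1)
    print(next_driving_mode("first", own_speed_kph=90, preceding_distance_m=130, reliability=rel))
```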
Please summarize the input | The automatic operating method and automatic controller of a vehicleAccording to the present invention, whether congestion is detected at a predetermined distance ahead of an own vehicle on a travel route of the own vehicle is determined during travel by automatic driving in which the vehicle speed reaches a target vehicle speed, and, in congestion detection in which congestion is determined, a vehicle speed VSP is decreased to be lower than a target vehicle speed VSPt0 of automatic driving in a normal condition in which congestion is not detected.|1. It is an autonomous driving method of a vehicle which controls the vehicle equipped with an internal combustion engine as a drive source,
comprising:
while traveling by autonomous driving that brings a vehicle speed close to a target vehicle speed,
determining whether traffic congestion present at or beyond a predetermined distance ahead of the vehicle on the driving route of the vehicle has been detected;
upon detecting the traffic congestion, setting the target vehicle speed of the autonomous driving lower than at a normal time other than the time of the traffic congestion detection;
controlling the vehicle speed of the vehicle to the target vehicle speed;
causing the vehicle to approach the traffic congestion; and
stopping supply of fuel to the internal combustion engine when the vehicle approaches within a distance shorter than the predetermined distance from the traffic congestion.
| 2. The autonomous driving method of the vehicle according to claim 1, wherein
the predetermined distance is a distance longer than an inter-vehicle distance to a preceding vehicle that is set during follow-up driving in which the vehicle follows the preceding vehicle ahead of it.
| 3. The autonomous driving method of the vehicle according to claim 1, wherein
the predetermined distance is a distance longer than a detectable distance of an on-board sensor mounted so as to be able to recognize a preceding vehicle ahead of the vehicle.
| 4. The autonomous driving method of the vehicle according to any one of claims 1 to 3, wherein
the predetermined distance is extended as the present target vehicle speed becomes higher, lengthening the distance traveled at the target vehicle speed of the autonomous driving that is lower than at the normal time.
| 5. The autonomous driving method of the vehicle according to any one of claims 1 to 4, wherein
the predetermined distance is changed according to an attribute of the road on the driving route.
| 6. The autonomous driving method of the vehicle according to any one of claims 1 to 5, wherein
at the time of the traffic congestion detection, the target vehicle speed is gradually reduced from the present target vehicle speed.
| 7. The autonomous driving method of the vehicle according to any one of claims 1 to 6, wherein
at the time of the traffic congestion detection while the present target vehicle speed is higher than an optimal fuel-consumption vehicle speed of the own vehicle, the target vehicle speed is reduced toward the optimal fuel-consumption vehicle speed.
| 8. The autonomous driving method of the vehicle according to any one of claims 1 to 7, wherein
the traffic congestion is detected based on VICS information, vehicle-to-vehicle communication information, road-to-vehicle communication information, or road traffic information from a portable terminal.
| 9. The autonomous driving method of the vehicle according to any one of claims 1 to 8, wherein
a varying state of the traffic congestion is estimated, and
a deceleration used when reducing the vehicle speed is changed according to a prediction result of the varying state.
| 10. An automatic operating method of a vehicle that controls the vehicle equipped with an internal combustion engine as a drive source, comprising:
determining whether traffic congestion has been detected on the driving route of the own vehicle while traveling by autonomous driving that brings a vehicle speed close to a target vehicle speed;
at the time of the traffic congestion detection in which the traffic congestion is detected, setting the target vehicle speed of the autonomous driving lower than at a normal time other than the time of the traffic congestion detection;
controlling the vehicle speed of the own vehicle to the target vehicle speed from a point at least the predetermined distance before the tail end of the traffic congestion;
causing the own vehicle to approach the traffic congestion; and
stopping supply of fuel to the internal combustion engine when the own vehicle approaches within a distance shorter than the predetermined distance from the traffic congestion.
| 11. An automatic control device of a vehicle that controls the vehicle equipped with an internal combustion engine as a drive source, comprising:
a driving state detection part that detects a driving state of the vehicle;
an operation-control part that sets a control parameter for the autonomous driving that brings a vehicle speed close to a target vehicle speed based on the driving state of the vehicle; and
a traffic congestion detection part that detects traffic congestion on the driving route of the own vehicle while traveling by the autonomous driving,
wherein the operation-control part
sets the control parameter both at the time of the traffic congestion detection, in which the traffic congestion detection part has detected the traffic congestion at or beyond a predetermined distance ahead of the vehicle, and at a normal time other than that time,
sets a first control parameter at the normal time,
sets, at the time of the traffic congestion detection, a second control parameter that reduces the vehicle speed by making the target vehicle speed of the autonomous driving lower than at the normal time,
controls the vehicle speed of the own vehicle to the target vehicle speed using the second control parameter,
causes the own vehicle to approach the traffic congestion, and
stops supply of fuel to the internal combustion engine when the own vehicle approaches within a distance shorter than the predetermined distance from the traffic congestion.
| 12. An automatic operating method of a vehicle, comprising:
determining whether traffic congestion ahead of the vehicle has been detected at or beyond a predetermined distance on the driving route of the own vehicle while traveling by autonomous driving that brings a vehicle speed close to a target vehicle speed;
at the time of the traffic congestion detection in which the traffic congestion is detected, setting the target vehicle speed of the autonomous driving lower than at a normal time other than the time of the traffic congestion detection;
controlling the vehicle speed of the own vehicle to the target vehicle speed; and
causing the own vehicle to approach the traffic congestion.
| 13. An automatic operating method of a vehicle, comprising:
determining, at the time of the traffic congestion detection in which traffic congestion ahead of the vehicle is detected at or beyond a predetermined distance while traveling by autonomous driving that brings a vehicle speed close to a target vehicle speed, whether the traffic congestion is on the driving route of the own vehicle; and
at the time of the traffic congestion detection in which the traffic congestion is detected while traveling at a vehicle speed higher than an optimal fuel-consumption vehicle speed of the own vehicle, reducing the vehicle speed of the own vehicle toward the optimal fuel-consumption vehicle speed in response to the determination that the traffic congestion has been detected. | The method involves determining whether or not a traffic jam has been detected ahead of the predetermined distance on the traveling route of the own vehicle during traveling by automatic driving that brings the vehicle speed close to the target vehicle speed. When the traffic jam is detected, the vehicle speed is reduced below the target vehicle speed of the automatic driving used at the normal time other than the time when the traffic jam is detected. An INDEPENDENT CLAIM is included for an automatic controller of a vehicle. Automatic operating method of vehicle. The deceleration required at the tail end of the traffic jam is reduced, and fuel consumption over the entire automated drive is improved. The drawing shows a graphical view of the change of the vehicle speed. (Drawing includes non-English language text)
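The cleaned-up claims above reduce the target speed toward an optimal fuel-consumption speed once congestion is detected beyond a predetermined distance, then cut fuel once the vehicle closes within a shorter distance of the tail end. The sketch below expresses that as a single per-step planning function; all distances and speeds are illustrative assumptions.

```python
def plan_congestion_approach(distance_to_jam_m, current_target_kph,
                             optimal_fuel_kph=70.0,
                             predetermined_distance_m=2000.0,
                             fuel_cut_distance_m=300.0):
    """One control step: returns (target_speed_kph, fuel_cut).
    The numeric distances and the optimal fuel-consumption speed are illustrative."""
    if distance_to_jam_m is None or distance_to_jam_m > predetermined_distance_m:
        return current_target_kph, False       # normal time: keep the normal target speed
    # Congestion detected at or beyond the predetermined distance:
    # lower the target speed toward the optimal fuel-consumption speed.
    reduced_target = min(current_target_kph, optimal_fuel_kph)
    if distance_to_jam_m <= fuel_cut_distance_m:
        return reduced_target, True            # stop fuel supply; coast toward the tail end
    return reduced_target, False

if __name__ == "__main__":
    for d in (3000, 1500, 250):
        print(d, plan_congestion_approach(d, current_target_kph=100.0))
```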
Please summarize the input | VEHICLE CONTROL METHOD AND CONTROL DEVICEA method for controlling a vehicle in which, when a drive source stop condition is established while a vehicle is traveling, a sailing stop control is executed in which a drive source of the vehicle is stopped, an engaging element provided between the drive source and drive wheels is released, and the vehicle travels under inertia, wherein information on a status of a road on which a host vehicle is to travel is acquired; a determination is made based on the information as to whether there is a section on a route where the sailing stop control can be executed; when a section where the sailing stop control can be executed is present, a power shortage amount, which is a shortage in an amount of power during the sailing stop control, is estimated based on the information; and a battery is charged with power equivalent to the power shortage prior to starting the sailing stop control.|1. A vehicle control method in which, when a drive source stop condition is established while a vehicle is traveling, a sailing stop control is executed in which a drive source of the vehicle is stopped, an engaging element provided between the drive source and a drive wheel is released, and the vehicle travels under inertia, the vehicle control method comprising:
* acquiring (S10) information on a status of a road on which the host vehicle will travel;
* predicting (S20) whether there is a section on a route where the sailing stop control can be executed based on the information;
* upon predicting that the section is present where the sailing stop control can be executed, estimating (S30) a power shortage amount, which is a shortage in an amount of power during the sailing stop control, based on the information; and
* charging a battery with power required to cover the power shortage amount prior to starting the sailing stop control,
* wherein
* the power shortage amount is estimated as being larger when autonomous driving is performed in the section where the sailing stop control is executed as compared to when driver-enabled driving is performed in the section and
* the information includes route information, map information acquired by a navigation system, a travel history of the vehicle, and other information acquired through road-to-vehicle communication and/or vehicle-to-vehicle communication.
| 2. The vehicle control method according to claim 1, further comprising
* estimating a frequency and an amount of operation of an operation system that includes steering or braking while in the section where the sailing stop control is executed based on the information; and
* the power shortage amount is estimated to be larger correspondingly with respect to an increase in the frequency and the amount of the operation.
| 3. The vehicle control method according to any one of claims 1 through 2, wherein
* when the sailing stop control is terminated due to a power deficiency in the section in which the sailing stop control can be executed,
* learning (S110) an actual electrical power consumption and an actual amount of decrease in a battery SOC for the section within which the sailing stop control can be executed, in association with the status of the road and travel history; and
* increasing a subsequent charge amount in the battery prior to starting the sailing stop control, as compared to a present charge amount, based on results of the learning.
| 4. The vehicle control method according to any one of claims 1 through 3, wherein
when the section in which the sailing stop control can be executed is a downwardly sloping road that has a gradient equal to or greater than a prescribed gradient and is of a distance equal to or greater than a prescribed distance, the battery is charged via regeneration with electrical power necessary to cover the power shortage amount after the downwardly sloping road has been entered, and the sailing stop control is started thereafter.
| 5. The vehicle control method according to any one of claims 1 through 4, wherein
when a generator is used to generate power in order to cover the power shortage amount prior to entering the section in which the sailing stop control can be executed, and a fuel economy performance has declined by at least a prescribed amount, generation of power for charging the battery with electrical power necessary to cover the power shortage amount is disallowed, and the sailing stop control in the section in which the sailing stop control can be executed is disallowed.
| 6. The vehicle control method according to any one of claims 1 through 4, wherein
when a generator is used to generate power in order to cover the power shortage amount prior to entering a section within which the sailing stop control can be executed, and a fuel economy performance has declined by at least a prescribed amount, generation of power for charging the battery with electrical power necessary to cover the power shortage amount is disallowed, and, in the section within which the sailing stop control can be executed, sailing idle control in which the engaging element is released without stopping the drive source and the vehicle travels under inertia is executed.
| 7. A control device for controlling a vehicle, the control device comprising:
* a drive source (1) for use in travel;
* an automatic transmission (2) connected to the drive source (1) and having a forward engaging element (3);
* a control unit (9) that performs a sailing stop control in which the drive source (1) of the vehicle is stopped, the forward engaging element (3) is released, and the vehicle travels under inertia upon a drive source stop condition being established while the vehicle is traveling; and
* an information-acquiring unit that acquires information on a status of a road on which the host vehicle will travel, wherein
* the control unit (9)
* predicts whether there is a section on a route where the sailing stop control can be executed based on the information;
* upon determining the section is present where the sailing stop control can be executed, estimates a power shortage amount, which is a shortage in an amount of power during the sailing stop control, based on the information; and
* charges a battery (5) with power required to cover the power shortage amount prior to starting the sailing stop control,
* wherein
* the power shortage amount is estimated as being larger when autonomous driving is performed in the section where the sailing stop control is executed as compared to when driver-enabled driving is performed in the section and
* the information includes route information, map information acquired by a navigation system, a travel history of the vehicle, and other information acquired through road-to-vehicle communication and/or vehicle-to-vehicle communication. | The method involves performing sailing stop control by stopping drive source of vehicle and releasing fastening element between drive source and drive wheel to travel by inertia when drive source stop condition is satisfied while vehicle is traveling. An insufficient power amount, which is amount of power lacking during execution of sailing stop control, is estimated based on acquired information on road condition when stop control is performed. A battery is charged with power necessary to cover the power amount before start of sailing stop control. An INDEPENDENT CLAIM is included for a vehicle control apparatus. Vehicle control method. The fuel-consumption improvement effect is enlarged by sailing stop control. The drawing shows a graphical view showing the vehicle control process. (Drawing includes non-English language text) |
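As a back-of-the-envelope version of the power-shortage estimate in the sailing-stop record above, the sketch below scales a base electrical load by the time spent in the section, adds energy per predicted steering or braking operation, inflates the total under autonomous driving, and converts the shortage into a pre-charge SOC target. The specific wattages, the per-operation energy, and the 1.3 autonomous-driving factor are assumptions for illustration only.

```python
def estimate_power_shortage_wh(section_length_m, avg_speed_mps, base_load_w,
                               steering_ops_per_km=0.0, braking_ops_per_km=0.0,
                               wh_per_operation=2.0, autonomous=False,
                               generation_w_during_sailing=0.0):
    """Rough estimate of the electrical energy the battery must cover while the
    engine is stopped in a sailing-stop section. The claims only require that the
    estimate grows with steering/braking activity and is larger under autonomous
    driving; the numbers here are illustrative."""
    duration_h = (section_length_m / avg_speed_mps) / 3600.0
    consumption_wh = base_load_w * duration_h
    km = section_length_m / 1000.0
    consumption_wh += (steering_ops_per_km + braking_ops_per_km) * km * wh_per_operation
    if autonomous:
        consumption_wh *= 1.3          # extra sensor/actuator load (assumed factor)
    generated_wh = generation_w_during_sailing * duration_h
    return max(0.0, consumption_wh - generated_wh)

def precharge_target_soc(current_soc, shortage_wh, battery_capacity_wh):
    """Charge the battery by the shortage before entering the section."""
    return min(1.0, current_soc + shortage_wh / battery_capacity_wh)

if __name__ == "__main__":
    shortage = estimate_power_shortage_wh(4000, 22.0, base_load_w=400,
                                          steering_ops_per_km=3, braking_ops_per_km=1,
                                          autonomous=True)
    print(round(shortage, 1), round(precharge_target_soc(0.62, shortage, 1200.0), 3))
```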
Please summarize the input | Vehicle-to-pedestrian communication systemsVehicle-to-pedestrian information systems that use directional sound transmission on autonomous vehicles are disclosed. A cloud computing system manages messages for transmission to pedestrians via autonomous vehicles having directional speakers. The cloud computing system identifies pedestrians and identifies messages for the pedestrians. Pedestrians may be known and authenticated to the cloud computing system or may be unknown. The cloud computing system maintains profiles for known pedestrians and transmits messages to vehicles based on the profiles. The cloud computing system keeps track of the location of vehicles and causes the vehicles to use directional speakers to transmit messages to the pedestrians based on the relative positions of the vehicles and the pedestrians.What is claimed is:
| 1. A vehicle-to-pedestrian information system comprising:
a cloud computing system configured to communicate with a vehicle configured for autonomous piloting, the vehicle including a directional speaker,
wherein the cloud computing system is configured to:
identify a message for a pedestrian based on a location of the pedestrian;
transmit the message to the vehicle; and
cause the vehicle to play the message for the pedestrian via the directional speaker.
| 2. The vehicle-to-pedestrian information system of claim 1, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein the cloud computing system is further configured to:
authenticate the pedestrian by communicating with a personal device associated with the pedestrian.
| 3. The vehicle-to-pedestrian information system of claim 2, wherein the cloud computing system is further configured to:
identify a user profile based on the authentication with the pedestrian,
wherein the message is identified based on the user profile, the profile storing data comprising user preferences for the pedestrian indicating message types and/or contents to be provided to the pedestrian.
| 4. The vehicle-to-pedestrian information system of claim 2, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein the cloud computing system is further configured to:
identify the location of the pedestrian based on location data reported by the personal device associated with the pedestrian.
| 5. The vehicle-to-pedestrian information system of claim 1, wherein:
the message comprises a first portion of a composite message, and
the cloud computing system is further configured to:
transmit a second portion of the composite message to a different second vehicle for playback to the pedestrian, wherein the cloud computing system selects the vehicle and the second vehicle from among multiple vehicles proximate to the pedestrian based on the different driving paths of the multiple vehicles.
| 6. The vehicle-to-pedestrian information system of claim 5, wherein:
the cloud computing system is configured to instruct the first vehicle to play the first portion of the composite message and the second vehicle to play the second portion of the composite message in a manner that minimizes Doppler shift observed by the pedestrian.
| 7. The vehicle-to-pedestrian information system of claim 1, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein:
the message comprises a safety message.
| 8. The vehicle-to-pedestrian information system of claim 1, wherein the vehicle is configured to:
display a first visual indicator communicating that the vehicle is operating autonomously when the vehicle is operating autonomously, the vehicle operating autonomously when the driver does not have control of steering of the vehicle; and
display a different second visual indicator communicating that the vehicle is operating non-autonomously when the vehicle is operating non-autonomously, the vehicle operating non-autonomously when the driver has control of steering of the vehicle, wherein the first and second visual indicators are positioned on the vehicle to be visible to the pedestrian.
| 9. An autonomous vehicle capable of communicating information to a pedestrian, the autonomous vehicle comprising:
a steering system and a speed control system;
a directional speaker; and
an on-board computer configured to:
autonomously control the steering system and the speed control system based on environmental conditions and navigation conditions;
receive a message for a pedestrian from a cloud computing system;
determine a location of the pedestrian; and
cause the directional speaker to play the message for the pedestrian based on the location of the pedestrian.
| 10. The autonomous vehicle of claim 9, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein:
the pedestrian is authenticated to the cloud computing system via a personal device associated with the pedestrian.
| 11. The autonomous vehicle of claim 10, wherein:
the message is based on a user profile that is associated with the authenticated pedestrian, the profile storing data comprising user preferences for the pedestrian indicating message types and/or contents to be provided to the pedestrian.
| 12. The autonomous vehicle of claim 10, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein determining the location comprises:
receiving the location from the cloud computing system, which previously received the location from the personal device associated with the pedestrian.
| 13. The autonomous vehicle of claim 9, wherein:
the message comprises a first portion of a composite message; and
the composite message also includes a second portion that is sent to a different autonomous vehicle for playback to the pedestrian, wherein the cloud computing system selects the vehicle and the second vehicle from among multiple vehicles proximate to the pedestrian based on the different driving paths of the multiple vehicles.
| 14. The autonomous vehicle of claim 9, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein:
the message comprises a safety message.
| 15. The autonomous vehicle of claim 9, further comprising:
a visual indicator display, wherein the on-board computer is configured to:
display a first visual indicator on the visual indicator display communicating that the vehicle is operating autonomously when the vehicle is operating autonomously, the vehicle operating autonomously when the driver does not have control of steering of the vehicle; and
display a different second visual indicator on the visual indicator display communicating that the vehicle is operating non-autonomously when the vehicle is operating non-autonomously, the vehicle operating non-autonomously when the driver has control of steering of the vehicle, wherein the first and second visual indicators are positioned on the vehicle to be visible to the pedestrian.
| 16. A method for facilitating vehicle-to-pedestrian communication, the method comprising:
identifying a message for a pedestrian based on a location of the pedestrian;
transmitting the message to an autonomous vehicle that includes a directional speaker; and
causing the vehicle to play the message for the pedestrian via the directional speaker.
| 17. The method of claim 16, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and further comprising:
authenticating the pedestrian by communicating with a personal device associated with the pedestrian.
| 18. The method of claim 17, further comprising:
identifying a user profile based on the authentication with the pedestrian, wherein the message is identified based on the user profile, the profile storing data comprising user preferences for the pedestrian indicating message types and/or contents to be provided to the pedestrian.
| 19. The method of claim 17, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and further comprising:
identifying the location of the pedestrian based on location data reported by the personal device associated with the pedestrian.
| 20. The method of claim 16, wherein:
the message comprises a first portion of a composite message, and the method further comprises:
transmitting a second portion of the composite message to a different second vehicle for playback to the pedestrian, wherein the cloud computing system selects the vehicle and the second vehicle from among multiple vehicles proximate to the pedestrian based on the different driving paths of the multiple vehicles. | The system (100) has a cloud computing system (106) which is configured to communicate with a vehicle (102) including a directional speaker, and for autonomous piloting. The cloud computing system is configured to identify a message for a pedestrian (204) based on a location of the pedestrian. The message is transmitted to the vehicle. The vehicle is caused to play the message for the pedestrian through the directional speaker. The user profile is identified based on the authentication with the pedestrian. An INDEPENDENT CLAIM is included for a method for facilitating vehicle-to-pedestrian communication. Vehicle-to-pedestrian information system used in autonomous vehicle (claimed) e.g. car. The car can make decisions regarding which messages to play through the directional speakers without control of the cloud based system. The directional speakers allow the vehicles to deliver sound messages to specific pedestrians and to avoid delivering the messages to other pedestrians in different locations. The cloud computing system can select different vehicles based on the proximity to the pedestrian, based on a desire to avoid a Doppler shift based on other sound quality considerations. The drawing shows a schematic view illustrating the interactions between the vehicles of the car-to-pedestrian system and pedestrians. 100Vehicle-to-pedestrian information system102Vehicle106Cloud computing system112Communication link204Pedestrian |
Please summarize the input | FUSION AND CALIBRATION OF SENSOR SIGNALS IN A MOVING VEHICLEA sensor processing device located within a vehicle that is being driven, the processing device communicating with plural sensors, each sensor generating signal data, the processing device including a transceiver transmitting data derived by the processing device from the sensor signal data, to one or more remote servers, and receiving sensor-related information from the one or more remote servers, a synchronizer evaluating latencies of the sensors, an error estimator estimating accuracies of the sensor signal data, a sensor validator determining if one or more of the sensors are failed, and a calibrator transforming the sensor signal data to a common vehicle reference system.CLAIMS
| 1. A sensor processing device located within a vehicle that is being driven, the processing device communicating with plural sensors, each sensor generating signal data, the processing device comprising:
a transceiver transmitting data derived by the processing device from the sensor signal data, to one or more servers, and receiving sensor-related information from the one or more servers;
a synchronizer evaluating latencies of the sensors;
an error estimator estimating accuracies of the sensor signal data;
a sensor validator determining if one or more of the sensors are failed; and
a calibrator transforming the sensor signal data to a common vehicle reference system.
| 2. The sensor processing device of claim 1 wherein the one or more servers comprise one or more devices within the vehicle or within the processing device.
| 3. The sensor processing device of claim 1 wherein the sensors are members of the group consisting of an accelerometer, a barometer, a beacon, a gyroscope, a magnetometer, a camera, Lidar, radar, ultrasonic radar, a microphone, a global positioning system, and on-board diagnostic sensors.
| 4. The sensor processing device of claim 1 wherein the processing device derives driving-related information from the transformed signal data, the driving-related information comprising vehicle position and orientation, autonomous vehicle feedback, driver feedback, or driver scores, and exposes the driving-related information to other devices.
| 5. The sensor processing device of claim 1 wherein the processing device exposes the synchronized and calibrated signal data through an application programming interface (API) or through a software development kit (SDK).
| 6. The sensor processing device of claim 1 wherein the processing device responds to a trigger event by logging sensor data on the one or more servers for analysis, wherein rules for trigger events are provided by the processing device or by the one or more servers.
| 7. The sensor processing device of claim 6 wherein the one or more servers conducts collision analysis or system failure analysis based on the logged sensor data.
| 8. A vehicle network data processor that receives time series data from transmitters in one or more vehicles that are being driven, the time series data based on plural sensors located in the one or more vehicles, the vehicle data processor deriving, from the received time series data, sensor-related information and driver-related information, the vehicle data processor comprising:
one or more cellular or Wi-Fi transceivers receiving time series data from the vehicles;
a synchronizer evaluating latencies of the time series data;
an error estimator for estimating accuracies of the time series data; and
one or more database managers storing the sensor-related information and the driver-related information derived by the processor in one or more databases,
wherein said one or more cellular or Wi-Fi transceivers transmit the sensor-related information in the databases to the vehicles.
| 9. The vehicle network data processor of claim 8 wherein the sensors are members of the group consisting of an accelerometer, a barometer, a gyroscope, a magnetometer, a camera, a microphone, a global positioning system, on-board diagnostic sensors, and a temperature sensor.
| 10. The vehicle network data processor of claim 8 wherein the driver-related information derived by said processor comprises vehicle-to-vehicle network information, advanced driver assistance system information, autonomous driving training information, map information and fleet driver scores.
| 11. The vehicle network data processor of claim 8 wherein the sensor-related information stored by said one or more database managers includes initial sensor calibration models obtained from at least some of the plural sensors, and wherein other sensors access the initial sensor calibration models from said one or more database managers for use as their calibration models.
| 12. The vehicle network data processor of claim 11 wherein the initial calibration models are obtained from local calibrations performed by the individual sensors and uploaded to said one or more database managers.
| 13. A non-transitory computer readable medium storing instructions, which, when executed by a processing device located in a vehicle that is being driven, cause the processing device to process signal data from sensors in the vehicle, comprising causing the processing device to:
receive signal data from the sensors;
receive sensor-related information from one or more remote servers;
evaluate latencies of the sensors so as to synchronize the sensor signal data;
estimate accuracies of the sensor signal data;
determine if one or more of the sensors are failed; and
transform the sensor signal data to a common vehicle reference system.
| 14. The computer readable medium of claim 13 wherein the processing device evaluates latencies of the sensor signal data based on domain matching, whereby the same physical quantity is derived from the sensor signal data in more than one way.
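Claim 14's "domain matching" can be realized by deriving the same physical quantity from two different sensor chains and reading their relative latency off the cross-correlation peak. The sketch below uses synthetic speed traces; the sample rate, the injected 7-sample delay, and the noise level are invented for illustration, not values implied by the claim.

```python
import numpy as np

fs = 50.0                                    # common sample rate, Hz (assumed)
t = np.arange(0, 20, 1 / fs)
accel = 0.5 * np.sin(0.4 * np.pi * t)        # longitudinal acceleration, m/s^2

gps_speed = np.cumsum(accel) / fs            # speed from one source (reference)
imu_speed = np.roll(gps_speed, 7)            # same quantity, circularly delayed 7 samples
imu_speed = imu_speed + np.random.default_rng(0).normal(0.0, 0.02, t.size)

a = gps_speed - gps_speed.mean()
b = imu_speed - imu_speed.mean()
xcorr = np.correlate(b, a, mode="full")
lag_samples = np.argmax(xcorr) - (len(a) - 1)   # positive: second series lags the first
print(f"estimated latency: {lag_samples / fs * 1000:.0f} ms")   # ~140 ms
```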
| 15. The computer readable medium of claim 13 wherein the processing device transforms the sensor signal data to the vehicle reference system by use of a rotation matrix that transforms an orthogonal set of device axes to an orthogonal set of vehicle axes, the device axes comprising two perpendicular axes in a plane of the device and a third axis normal to the plane, and the vehicle axes comprising a roof axis, a forward axis and a side axis.
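Claim 15's calibration step is a fixed rotation from the device axes to the vehicle's (forward, side, roof) axes. A minimal sketch follows; the Euler-angle construction and the example angles are assumptions, since the claim only requires that the matrix be a rotation between the two orthogonal frames.

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx          # orthonormal, so its inverse is its transpose

R = rotation_matrix(np.radians(30), np.radians(5), np.radians(-2))   # example angles
accel_device = np.array([0.1, 0.2, 9.8])        # accelerometer sample in device axes, m/s^2
accel_vehicle = R @ accel_device                # same sample in (forward, side, roof) axes
print(dict(zip(["forward", "side", "roof"], np.round(accel_vehicle, 3))))
```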
| 16. The computer readable medium of claim 13 wherein the sensor-related information comprises calibration data for the sensors in the vehicle.
| 17. A non-transitory computer readable medium storing instructions, which, when executed by a vehicle network data processor, cause the data processor to receive time series data from transmitters in one or more vehicles that are being driven, the time series data being based on plural sensors located in the one or more vehicles, and to derive sensor-related information and driver- related information from the received time series data, comprising causing the data processor to:
receive time series data from the vehicles;
evaluate latencies of the sensors;
estimate accuracies of the time series data;
store the sensor-related information and the driver-related information derived by the processor in one or more databases; and
transmit the sensor-related information in the databases to the vehicles.
| 18. A vehicle sensor system, comprising a plurality of computer processing units within a vehicle that is being driven, the vehicle including plural sensors that generate signal data, the plurality of computer processing units jointly comprising circuitry for sensor calibration and fusion, the circuitry comprising:
one or more local area data connections receiving signal data from the sensors;
one or more transceivers transmitting data derived from the sensor signal data to one or more remote servers, and receiving sensor-related information from the one or more remote servers;
a synchronizer evaluating latencies of the sensors;
an error estimator for estimating accuracies of the received sensor signal data;
a sensor validator for determining if one or more of the sensors are failed; and
a calibrator for transforming the received sensor signal data to a common vehicle reference system.
| 19. The vehicle sensor system of claim 18 wherein said computer processing units are members of the group consisting of smartphones, Internet of things (IoT) devices, wearable devices, and a vehicle system.
| 20. The vehicle sensor system of claim 18 wherein the sensors are members of the group consisting of an accelerometer, a barometer, a gyroscope, a magnetometer, a camera, a microphone, a global positioning system, on-board diagnostic sensors and temperature sensors.
| 21. The vehicle sensor system of claim 18 wherein the plurality of computer processing units jointly derive driving-related information, the driving-related information comprising autonomous vehicle feedback, driver feedback, one or more driver scores, or vehicle orientation and positioning information.
| 22. A method for a plurality of computer processing units within a vehicle that is being driven, the vehicle comprising plural sensors that generate signal data, to jointly perform sensor data processing, the method comprising dynamically allocating among the computer processing units the following real-time tasks, the allocation being based on currently available bandwidth and computing power of each computer processing unit:
evaluate latencies of the sensors so as to synchronize the sensor signal data;
estimate accuracies of the sensor signal data;
determine if one or more of the sensors are failed; and
transform the sensor signal data to a common vehicle reference system.
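One simple way to realize the dynamic allocation of claim 22 is a greedy assignment that gives each pending real-time task to the unit with the most spare capacity. The cost figures, the capacity numbers, and the scoring blend below are placeholders, not values implied by the claim.

```python
tasks = {                   # relative compute cost of each real-time task (placeholder)
    "synchronize": 2.0,
    "estimate_errors": 1.5,
    "validate_sensors": 0.5,
    "calibrate": 1.0,
}
units = {                   # current headroom per processing unit (placeholder)
    "smartphone": {"compute": 3.0, "bandwidth": 5.0},
    "vehicle_ecu": {"compute": 2.0, "bandwidth": 8.0},
    "iot_device": {"compute": 1.0, "bandwidth": 2.0},
}

def allocate(tasks, units):
    assignment = {}
    for name, cost in sorted(tasks.items(), key=lambda kv: -kv[1]):
        # Score each unit by spare compute plus a small bandwidth bonus;
        # bandwidth is used only for scoring in this sketch.
        best = max(units, key=lambda u: units[u]["compute"] + 0.1 * units[u]["bandwidth"])
        assignment[name] = best
        units[best]["compute"] = max(0.0, units[best]["compute"] - cost)
    return assignment

print(allocate(tasks, units))
```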
| 23. The method of claim 22 wherein said dynamically allocating comprises dynamically selecting one of the computer processing units to be a master over the other computer processing units.
| 24. The method of claim 22 wherein system data is shared among the processing units, and when one of the processing units is removed one or more others of the processing units perform the removed processing unit's allocated tasks.
| 25. The method of claim 22 wherein the task to evaluate latencies is performed by domain matching whereby the same physical quantity is derived from the sensor signal data in more than one way.
| 26. The method of claim 22 wherein the task to transform the sensor signal data to the vehicle reference system is performed using a rotation matrix that transforms an orthogonal set of sensor axes to an orthogonal set of vehicle axes, the sensor axes comprising two perpendicular axes in a plane of the sensor and a third axis normal to the plane, and the vehicle axes comprising a roof axis, a forward axis and a side axis. | The sensor processing device has a transceiver which transmits data derived by the processing device from the sensor signal data to one or more servers and receives sensor-related information from one or more servers. A synchronizer (152) evaluates latencies of the sensors. An error estimator (154) estimates accuracies of the sensor signal data. A sensor validator (156) determines if one or more of the sensors are failed. A calibrator (158) transforms the sensor signal data to a common vehicle reference system. The processing device derives driver-related information from the transformed signal data and exposes the driving-related information to other devices. The driving-related information comprises vehicle position and orientation, autonomous vehicle feedback, driver feedback, or driver scores. INDEPENDENT CLAIMS are included for the following:a vehicle network data processor;a non-transitory computer readable medium storing program for processing signal data from sensors in the vehicle;a vehicle sensor system; anda method for processing sensor data using several computer processing units within a vehicle. Sensor processing device used for processing data of sensor e.g. accelerometer, a barometer, a gyroscope, a magnetometer, a camera, a microphone, a global positioning system, on-board diagnostic sensors and temperature sensors of vehicle sensor system (all claimed) used in moving vehicle e.g. land vehicle such as a car or a motorcycle, a water vehicle such as a boat or a ship, or an air vehicle such as an airplane or a drone. Can also be used in time series of system measurements of inter alia fuel systems, emission systems, transmission systems, speed control systems and idle control systems of vehicle. The device provides proper unified gateway for data collection, and system to obtain, align, synchronize and calibrate data sources, and assess data validity and analyze it for an Artificial Intelligence-aware driving experience. The error, the timestamp error and the timestamp latency in the data are analyzed accurately. The new sensor immediately provides accurate calibrated results, avoiding the need to wait for the online calibration process to converge. The speed updates are propagated back to correct the gravity vector which improves accuracy and reduces error estimation. The vehicle data processor includes one or more cellular or wireless-fidelity (WiFi) transceivers which receive time series data from the vehicles. The drawing shows a simplified block diagram of a sensor processor for fusing and calibrating data received from a moving vehicle. 150Sensor processor152Synchronizer154Error estimator156Sensor validator158Calibrator |
Please summarize the input | Vehicle Localization and Identification by Map Merging in 5G and 6GAutonomous vehicles, and user-driven vehicles with an emergency intervention capability, can communicate to avoid collisions using 5G/6G technology, but this level of cooperation is possible only if the threatened vehicles have already determined the relative location and wireless address of the other vehicle. Disclosed is a method for wireless vehicles in traffic to exchange distance and angular information of the other vehicles in view, from which a position map can be prepared indicating the relative locations of each participating and non-participating vehicle. In addition, the traffic map can be annotated with the wireless addresses of the participating vehicles, thereby enabling them to communicate instantly in an emergency. The traffic map may be prepared or updated by one of the vehicles in traffic, or by a roadside access point. Satellite data is not necessary for the relative localization, but may be included if available.|1. A method for a first wireless device to determine locations of vehicles in traffic, the method comprising:
a) attempting and failing to determine a location of the first wireless device using a global navigation satellite system (GNSS);
b) then broadcasting a request message comprising a wireless address of the first wireless device, a time delay, and a request for each wireless entity within radio range to:
i) wait the time delay;
ii) then measure an angle and a distance of each vehicle in view of the wireless entity; and then
iii) transmit a reply message to the first wireless device, the reply message listing the angles and the distances, and further including a wireless address of the wireless entity; and
c) determining, according to the angles and distances, a location of at least one vehicle relative to the first wireless device.
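Step (c) of claim 1 reduces to converting each reported (angle, distance) pair into an offset from the first wireless device. Assuming, as in claim 16 below, that angles are bearings measured clockwise from geographic north, a minimal sketch is:

```python
import math

def relative_position(bearing_deg, distance_m):
    """(east, north) offset of a reported vehicle from the first wireless device."""
    b = math.radians(bearing_deg)
    return distance_m * math.sin(b), distance_m * math.cos(b)

# A vehicle reported 35 m away at a bearing of 60 degrees:
print(relative_position(60.0, 35.0))   # about (30.3 m east, 17.5 m north)
```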
| 2. The method of claim 1, wherein each request message is configured according to 5G or 6G technology.
| 3. The method of claim 1, wherein:
a) the first wireless entity is a vehicle or a roadside access point; and
b) each wireless entity is a vehicle or a wireless camera or a roadside access point.
| 4. The method of claim 1, further comprising:
a) after transmitting the request message, waiting the time delay; and
b) then measuring an angle and a distance of each vehicle in view of the first wireless device.
| 5. The method of claim 4, further comprising:
a) receiving, from one or more of the wireless entities, one or more reply messages, wherein each reply message indicates:
i) one or more angles and one or more distances of one or more vehicles in view of the wireless entity; and
ii) a wireless address of the wireless entity.
| 6. The method of claim 5, further comprising:
a) combining the angles and distances from the reply messages, with the angles and distances measured by the first wireless device; and
b) determining, according to the combining, a two-dimensional position of each vehicle that is viewed by at least one of the wireless entities or by the first wireless device.
| 7. The method of claim 5, further comprising:
a) determining, according to the angle and distance measurements, a two-dimensional position of each vehicle in traffic.
| 8. The method of claim 5, further comprising determining:
a) which angle and distance measurements correspond to a vehicle that is viewed by exactly one viewer; and
b) which angle and distance measurements correspond to a vehicle that is viewed by more than one viewer;
c) wherein a viewer comprises the first wireless device or one of the wireless entities.
| 9. The method of claim 8, further comprising:
a) performing a fitting analysis according to the angle and distance measurements, wherein the fitting analysis comprises determining a calculated position of each vehicle that is viewed by more than one viewer.
| 10. The method of claim 5, further comprising:
a) broadcasting a mapping message indicating two-dimensional position coordinates of each vehicle in traffic relative to the first wireless device; and
b) wherein the mapping message further indicates, for each wireless entity, which coordinates are associated with the wireless entity, and which wireless address is associated with the wireless entity.
| 11. Non-transitory computer-readable media in a wireless entity, the media containing instructions that when implemented in a computing environment cause a method to be performed, the method comprising:
a) receiving a request message from a first vehicle, the request message requesting the wireless entity to:
i) determine, at a particular time, an angle measurement and a distance measurement of each vehicle in view of the wireless entity; and
ii) transmit, to a particular wireless address of the first vehicle, a reply message comprising the angle and distance measurements;
b) at the particular time, measuring an angle and a distance of each vehicle in view of the wireless entity;
c) transmitting a reply message to the first vehicle, the reply message comprising the angles and distances measured by the wireless entity; and
d) receiving, from the first vehicle, a traffic map message comprising two-dimensional coordinates of vehicles in traffic.
| 12. The non-transitory computer-readable media of claim 11, wherein the traffic map message further comprises a wireless address of each vehicle that transmitted a reply message.
| 13. The non-transitory computer-readable media of claim 11, wherein the traffic map message further comprises one or more visible characteristics of each vehicle in traffic.
| 14. The non-transitory computer-readable media of claim 13, wherein the visible characteristics comprise a vehicle type encoded in a predetermined code comprising at most six bits.
| 15. The non-transitory computer-readable media of claim 11, wherein the request message, the reply messages, and the traffic map message are transmitted on a sidelink channel allocated for vehicle-to-vehicle or vehicle-to-anything communications.
| 16. The non-transitory computer-readable media of claim 11, the method further comprising:
a) determining, according to an electronic compass, a direction of geographical north; and
b) for each vehicle in view of the wireless entity, measuring an angle of a centroid of the vehicle in view, relative to geographical north.
| 17. The non-transitory computer-readable media of claim 11, the method further comprising:
a) using a radar or lidar or sonar distance-measuring sensor, measuring a distance between the wireless entity and a closest part of each vehicle in view of the wireless entity;
b) for each vehicle in view, calculating a centroid correction distance, wherein the centroid correction distance comprises an angle subtended by the vehicle in view times the distance to the closest part of the vehicle in view; and
c) adding the centroid correction distance to the measured distance between the wireless entity and the closest part of the vehicle in view.
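Read literally, claim 17 computes the correction as the subtended angle (in radians) times the range to the closest part, and adds it to that range. The sketch below follows that wording, with invented example numbers.

```python
import math

def centroid_corrected_range(closest_range_m, subtended_angle_deg):
    # Correction per the claim: subtended angle (radians) times the range
    # to the closest part, added to that range.
    correction = math.radians(subtended_angle_deg) * closest_range_m
    return closest_range_m + correction

# A vehicle whose nearest surface is 20 m away and which subtends 5 degrees:
print(f"{centroid_corrected_range(20.0, 5.0):.2f} m")   # 21.75 m
```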
| 18. A processor comprising an AI (artificial intelligence) model, the processor configured to:
a) take, as input, a plurality of sets of distances and angles; and
b) provide, as output, a traffic map;
c) wherein each set of distances and angles is measured by a participating vehicle, of a plurality of participating vehicles in traffic;
d) wherein each distance and angle corresponds to a measured vehicle viewed by one of the participating vehicles; and
e) wherein the traffic map comprises a list of current position coordinates of the measured vehicles in traffic.
| 19. The processor of claim 18, wherein the AI model is further configured to be implemented in a processor of one of the participating vehicles.
| 20. The processor of claim 18, wherein the AI model is further configured to:
a) for each measured vehicle, determine a difference between the current position coordinates and previously determined position coordinates;
b) for each measured vehicle, calculate a velocity according to the difference and a time difference between the current position coordinates and the previously determined position coordinates;
c) for each pair of measured vehicles, calculate a distance between the measured vehicles of the pair and a relative velocity between the measured vehicles of the pair; and
d) predict, according to the distance and the relative velocity, when an imminent collision is expected to occur. | The method involves attempting and failing to determine a location of a wireless device using a global navigation satellite system (GNSS ). The request message comprising a wireless address of the wireless device, a time delay, and a request for each wireless entity is broadcasted within radio range to measure an angle and a distance of each vehicle (201) in view of the entity and transmit a reply message to the wireless device, where the reply message includes the angles, the distances, and a wireless address of the wireless entity. The location of the vehicle is determined relative to the wireless device is determined according to the angles and distances. The request message is configured according to 5G or 6G technology. The first wireless entity is a vehicle, a wireless camera or a roadside access point. INDEPENDENT CLAIMS are included for: (1) non-transitory computer-readable media containing instructions for short-range locating and identification of vehicles; (2) a processor comprising an artificial intelligence model for short-range locating and identification of vehicles. Method for short-range locating and identification of vehicles such as autonomous and semi-autonomous vehicles relative to a first wireless device in traffic by map merging based on fifth-generation (5G ) or sixth-generation (6G ) technology to avoid collisions and facilitate flow of traffic. Uses include but are not limited to sedan, delivery van, pickup truck, sports car, motor cycle, semi-trailer, etc. The method enables determining the wireless address of each proximate vehicle in traffic, so that the vehicles can communicate for traffic management and collision avoidance. The vehicles may cooperate to manage the flow of traffic, avoid hazards, and minimize energy consumption. The drawing shows a top view of vehicles. 200Freeway 201, 202, 203Vehicles 204Truck |
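Claim 20 just above differences successive position fixes to get per-vehicle velocities, then checks each pair for an imminent collision. Here is a minimal sketch under those steps; the 3-metre alert radius, the 5-second horizon, and the example tracks are assumptions, not values from the claim.

```python
import numpy as np

def predict_collision(p_prev, p_now, q_prev, q_now, dt, horizon=5.0, radius=3.0):
    p_prev, p_now = np.asarray(p_prev, float), np.asarray(p_now, float)
    q_prev, q_now = np.asarray(q_prev, float), np.asarray(q_now, float)
    vp = (p_now - p_prev) / dt            # velocity of vehicle P, m/s
    vq = (q_now - q_prev) / dt            # velocity of vehicle Q, m/s
    rel_pos = q_now - p_now               # current separation vector, m
    rel_vel = vq - vp                     # relative velocity, m/s
    denom = float(rel_vel @ rel_vel)
    # Time (clamped to [0, horizon]) at which the straight-line tracks are closest.
    t_star = 0.0 if denom == 0.0 else min(max(0.0, -float(rel_pos @ rel_vel) / denom), horizon)
    closest = float(np.linalg.norm(rel_pos + rel_vel * t_star))
    return closest < radius, t_star

# Head-on example: 38 m apart, closing at 40 m/s -> collision flagged in about 0.95 s.
print(predict_collision((0, 0), (1, 0), (40, 0), (39, 0), dt=0.05))
```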
Please summarize the input | V2V and V2X Communications in 5G and 6G Based on Displayed MatrixDisclosed is a “connectivity matrix” that wireless entities (vehicles, fixed assets, etc.) can display indicating the 5G/6G wireless address of the entity. Other wireless devices can then image the connectivity matrix, determine the wireless address, and then communicate in sidelink, on frequencies allocated for ad-hoc networking. Alternatively, the two entities can communicate through a local base station, on managed channels, using the displayed wireless address. The matrix can provide additional information, such as the frequency, bandwidth, and modulation scheme favored by the entity. Alternatively, the matrix can provide a key code maintained by a central authority, so that a second wireless entity can read the code and request the associated wireless address (and frequency, bandwidth, etc.) from the central authority. By either method, the two wireless entities can then communicate explicitly thereafter.|1. Non-transitory computer-readable media containing instructions that, when executed by a computing environment, cause a method to be performed, the method comprising:
a) maintaining, in further non-transitory computer-readable media, a tabulation of entries, each entry comprising an index value and a wireless address of a vehicle or a fixed asset;
b) receiving a request message specifying a code, wherein the code is indicated by a matrix comprising black and white rectangular fields visibly displayed by a particular vehicle or fixed asset;
c) determining a particular index value according to the code;
d) selecting a particular entry of the tabulation according to the particular index value;
e) determining a particular wireless address comprising the particular entry; and
f) transmitting a reply message indicating the particular wireless address.
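Claim 1 describes an index-to-entry lookup keyed by the code read off a displayed matrix, with the address-only versus full-entry behavior of claims 3 and 4. A minimal in-memory sketch is below; the dictionary contents, field names, and example codes are invented.

```python
TABULATION = {              # invented example entries
    0x2A7: {"address": "00:1B:44:11:3A:B7", "freq_mhz": 5905.0,
            "bandwidth_mhz": 20.0, "mcs": 7},
    0x2A8: {"address": "00:1B:44:11:3A:C1"},
}

def handle_request(code: int, full_entry: bool = False):
    entry = TABULATION.get(code)
    if entry is None:
        return {"error": "unknown code"}
    # Return only the wireless address unless the full entry was requested.
    return dict(entry) if full_entry else {"address": entry["address"]}

print(handle_request(0x2A7))                     # address only
print(handle_request(0x2A7, full_entry=True))    # whole entry
```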
| 2. The non-transitory computer-readable media of claim 1, wherein the reply message is transmitted according to 5G or 6G technologies.
| 3. The non-transitory computer-readable media of claim 1, the method further comprising:
a) determining that the request message indicates that the particular entry should be transmitted in entirety; and
b) transmitting, in the reply message, the particular entry in entirety.
| 4. The non-transitory computer-readable media of claim 1, the method further comprising:
a) determining that the request message indicates that only the particular wireless address should be transmitted; and
b) transmitting, in the reply message, the particular wireless address without transmitting other information, if any, comprising the particular entry.
| 5. The non-transitory computer-readable media of claim 1, the method further comprising:
a) receiving, from the particular vehicle or fixed asset, a change message, the change message specifying the code and providing additional or changed information;
b) determining a particular index according to the code;
c) determining a particular entry according to the particular index; and
d) revising the particular entry according to the additional or changed information.
| 6. The non-transitory computer-readable media of claim 1, the method further comprising:
a) receiving a joining message from a new vehicle or fixed asset, wherein the new vehicle or fixed asset is not associated with any entry in the tabulation;
b) determining that the joining message specifies a wireless address of the new vehicle or fixed asset, and requests that a new entry be added to the tabulation;
c) generating the new entry in the tabulation, the new entry comprising the wireless address of the new vehicle or fixed asset;
d) determining a new index value, and associating the new entry with the new index value;
e) generating a new code according to the new index value; and
f) transmitting a welcome message to the new vehicle or fixed asset, the welcome message indicating the new code.
| 7. The non-transitory computer-readable media of claim 6, wherein:
a) the joining message further indicates a frequency and a bandwidth; and
b) the new entry further comprises the frequency and the bandwidth.
| 8. The non-transitory computer-readable media of claim 6, wherein:
a) the joining message further indicates an MCS (modulation and coding scheme); and
b) the new entry further comprises the MCS.
| 9. The non-transitory computer-readable media of claim 6, wherein:
a) the joining message further indicates one or more capabilities or limitations of the new vehicle or fixed asset; and
b) the new entry further comprises the one or more capabilities or limitations of the new vehicle or fixed asset.
| 10. A method for a first vehicle, in traffic comprising a second vehicle, the method comprising:
a) observing, using a camera or sensor in or on the first vehicle, a matrix displayed by the second vehicle, the matrix comprising a plurality of fields colored black or white according to a binary code;
b) determining the binary code of the matrix;
c) determining an entry in a tabulation, the entry associated with the code;
d) determining, according to the entry, a wireless address of the second vehicle; and
e) transmitting a wireless message, according to the wireless address, to the second vehicle.
| 11. The method of claim 10, wherein:
a) the code comprises a predetermined number of code bits;
b) the matrix comprises a plurality of border fields surrounding a plurality of data fields; and
c) the number of code bits equals a number of data fields.
| 12. The method of claim 10, further comprising:
a) determining, according to the entry, a frequency and a bandwidth; and
b) transmitting the wireless message according to the frequency and the bandwidth.
| 13. The method of claim 10, further comprising:
a) determining, according to the entry, an MCS (modulation and coding scheme); and
b) transmitting the wireless message according to the MCS.
| 14. The method of claim 10, further comprising:
a) determining, according to the matrix, whether the second vehicle is autonomously driven or human-driven.
| 15. The method of claim 10, further comprising:
a) displaying, on the first vehicle, a further matrix comprising a plurality of fields colored black or white according to a further binary code;
b) wherein the further binary code is associated with a further entry in the tabulation, and the further entry comprises a further wireless address of the first vehicle.
| 16. The method of claim 15, further comprising:
a) turning off the further matrix, by depowering illuminators in the further matrix, while the first vehicle is human-operated; and
b) turning on the further matrix, by repowering the illuminators in the further matrix, while the first vehicle is computer-operated.
| 17. A system comprising a blockchain comprising information about wireless addresses, wherein:
a) a first wireless entity comprises a camera, a processor, and a first matrix of black and white rectangular fields;
b) the black and white rectangular fields are configured to display a first code associated with the first wireless entity;
c) the camera is configured to image a second matrix displayed by a second wireless entity;
d) the processor is configured to determine, according to the second matrix, a second code associated with the second wireless entity; and
e) the processor is further configured to determine, according to the blockchain, a second wireless address associated with the second code.
| 18. The system of claim 17, wherein:
a) the first code comprises a first wireless address of the first wireless entity; and
b) the second code comprises a second wireless address, a frequency, and a bandwidth.
| 19. The system of claim 18, wherein the processor is further configured to determine, according to the blockchain, the second wireless address, the frequency, and the bandwidth associated with the second code.
| 20. The system of claim 19, wherein the processor is further configured to transmit a message to the second wireless entity, the second message transmitted according to the second wireless address, the frequency, and the bandwidth. | The non-transitory computer-readable medium comprises a set of instructions for maintaining a tabulation of entries in a non-transitory computer-readable media, where each entry comprises an index value and a wireless address of a vehicle or a fixed asset. A request message specifying a code is received, where the code is indicated by a matrix comprising black and white rectangular fields visibly displayed by a particular vehicle or fixed asset, and a particular index value is determined according to the code. A reply message indicating the particular wireless address is transmitted according to fifth generation or sixth generation technologies. A determination is made that the request message indicates that the particular entry is transmitted in entirety. The step of determining that the request message indicates that the particular entry should be transmitted in entirety. It is determined that the request message indicates that only the particular wireless address should be transmitted. INDEPENDENT CLAIMS are included for: (1) a method for a first vehicle; (2) a system comprising a blockchain comprising information about wireless addresses. Non-transitory computer-readable media for localizing, identifying, and communicating with vehicles in traffic and fixed assets. The non-transitory computer-readable medium ensures that the collision avoidance and traffic efficiency are improved. The drawing shows a schematic sketch of a wireless address tabulation according to the non-transitory computer-readable media for localizing, identifying, and communicating with vehicles in traffic and fixed assets.600Variable connectivity matrix 601Illuminator 602Diffuser 603Opaque separator 610Connectivity matrix 611Variable-transmissive window 612Illuminators 613Optional diffuser |
Please summarize the input | V2X and vehicle localization by local map exchange in 5G or 6GAutonomous vehicles may communicate with each other in 5G or 6G to avoid hazards, mitigate collisions, and facilitate the flow of traffic. However, for cooperative action, each vehicle must determine the wireless address of other vehicles in proximity, so that they can communicate directly with each other. It is not sufficient to know the wireless address alone; the wireless address must be associated with an actual vehicle in view. Methods disclosed herein enable vehicles to exchange messages that specify the distances and angles of other vehicles in view. Then, each vehicle compares the other vehicle's measurements with its own, along with each vehicle's wireless address. Using an AI-based map-merging algorithm, one or more vehicles can produce a full traffic map from the fragmentary local maps of each vehicle's viewpoint.The invention claimed is:
| 1. A method for a first vehicle to communicate with a second vehicle, the second vehicle proximate to a third vehicle, the method comprising:
a. measuring a first plurality of angles, the first plurality of angles comprising an angle of the second vehicle and an angle of the third vehicle, as viewed by the first vehicle;
b. transmitting a request message to the second vehicle, the request message requesting measurement data from the second vehicle;
c. receiving, from the second vehicle, a reply message comprising a second plurality of angles, the second plurality of angles comprising an angle of the first vehicle and an angle of the third vehicle, as viewed by the second vehicle; and
d. determining, according to the first plurality of angles and the second plurality of angles, a merged map, the merged map comprising a position of the first vehicle, a position of the second vehicle, and a position of the third vehicle.
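Claims 1 and 4 merge the two viewpoints; once the baseline between the first and second vehicles is known (for example from the measured distances of claim 4), a third vehicle seen by both can be placed by intersecting the two bearing rays. A small sketch, with invented coordinates and bearings measured clockwise from north:

```python
import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    # Each ray: p + t * u, with u the unit vector of the bearing; coordinates are (east, north) in metres.
    u1 = (math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg)))
    u2 = (math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg)))
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    denom = u1[0] * (-u2[1]) - u1[1] * (-u2[0])      # 2x2 determinant
    if abs(denom) < 1e-9:
        return None                                   # rays are parallel
    t1 = (dx * (-u2[1]) - dy * (-u2[0])) / denom
    return p1[0] + t1 * u1[0], p1[1] + t1 * u1[1]

# First vehicle at the origin sees the third at 45 deg; second vehicle 30 m to
# the east sees it at 315 deg. Intersection is 15 m east, 15 m north.
print(triangulate((0.0, 0.0), 45.0, (30.0, 0.0), 315.0))
```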
| 2. The method of claim 1, wherein the request message and the reply messages are transmitted according to 5G or 6G technology.
| 3. The method of claim 1, wherein the angles are measured relative to a direction of a road occupied by the first and second vehicles.
| 4. The method of claim 1, further comprising:
a. measuring a third plurality of distances, the third plurality of distances comprising a distance from the first vehicle to the second vehicle and a distance from the first vehicle to the third vehicle;
b. receiving, from the second vehicle, a fourth plurality of distances, the fourth plurality of distances comprising a distance from the second vehicle to the first vehicle and a distance from the second vehicle to the third vehicle; and
c. determining the merged map according to the first plurality of angles, the second plurality of angles, the third plurality of distances, and the fourth plurality of distances.
| 5. The method of claim 4, wherein the request message further indicates the first plurality of angles and the third plurality of distances, and the reply message further indicates the second plurality of angles and the fourth plurality of distances.
| 6. The method of claim 1, further comprising:
a. determining, according to the reply message, a color or a vehicle type, or both, of the second vehicle;
b. comparing the color or vehicle type, or both, of the second vehicle to each of the vehicles visible to the first vehicle; and
c. determining the merged map according to the color or vehicle type, or both, of the second vehicle.
| 7. The method of claim 1, wherein:
a. the request message further indicates a wireless address of the first vehicle and at least one of a GPS location, a vehicle type, a color, or a lane position of the first vehicle; and
b. the reply message further indicates a wireless address of the second vehicle and at least one of a GPS location, a vehicle type, a color, or a lane position of the second vehicle.
| 8. The method of claim 1, wherein the merged map further comprises a wireless address of the first vehicle and a wireless address of the second vehicle.
| 9. The method of claim 1, further comprising transmitting the merged map to the second vehicle and the third vehicle.
| 10. The method of claim 1, further comprising:
a. determining that a traffic collision with the second vehicle is imminent;
b. determining, according to the merged map, which wireless address corresponds to the second vehicle; and
c. transmitting, to the second vehicle, an emergency message.
| 11. The method of claim 1, wherein the merged map includes a fourth vehicle which is not visible to the first vehicle.
| 12. The method of claim 1, further comprising:
a. measuring data comprising angles and distances of vehicles in traffic, relative to the first vehicle, and angles and distances of further vehicles in the traffic, relative to the second vehicle;
b. providing the data to a computer containing an artificial intelligence model; and
c. determining, according to the artificial intelligence model, a merged map comprising predicted positions of the vehicles.
| 13. The method of claim 12, further comprising:
a. measuring further data comprising angles and distances of further vehicles in traffic;
b. receiving at least one message from at least one proximate vehicle, the at least one message comprising additional data comprising angles and distances of vehicles visible to the proximate vehicle or vehicles;
c. providing the further data and the additional data as input to the artificial intelligence model; and
d. determining, as output from the artificial intelligence model, the merged map.
| 14. Non-transitory computer-readable media in a second vehicle in traffic comprising a first vehicle and at least one other vehicle, the media containing instructions that when implemented by a computing environment cause a method to be performed, the method comprising:
a. receiving, from the first vehicle, a request for geometric traffic data;
b. determining one or more “visible” vehicles, the visible vehicles being visible to the second vehicle;
c. measuring, for each of the visible vehicles, an angle of the visible vehicle and a distance of the visible vehicle from the second vehicle;
d. transmitting, to the first vehicle, a message comprising the measured angles and the measured distances; and
e. receiving, from the first vehicle, a merged map comprising positions of the first vehicle, the second vehicle, and the at least one other vehicle.
| 15. The media of claim 14, the method further comprising:
a. determining, for each of the visible vehicles, a vehicle type or a vehicle color; and
b. transmitting, to the first vehicle, a message comprising the determined vehicle types or vehicle colors.
| 16. The media of claim 14, the method further comprising transmitting, to the first vehicle, a wireless address of the second vehicle.
| 17. The media of claim 16, wherein:
a. the merged map further indicates, in association with the position of the second vehicle, the wireless address of the second vehicle; and
b. the merged map further indicates, in association with the position of the first vehicle, a wireless address of the first vehicle. | The method involves measuring a first set of angles, where the first set of angles comprise an angle of a second vehicle (202) and an angle of a third vehicle (203) as viewed by a first vehicle (201). A request message is transmitted to the second vehicle, and the request message requests measurement data from the second vehicles. A reply message is received from the second vehicles, where the reply message comprises the second angles. A merged map is determined according to the first set of angles and a second set of angles, where the merged map comprises a position of the first vehicles, a position of the second vehicle, and a position of the third vehicle. The request message and the reply messages are transmitted according to fifth generation or sixth generation technology. The angles are measured relative to a direction of a road. The reply messages are transmitted according to fifth generation( 5G) or sixth generation (6G) technology. INDEPENDENT CLAIMS are included for:(1) a non-transitory computer-readable medium comprising a set of instructions for performing a method for short-range locating and identification of vehicles; and(2) a computer. Method for short-range locating and identification of vehicles i.e. autonomous and semi-autonomous vehicles, in traffic. Uses include but are not limited to a sedan, a delivery van, a pickup lorry, a sports car, a motorcycle and a semi-trailer. The method enables determining wireless address of each proximate vehicle in traffic such that the vehicles can be communicated for traffic management and collision avoidance. The drawing shows a schematic view of structure for vehicles to avoid collisions. 200Freeway 201,202,203Vehicles 204Truck |
Please summarize the input | Vehicle connectivity, V2X communication, and 5G/6G sidelink messagingCommunication between autonomous vehicles, in 5G or 6G, is necessary for cooperative hazard avoidance and to coordinate the flow of traffic. However, before cooperative action, each vehicle must determine the wireless address of other vehicles in proximity, so that they can communicate directly with each other. Methods and systems disclosed herein include a computer-readable wireless “connectivity matrix”, an array of black and white squares showing a connectivity code. The connectivity code may be the vehicle's wireless address, an index code, or other information about the vehicle. The connectivity code may be an index in a tabulation of information that provides the wireless address, among other data. Other vehicles, or their cameras, may read the connectivity matrix, determine the code therein, and find the vehicle's wireless address. After determining the wireless address of the other vehicles, the vehicles can then communicate and cooperate to avoid accidents and facilitate the flow of traffic.The invention claimed is:
| 1. A wireless entity comprising:
a) a matrix comprising a plurality of square or rectangular fields, the matrix displayed visually in or on the wireless entity, the fields arranged in a rectangular array, each field colored either black or white according to a binary code, the binary code comprising data related to a wireless address of the wireless entity;
b) wherein the binary code indicates an index of a particular entry of a tabulation, the tabulation comprising a plurality of entries, each entry related to a wireless vehicle or a wireless fixed asset, respectively, and each entry indicating the wireless address of the related wireless vehicle or wireless fixed asset.
| 2. The wireless entity of claim 1, wherein the wireless entity is configured to communicate according to 5G or 6G technology.
| 3. The wireless entity of claim 1, the matrix further comprising a border comprising further fields arranged peripherally around the matrix and colored black or white according to a predetermined pattern.
| 4. The wireless entity of claim 1, wherein the binary code further indicates the wireless address of the wireless entity.
| 5. The wireless entity of claim 1, wherein the wireless entity comprises a first vehicle, and the matrix is further configured to indicate, to a second vehicle, the wireless address of the first vehicle.
| 6. The wireless entity of claim 5, wherein the matrix is configured to be readable by a camera on the second vehicle from a predetermined distance, the predetermined distance in the range of 20 to 100 meters.
| 7. The wireless entity of claim 1, wherein each field has a predetermined size in the range of 5 to 20 millimeters.
| 8. The wireless entity of claim 1, wherein the wireless entity is a base station of a wireless network, and the code comprises a frequency of a broadcast channel of the base station.
| 9. The wireless entity of claim 1, wherein the entity is a fixed asset comprising a traffic signal or a highway sign or a roadside building and the code further indicates a wireless address of a receiver associated with the wireless entity.
| 10. The wireless entity of claim 1, wherein:
a) each field comprises an illuminator, respectively;
b) each illuminator is powered individually; and
c) the code is determined by which of the illuminators are powered and which of the illuminators are unpowered.
| 11. The wireless entity of claim 1, wherein:
a) each field comprises a filter, respectively, each filter having an individually controllable opacity; and
b) the code is determined by which of the filters are controlled to have a high opacity and which filters are controlled to have a low opacity.
| 12. The wireless entity of claim 1, wherein the matrix further comprises 48 fields colored black or white according to a MAC (medium access control) address, surrounded by a border comprising 32 fields colored black or white according to a predetermined pattern.
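The geometry of claim 12 is consistent with an 8 x 10 grid: the 32 border cells carry the predetermined pattern and the 6 x 8 interior holds the 48 address bits. The sketch below is one possible encoding; the alternating border, the row-major bit order, and the black/white polarity are assumptions, not specified by the claim.

```python
import numpy as np

def connectivity_matrix(mac: str) -> np.ndarray:
    bits = bin(int(mac.replace(":", ""), 16))[2:].zfill(48)   # 48 address bits
    grid = np.zeros((8, 10), dtype=int)
    # Alternating border pattern (1 = black, 0 = white) on the 32 outer cells.
    for r in range(8):
        for c in range(10):
            if r in (0, 7) or c in (0, 9):
                grid[r, c] = (r + c) % 2
    # Row-major fill of the 6 x 8 interior with the address bits.
    grid[1:7, 1:9] = np.array([int(b) for b in bits]).reshape(6, 8)
    return grid

print(connectivity_matrix("00:1B:44:11:3A:B7"))
```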
| 13. Non-transitory computer-readable media in a first vehicle, the media including instructions that when executed by a computing environment cause a method to be performed, the method comprising:
a) detecting, in or on a second vehicle, a connectivity matrix comprising a plurality of fields colored black or white according to a code; and
b) determining, from the code, a wireless address of the second vehicle;
c) wherein the determining of the wireless address comprises:
d) retrieving, from a tabulation of entries, a particular entry according to the code; and
e) determining, from the particular entry, the wireless address of the second vehicle.
| 14. The media of claim 13, the method further comprising transmitting, according to the wireless address, a message to the second vehicle.
| 15. The media of claim 13, wherein the code is configured to indicate whether the second vehicle is autonomous or semi-autonomous or human-driven.
| 16. A base station of a wireless network, the base station comprising:
a) a visibly displayed connectivity matrix comprising a plurality of fields arranged in a rectangular array, each field colored black or white according to a code, the code configured to indicate a particular entry, in a tabulation of entries, according to the code, the particular entry comprising a particular frequency; and
b) a transmitter configured to transmit system information messages on the particular frequency.
| 17. The base station of claim 16, wherein the system information messages indicate how user devices can become registered with the base station.
| 18. The base station of claim 16, further comprising a receiver configured to receive messages on a second frequency, different from the particular frequency, the second frequency indicated in the system information messages. | The wireless entity has a matrix including multiple square or rectangular fields, where the matrix is displayed visually in or on the wireless entity. The fields are arranged in a rectangular array, where each field is colored either black or white according to a binary code. The binary code comprises data related to a wireless address of the entity. The wireless entity is configured to communicate according to 5G or 6G technology. The matrix has a border having fields arranged peripherally around the matrix and colored black or white according to predetermined pattern, and indicates the wireless address. INDEPENDENT CLAIMS are included for: (1) non-transitory computer-readable media includes instructions for localizing, identifying, and communicating with vehicles in traffic and fixed assets; (2) a base station for a wireless network. Wireless entity for localizing, identifying, and communicating with autonomous or semi-autonomous or human-driven vehicles, in traffic and fixed assets for cooperative hazard avoidance and to coordinate the flow of traffic. The wireless entity enables the autonomous and semi-autonomous vehicles to communicate and cooperate to prevent or mitigate collisions, saving countless lives and manage the flow of traffic in an efficient manner, after determining the wireless address of the other vehicles. The drawing shows a schematic view of a computer-readable wireless identification matrix.100Connectivity matrix |
Please summarize the input | V2X with 5G/6G Image Exchange and AI-Based Viewpoint Fusion
Autonomous vehicles are required to communicate with each other in 5G or 6G, to avoid hazards, mitigate collisions, and facilitate the flow of traffic. However, for cooperative action, each vehicle must determine the wireless address and position of other vehicles in proximity, so that they can communicate directly with each other. It is not sufficient to know the wireless address alone; the wireless address must be associated with an actual vehicle in view. Methods disclosed herein enable vehicles to simultaneously acquire 360-degree images of other vehicles in traffic, and transmit those images wirelessly along with their wireless addresses. The various images are then “fused” by identifying objects that are viewed from at least two directions, and calculating their positions by triangulation. The resulting traffic map, or a listing of the vehicle positions, is then broadcast along with the wireless addresses of the vehicles. The vehicles can then determine which wireless address belongs to which of the vehicles in proximity, and can thereby cooperate with each other to avoid accidents and facilitate the flow of traffic.|1. A method for a first vehicle to communicate with a second vehicle, the second vehicle proximate to a third vehicle, the method comprising:
a. broadcasting a planning message specifying a particular time;
b. at the particular time, acquiring a first image depicting the second vehicle and the third vehicle;
c. receiving, from the second vehicle, an imaging message comprising a second image, the second image acquired by the second vehicle at the particular time, the second image depicting the first vehicle and the third vehicle; and
d. determining, according to the first image and the second image, a coordinate listing comprising a position of the first vehicle, a position of the second vehicle, and a position of the third vehicle.
| 2. The method of claim 1, wherein the planning message and the imaging message are transmitted according to 5G or 6G technology.
| 3. The method of claim 1, wherein the second image further includes an indication of a direction of travel of the second vehicle.
| 4. The method of claim 1, further comprising:
a. determining, from the imaging message, a wireless address of the second vehicle; and
b. adding, to the coordinate listing, the wireless address of the second vehicle and a wireless address of the first vehicle.
| 5. The method of claim 1, further comprising:
a. measuring a distance from the first vehicle to either the second vehicle or the third vehicle; and
b. determining the coordinate listing according to the distance.
| 6. The method of claim 1, further comprising:
a. providing, according to the coordinate listing, a traffic map comprising a two-dimensional image indicating the position of the first vehicle, the position of the second vehicle, and the position of the third vehicle; and
b. indicating, on the traffic map, a wireless address of the first vehicle.
| 7. The method of claim 1, wherein the imaging message further indicates at least one of a vehicle type, a color, or a lane position of the second vehicle.
| 8. The method of claim 1, wherein the coordinate listing further indicates at least one of a vehicle type, a color, or a lane position of the first vehicle.
| 9. The method of claim 1, further comprising broadcasting the coordinate listing.
| 10. The method of claim 1, further comprising:
a. determining that a traffic collision with the second vehicle is imminent;
b. determining, according to the coordinate listing, which wireless address corresponds to the second vehicle; and
c. transmitting, to the second vehicle, an emergency message.
| 11. The method of claim 1, wherein the coordinate listing includes a fourth vehicle which is not depicted in the first image.
| 12. The method of claim 1, further comprising:
a. acquiring a plurality of images of vehicles in traffic;
b. providing the plurality of images to a computer containing an artificial intelligence model; and
c. determining, according to the artificial intelligence model, a predicted coordinate listing comprising predicted positions of the vehicles.
| 13. The method of claim 12, further comprising:
a. acquiring a further image of further vehicles in traffic;
b. receiving at least one message from at least one proximate vehicle, the at least one message comprising an additional image of the vehicles in traffic;
c. providing the further image and the additional image as input to an algorithm based at least in part on the artificial intelligence model; and
d. determining, as output from the algorithm, an updated coordinate listing comprising predicted positions of the further vehicles.
| 14. Non-transitory computer-readable media in a second vehicle, the second vehicle in traffic, the traffic comprising a first vehicle and at least one other vehicle, the media containing instructions that when implemented by a computing environment cause a method to be performed, the method comprising:
a. receiving, from the first vehicle, a planning message specifying a time;
b. acquiring, at the specified time, an image comprising the first vehicle and the at least one other vehicle;
c. transmitting, to the first vehicle, an imaging message comprising the image; and
d. receiving, from the first vehicle, a coordinate listing or a traffic map comprising positions of the first vehicle, the second vehicle, and the at least one other vehicle.
| 15. The media of claim 14, the method further comprising:
a. determining, for each of the first, second, and third vehicles, a vehicle type or a vehicle color; and
b. transmitting, to the first vehicle, a message comprising the determined vehicle types or vehicle colors.
| 16. The media of claim 14, the method further comprising transmitting, to the first vehicle, a wireless address of the second vehicle.
| 17. The media of claim 16, wherein:
a. the coordinate listing or the traffic map further indicates, in association with the position of the second vehicle, the wireless address of the second vehicle; and
b. the coordinate listing or the traffic map further indicates, in association with the position of the first vehicle, a wireless address of the first vehicle.
| 18. A computer containing an artificial intelligence structure comprising:
a. one or more inputs, each input comprising an image of traffic, the traffic comprising a plurality of vehicles;
b. one or more internal functions, each internal function operably linked to one or more of the inputs; and
c. an output operably linked to the one or more of the internal functions, the output comprising a prediction of a two-dimensional position of each vehicle of the plurality.
| 19. The computer of claim 18, the artificial intelligence structure further comprising one or more adjustable variables associated with the one or more internal functions, the one or more adjustable variables adjusted by supervised learning according to a plurality of individually recorded inputs.
| 20. The computer of claim 18, further comprising an algorithm, based at least in part on the artificial intelligence structure, the algorithm configured to take, as input, one or more images of further vehicles in traffic, and to provide, as output, a two-dimensional position of each of the further vehicles. | The method involves broadcasting (301) a planning message specifying a particular time. A first image depicting a first vehicle and a second vehicle is acquired (302) at the particular time. An imaging message comprising a second image is received from the first vehicle, where the second image depicts the second vehicle and a third vehicle. A coordinate listing comprising a position of the first vehicle, a position of the second vehicle, and a position of the third vehicle is determined according to the first image and the second image. The planning message and the imaging message are transmitted according to fifth-generation (5G) or sixth-generation (6G) technology. The imaging message indicates one of a vehicle type, a color, or a lane position of the second vehicle. The coordinate listing indicates one of a vehicle type, a color, or a lane position of the first vehicle. INDEPENDENT CLAIMS are included for: (1) non-transitory computer-readable media for performing short-range locating and wireless address identification of vehicles; and (2) a computer containing an artificial intelligence structure for performing short-range locating and wireless address identification of vehicles. Method for performing short-range locating and wireless address identification of vehicles i.e. autonomous and semi-autonomous vehicles in traffic by a computing device over a 5G or 6G network. Uses include but are not limited to a personal computer, a laptop computer, a notebook computer, a net book computer, a handheld computer, a personal digital assistant, a mobile phone, a smart phone and a tablet computer. The method enables determining the locations and the wireless addresses of other proximate vehicles in the traffic in an efficient manner. The drawing shows a flow diagram of a procedure for determining a traffic map derived by viewpoint fusion.301Broadcasting a planning message specifying a particular time 302Acquiring a first image depicting a first vehicle and a second vehicle at the particular time 303Broadcasting an imaging message including wireless address after randomly-selected delay 304Receiving the imaging messages from the participating vehicles by the first vehicle 305Calculating locations of objects in a two-dimensional coordinate system by the first vehicle
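The "viewpoint fusion" in the claims above rests on locating an object that is seen from at least two viewpoints. A minimal sketch of that triangulation step is shown below; the observer positions and bearing angles are made-up example values, not anything specified by the patent, and a real implementation would derive bearings from the acquired images.

```python
# Illustrative triangulation: two known observer positions, each with a measured
# bearing to the same object, locate the object at the intersection of the rays.
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Intersect two rays p + t*(cos b, sin b); bearings in radians, positions in metres."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 via the 2x2 determinant (Cramer's rule).
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-9:
        raise ValueError("bearings are parallel; no unique intersection")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Two vehicles 20 m apart both sight a third vehicle.
print(triangulate((0.0, 0.0), math.radians(45), (20.0, 0.0), math.radians(135)))
# -> roughly (10.0, 10.0)
```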
Please summarize the input | System and method for vulnerable road user detection using wireless signals
A method for detecting vulnerable road users (VRUs) using wireless signals includes receiving, by a wireless receiver, wireless signals from mobile devices and determining received signal strength indication (RSSI) levels of the wireless signals. The wireless signals and the RSSI levels of the wireless signals received by the wireless receiver are analyzed so as to determine at least one location of the VRUs. A notification is issued to the vehicle or a driver of the vehicle based on the at least one determined location of the VRUs.
What is claimed is:
| 1. A method for detecting a vulnerable road user (VRU) using wireless signals, the method comprising:
receiving, by a wireless receiver, wireless signals from a mobile device at a plurality of time intervals and determining received signal strength indication (RSSI) levels of the wireless signals;
analyzing the wireless signals and the RSSI levels of the wireless signals received by the wireless receiver so as to determine a location of the VRU, wherein an estimation area for the VRU is determined at each of the time intervals and a calibrated estimation area comprising an overlap of the estimation areas is determined as a measurement of the location of the VRU; and
issuing a notification to a vehicle or a driver of the vehicle based on the determined location of the VRU.
| 2. The method according to claim 1, wherein the wireless receiver is disposed at a first static location, and wherein a second wireless receiver is disposed at a second static location, the first and second locations being known with respect to each other, and wherein wireless signals received by the second wireless receiver and associated RSSI levels are analyzed together with the wireless signals received at the first static location to determine the location of the VRU.
| 3. The method according to claim 1, wherein the wireless receiver is attached to or embedded in the vehicle.
| 4. The method according to claim 3, wherein a second wireless receiver is disposed at a static location, and wherein wireless signals received by the second wireless receiver and associated RSSI levels are analyzed together with the wireless signals received at the vehicle to determine the location of the VRU.
| 5. The method according to claim 3, wherein the plurality of time intervals are less than one second apart for determining the estimation areas and the location of the VRU from the calibrated estimation area.
| 6. The method according to claim 5, further comprising:
comparing a distance from the vehicle to the determined location of the VRU to an estimated stopping distance of the vehicle;
determining a behavior of the VRU based on further wireless signals that are received by the wireless receiver at later time intervals; and
determining whether the behavior of the VRU is expected at the determined location of the VRU,
wherein the notification to the vehicle or the driver includes a description of the behavior where it is determined that the behavior is not expected for the VRU at the determined location of the VRU.
| 7. The method according to claim 6, wherein the vehicle is an autonomous vehicle, the method further comprising issuing a control action for stopping the vehicle or diverting a path of the vehicle based on a determination that the behavior is not expected for the VRU at the determined location of the VRU.
| 8. The method according to claim 6, further comprising storing the behavior and the determined location of the VRU in a database, wherein the determining whether the behavior of the VRU is expected at the determined location of the VRU is performed by checking the database.
| 9. The method according to claim 3, wherein the time intervals are less than 0.5 seconds apart for determining the estimation areas and the location of the VRU from the calibrated estimation area.
| 10. The method according to claim 3, wherein each of the estimation areas has a circular area comprising an estimated location at the center and a radius representing an expected error range, the estimated locations being based on the RSSI levels received at the respective time intervals.
| 11. The method according to claim 1, further comprising receiving, by a wireless transceiver, wireless signals sent by the wireless transceiver and reflected back to the wireless transceiver from objects in the vicinity of the vehicle, wherein the wireless signals reflected back to the wireless transceiver are used to determine at least one location of at least one additional VRU which does not have a mobile device.
| 12. The method according to claim 1, further comprising identifying the mobile device from the wireless signals received by the wireless receiver and determining that the VRU carries at least one additional mobile device based on the wireless signals from the VRU being received by the wireless receiver indicating a single entity carrying the mobile devices.
| 13. The method according to claim 1, wherein the wireless receiver includes a plurality of antennas which change directions during the receiving of the wireless signals from the mobile device, the method further comprising using trilateration on the received wireless signals to determine the location of the VRU.
| 14. The method according to claim 1, further comprising the vehicle self-enforcing a dynamic speed limit which was changed in the vehicle based on the VRU detection and broadcasting the changed speed limit to other vehicles in the vicinity using vehicle-to-vehicle communications.
| 15. A system for detecting a vulnerable road user (VRU), the system being configured to communicate with a wireless receiver configured to receive wireless signals from mobile devices, the system comprising:
a processing server configured to analyze the wireless signals received at a plurality of time intervals from one of the mobile devices and received signal strength indication (RSSI) levels of the wireless signals received by the wireless receiver so as to determine a location of the VRU, wherein an estimation area for the VRU is determined at each of the time intervals and a calibrated estimation area comprising an overlap of the estimation areas is determined as a measurement of the location of the VRU; and
an alert system configured to issue a notification to a vehicle or a driver of the vehicle based on the determined location of the VRU.
| 16. The system according to claim 15, wherein the wireless receiver is attached to or embedded in the vehicle.
| 17. The system according to claim 16, wherein the processing server is configured to analyze wireless signals received by a second wireless receiver disposed at a static location and associated RSSI levels together with the wireless signals received at the vehicle to determine the location of the VRU. | The method involves receiving wireless signals from mobile devices by a first wireless receiver (14). Received signal strength indication (RSSI) levels of the wireless signals are determined. The wireless signals and the RSSI levels of the wireless signals received by the first wireless receiver are analyzed to determine a location of vulnerable road users (VRUs). A notification is issued to a vehicle (12) or a driver of the vehicle based on the determined location of the VRUs. The first wireless receiver is arranged at a first static location and a second wireless receiver is arranged at a second static location. The wireless signals received by the second wireless receiver and associated RSSI levels are analyzed together with the wireless signals received at the first static location to determine the location of the VRUs. An INDEPENDENT CLAIM is also included for a system for detecting VRUs. Method for detecting VRUs e.g. cyclists or pedestrians, around an autonomous or non-autonomous vehicle i.e. car, based on wireless signals. Can also be used for bus, lorry, motorbike and bicycle. The method enables reducing computational costs and constraints for associated hardware, facilitating faster and/or more reliable detection of VRUs and choosing a short time interval between measurements to allow for quick and accurate location prediction by leveraging the high speed of the vehicle. The drawing shows a schematic view of a system for detecting VRUs based on wireless signals. 12Vehicle14Wireless receiver15Transceiver16Processing server22Cloud server
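Claims 1 and 10 above describe circular estimation areas (an estimated position plus an error radius) produced at successive time intervals, with their overlap taken as the calibrated measurement of the VRU location. The sketch below illustrates that idea; the path-loss constants, positions, and the centroid-style approximation of the "overlap" are assumptions for illustration only, not the patent's method.

```python
# RSSI -> distance via a log-distance path-loss model, then a "calibrated" area
# approximated from several per-interval estimation areas.
import math
from dataclasses import dataclass

@dataclass
class EstimationArea:
    x: float       # estimated position (m)
    y: float
    radius: float  # expected error range (m)

def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -40.0, n: float = 2.0) -> float:
    """Distance in metres from a single RSSI reading (assumed model constants)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))

def calibrated_area(areas: list) -> EstimationArea:
    """Approximate the overlap of several areas by a tighter circle around their
    radius-weighted centroid (smaller radii, i.e. better fixes, weigh more)."""
    w = [1.0 / a.radius for a in areas]
    cx = sum(a.x * wi for a, wi in zip(areas, w)) / sum(w)
    cy = sum(a.y * wi for a, wi in zip(areas, w)) / sum(w)
    return EstimationArea(cx, cy, min(a.radius for a in areas))

print(rssi_to_distance(-60.0))  # ~10 m for the assumed constants
areas = [EstimationArea(2.0, 5.0, 3.0), EstimationArea(2.5, 4.0, 2.0)]
print(calibrated_area(areas))
```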
Please summarize the input | Global navigation satellite system, navigation terminal, navigation method and program
In a satellite navigation system, a navigation terminal continuously receives navigation signals from navigation satellites and continuously implements navigation calculations, thereby obtaining navigation calculation results, and executes in parallel: using clock offset values determined through the navigation calculations, calculates, in real time, the change in the difference between time differences, the difference between time differences being the difference between a clock offset value and the immediately preceding clock offset value, and the standard deviation value, which is the value of the standard deviation of the fluctuation amounts of the clock offset values; determines, in real time, two navigation precision indices of the calculated navigation calculation results on the basis of the calculated change in the difference between time differences and the standard deviation value; associates, in real time, the determined two navigation precision indices with the calculated navigation calculation results; and outputs, in real time, the navigation calculation results associated with the at least two navigation precision indices.
What is claimed is:
| 1. A positioning terminal, comprising a Global Navigation Satellite System (GNSS) receiver,
wherein the GNSS receiver is configured to execute processing in parallel while continuously acquiring respective navigation signals from navigation satellites, each navigation satellite configured to broadcast a navigation signal for GNSS and continuously performing positioning computation in real time to obtain a positioning computation result, the processing including:
i) calculating a standard deviation value of a clock offset value exhibited at a current epoch based on each clock offset value obtained by the positioning computation of each epoch, which is a value of a jitter amount of the each clock offset value;
ii) calculating a change amount of a most recent difference between time differences based on each value of difference between time differences of each epoch, which is the each value of difference between time differences that is a difference between each current clock offset value and a clock offset value immediately preceding the current clock offset value;
determining two positioning accuracy indices of the current epoch based on respective values of the standard deviation value of the clock offset value exhibited at the current epoch and the change amount of the most recent difference between the time differences; and
associating the two positioning accuracy indices with the positioning computation result of the current epoch.
| 2. The positioning terminal according to claim 1, wherein the GNSS receiver is configured to:
calculate, from clock offset values exhibited at respective epochs, the standard deviation value of difference between time differences at the current epoch and a predetermined number of past epochs;
calculate, from the clock offset values exhibited at the respective epochs, an average value of difference between time differences at a predetermined number of past epochs that do not include the clock offset value exhibited at the current epoch; and
associate, as the two positioning accuracy indices, i) the calculated standard deviation value and ii) a value of the difference between time differences at the current epoch and the average as the change amount of the positioning computation result at the current epoch in real time.
| 3. The positioning terminal according to claim 1,
wherein the GNSS receiver positioning module includes at least:
a broadcast wave signal processor configured to continuously acquire the respective navigation signals from the navigation satellites each configured to broadcast the navigation signal for GNSS; and
a processor configured to associate the determined two positioning accuracy indices with the positioning computation result.
| 4. The positioning terminal according to claim 1, wherein the GNSS receiver is configured to:
calculate, in a process of performing the positioning computation at each epoch, a value of a magnitude of jitter of the clock offset value exhibited at the current epoch with respect to clock offset values exhibited at respective epochs;
further determine, based on the calculated value of the magnitude of the jitter of the clock offset value exhibited at the current epoch, iii) a positioning accuracy index of the positioning computation result at the current epoch; and
associate the positioning accuracy index with the positioning computation result at the current epoch.
| 5. The positioning terminal according to claim 1, wherein the GNSS receiver is configured to calculate the standard deviation value and the change amount by excluding a clock offset value that fails to fall within a threshold value range of a jitter change amount from clock offset values exhibited at a predetermined number of last epochs when the two positioning accuracy indices for the current epoch is determined through calculation.
| 6. The positioning terminal according to claim 1, wherein the GNSS receiver is configured to process a unit of the two positioning accuracy indices, which are to be assigned to the positioning computation result, into a unit of a distance through use of the speed of light.
| 7. The positioning terminal according to claim 1, wherein the GNSS receiver is configured to perform the positioning computation based on a Precise Point Positioning scheme, and assign the two positioning accuracy indices to the positioning computation result based on the Precise Point Positioning scheme.
| 8. The positioning terminal according to claim 1,
wherein the positioning terminal is mounted to a vehicle including a communication unit, and
wherein the positioning terminal is configured to use the communication unit to notify a communication counterpart device of, together with positional information on the positioning terminal being the positioning computation result, at least one of the two positioning accuracy indices each indicating a degree of reliability of the positional information.
| 9. The positioning terminal according to claim 1,
wherein the positioning terminal is mounted to a vehicle including a communication unit, and
wherein the positioning terminal is configured to use the communication unit to receive, together with positional information on a communication counterpart device being the positioning computation result obtained by the communication counterpart device, at least one of the two positioning accuracy indices each indicating a degree of reliability of the positional information from the communication counterpart device, and use the at least one of the two positioning accuracy indices as a discrimination index for automatic driving.
| 10. The positioning terminal according to claim 1,
wherein the positioning terminal is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, and
wherein the positioning terminal is configured to notify another vehicle of, together with positional information on an own vehicle being the positioning computation result, the two positioning accuracy indices each being an index indicating a degree of reliability of the positional information via the vehicle-to-vehicle communication.
| 11. The positioning terminal according to claim 1,
wherein the positioning terminal is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, and
wherein the positioning terminal is configured to receive, together with positional information on another vehicle being the positioning computation result of another vehicle, at least one of the two positioning accuracy indices each indicating a degree of reliability of the positional information from the another vehicle via the vehicle-to-vehicle communication, and use the at least one of the two positioning accuracy indices as a discrimination index for automatic driving of an own vehicle.
| 12. The positioning terminal according to claim 1, wherein the GNSS receiver is configured to:
determine, in a process of performing the positioning computation at each epoch, whether to advance to processing for deriving a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor based on one or both of a magnitude of jitter in continuity of the clock offset value and a stability thereof;
perform, when a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor is derived, re-positioning-computation based on a navigation signal group that has been transmitted from the navigation satellites excluding the one or plurality of navigation satellites being a jitter factor; and
perform processing in parallel while obtaining the positioning computation result of the re-positioning-computation, the processing including:
calculating, from the navigation signal group for the re-positioning-computation, the standard deviation value;
determining two positioning accuracy indices of the positioning computation result of re-positioning-computation based on the respective values of the standard deviation value and the change amount; and
associating the determined two positioning accuracy indices with the positioning computation result of the re-positioning-computation.
| 13. A positioning method, performed by a positioning terminal of a global navigation satellite system,
the positioning method comprising executing, by the positioning terminal, processing in parallel while continuously acquiring respective navigation signals from navigation satellites, each navigation satellite configured to broadcast a navigation signal for GNSS and continuously performing positioning computation in real time to obtain a positioning computation result, the processing including:
i) calculating a standard deviation value of a clock offset value exhibited at a current epoch based on each clock offset value obtained by the positioning computation of each epoch, which is a value of a jitter amount of the each clock offset value;
ii) calculating a change amount of a most recent difference between time differences based on each value of difference between time differences of each epoch, which is the each value of difference between time differences that is a difference between each current clock offset value and a clock offset value immediately preceding the current clock offset value;
determining two positioning accuracy indices of the current epoch based on respective values of the standard deviation value of the clock offset value exhibited at the current epoch and the change amount of the most recent difference between the time differences;
associating the two positioning accuracy indices with the positioning computation result of the current epoch; and
outputting in real time the positioning computation result associated with at least the two positioning accuracy indices.
| 14. The positioning method according to claim 13, further comprising:
determining, by the positioning terminal, in a process of performing the positioning computation at each epoch, whether to advance to processing for deriving a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor based on one or both of a magnitude of jitter in continuity of the clock offset value and a stability thereof;
performing, by the positioning terminal, when a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor is derived, re-positioning-computation based on a navigation signal group that has been transmitted from the navigation satellites excluding the one or plurality of navigation satellites being a jitter factor; and
performing, by the positioning terminal, processing in parallel while obtaining the positioning computation result of the re-positioning-computation, the processing including:
calculating, from the navigation signal group for the re-positioning-computation, the standard deviation value;
determining two positioning accuracy indices of the positioning computation result of re-positioning-computation based on the respective values of the standard deviation value and the change amount; and
associating the determined two positioning accuracy indices with the positioning computation result of the re-positioning-computation.
| 15. The positioning method according to claim 13,
wherein the positioning terminal is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, and
wherein the positioning method further comprises notifying, by the positioning terminal, another vehicle of, together with positional information on an own vehicle being the positioning computation result, the two positioning accuracy indices each being an index indicating a degree of reliability of the positional information.
| 16. The positioning method according to claim 13,
wherein the positioning terminal is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, and
wherein the positioning method further comprises receiving, by the positioning terminal, together with positional information on another vehicle being the positioning computation result of another vehicle, at least one of the two positioning accuracy indices each indicating a degree of reliability of the positional information from the another vehicle via the vehicle-to-vehicle communication, and using the at least one of the two positioning accuracy indices as a discrimination index for automatic driving of an own vehicle.
| 17. A non-transitory computer-readable recording medium having a program recorded thereon, the program for positioning, for causing a processor of a positioning terminal to be operated to execute processing in parallel while continuously acquiring respective navigation signals from navigation satellites, each navigation satellite configured to broadcast a navigation signal for GNSS and continuously performing positioning computation in real time to obtain a positioning computation result, the processing including:
i) calculating a standard deviation value of a clock offset value exhibited at a current epoch based on each clock offset value obtained by the positioning computation of each epoch, which is a value of a jitter amount of the each clock offset value;
ii) calculating a change amount of a most recent difference between time differences based on each value of difference between time differences of each epoch, which is the each value of difference between time differences that is a difference between each current clock offset value and a clock offset value immediately preceding the current clock offset value;
determining two positioning accuracy indices of the current epoch based on respective values of the standard deviation value of the clock offset value exhibited at the current epoch and the change amount of the most recent difference between the time differences;
associating the two positioning accuracy indices with the positioning computation result of the current epoch; and
outputting in real time the positioning computation result associated with at least the two positioning accuracy indices.
| 18. The non-transitory computer-readable recording medium according to claim 17, wherein the program is configured to cause the processor of the positioning terminal to be operated to:
determine, in a process of performing the positioning computation at each epoch, whether to advance to processing for deriving a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor based on one or both of a magnitude of jitter in continuity of the clock offset value and a stability thereof;
perform, when a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor is derived, re-positioning-computation based on a navigation signal group that has been transmitted from the navigation satellites excluding the one or plurality of navigation satellites being a jitter factor; and
perform processing in parallel while obtaining the positioning computation result of the re-positioning-computation, the processing including:
calculating, from the navigation signal group for the re-positioning-computation, the standard deviation value;
determining two positioning accuracy indices of the positioning computation result of re-positioning-computation based on the respective values of the standard deviation value and the change amount; and
associating the determined two positioning accuracy indices with the positioning computation result of the re-positioning-computation.
| 19. The non-transitory computer-readable recording medium according to claim 17, wherein the program is configured to cause the positioning terminal, which is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, to be operated to notify another vehicle of, together with positional information on an own vehicle being the positioning computation result, the two positioning accuracy indices each being an index indicating a degree of reliability of the positional information via the vehicle-to-vehicle communication.
| 20. The non-transitory computer-readable recording medium according to claim 17, wherein the program is configured to cause the positioning terminal, which is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, to be operated to receive, together with positional information on another vehicle being the positioning computation result of another vehicle, at least one of the two positioning accuracy indices each indicating a degree of reliability of the positional information from the another vehicle via the vehicle-to-vehicle communication, and use the at least one of the two positioning accuracy indices as a discrimination index for automatic driving of an own vehicle. | The system determines two positioning precision parameters for the calculated positioning calculation result in real time, based respectively on the calculated standard deviation value and the change in the difference between time differences. The two determined positioning precision parameters are associated with the calculated positioning calculation result in real time, and the positioning calculation result associated with the positioning precision parameters is output in real time. INDEPENDENT CLAIMS are included for the following: positioning terminal; positioning method; and positioning program. Satellite positioning system for vehicle, mobile telephone, global positioning system (GPS) apparatus, ship, farming machine, mining machinery and drone. The satellite positioning system can associate, in real time, an indication of the reliability of the positioning calculation result with the positioning calculation result obtained from the navigation signals. The drawing shows a block diagram of the satellite positioning system. (Drawing includes non-English language text) 10Positioning terminal20Navigation satellite
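Claims 1, 2 and 6 above define the two accuracy indices in terms of epoch-by-epoch clock offsets: the standard deviation of recent "differences between time differences" (clock-offset jitter), and how far the newest difference departs from the average of the preceding ones, both convertible to a distance via the speed of light. The sketch below illustrates that bookkeeping; the window length and sample values are illustrative assumptions.

```python
# Two per-epoch accuracy indices from a history of clock offsets (seconds).
import statistics

C = 299_792_458.0  # speed of light, m/s

def accuracy_indices(clock_offsets_s, window=10):
    """clock_offsets_s: clock offset per epoch, oldest first; returns indices in metres."""
    recent = clock_offsets_s[-(window + 1):]
    diffs = [b - a for a, b in zip(recent, recent[1:])]   # differences between time differences
    sigma = statistics.pstdev(diffs)                      # index 1: jitter of recent differences
    change = diffs[-1] - statistics.mean(diffs[:-1])      # index 2: newest diff vs. past average
    return sigma * C, change * C                          # expressed as distances (claim 6)

offsets = [1.00e-6, 1.02e-6, 1.01e-6, 1.03e-6, 1.02e-6, 1.04e-6, 1.03e-6, 1.20e-6]
print(accuracy_indices(offsets, window=6))
```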
Please summarize the input | Decision-making method of lane change for self-driving vehicles using reinforcement learning in a motorway environment, recording medium thereof
The present invention relates to a method for determining a lane change of an autonomous vehicle using reinforcement learning in an automobile-only road environment. The method includes selecting an important vehicle, which is the nearby vehicle that has the greatest influence on a lane change determination of the autonomous vehicle; calculating a lane change probability of the important vehicle; adding the lane change probability to vehicle information; performing pre-processing necessary for reinforcement learning, through a pre-processing network, on the information obtained by adding the lane change probability to the vehicle information; and performing reinforcement learning by adding autonomous vehicle information to the information pre-processed in the pre-processing network and outputting a lane change determination result. According to the present invention, there is an effect of ensuring real-time performance and flexibly coping with motion changes of other vehicles by using the lane change probability of the important vehicle.|1. In a lane change determination method of an autonomous vehicle using reinforcement learning in an automobile-only road environment, the method comprising: receiving vehicle information of surrounding vehicles through V2X communication (Vehicle to Everything communication); using the received vehicle information to select an important vehicle, which is a neighboring vehicle that has the greatest influence in determining a lane change of the autonomous vehicle;
calculating a lane change probability of the important vehicle; adding the lane change probability to vehicle information; performing preprocessing necessary for reinforcement learning through a preprocessing network on the information obtained by adding the lane change probability to the vehicle information; performing reinforcement learning by adding autonomous vehicle information to the information preprocessed in the preprocessing network, and outputting a lane change determination result; and performing a safety check on the determination result of the reinforcement learning and outputting a determination result confirmed to be safe, wherein the preprocessing network and the reinforcement learning are composed of fully connected layers, wherein the vehicle information received through V2X communication (Vehicle to Everything communication) comprises, for the i-th surrounding vehicle (where i = 1, 2, ..., n-1, n), the longitudinal relative distance between the ego vehicle and the i-th surrounding vehicle, the relative speed, the relative acceleration, and the relative lane, the relative lane indicating whether the left lane, the current lane, and the right lane exist, respectively, as defined in Equation 1 and Equation 2, where R denotes the real number domain and N denotes the integer domain, and wherein, in the calculating of the lane change probability of the important vehicle, the lane change probability is calculated by inputting to an LSTM network feature information including the time t (initially 0), the longitudinal position of the vehicle, the vehicle speed, the vehicle acceleration, the vehicle heading angle, the lateral error derivative, the effective distances to both sides of the lane, the relative distance, relative speed, and relative acceleration to the front vehicle, the relative distance, relative speed, and relative acceleration to the rear vehicle, the relative distance, relative speed, and relative acceleration to the front and rear vehicles in the left lane, and the relative distance, relative speed, and relative acceleration to the front and rear vehicles in the right lane.
| 2. delete
| 3. delete
| 4. A computer-readable recording medium in which a program capable of executing the method of claim 1 by a computer is recorded. | The method involves receiving (S101) vehicle information of surrounding vehicles through vehicle to everything (V2X) communication. The received vehicle information is used (S103) to select an important vehicle, which is a neighboring vehicle that has the greatest influence in determining a lane change of the autonomous vehicle. A lane change probability of the important vehicle is calculated (S105). The lane change probability is added (S107) to vehicle information. The pre-processing necessary for reinforcement learning is performed (S109) through a pre-processing network on the information obtained by adding the lane change probability to the vehicle information. The reinforcement learning is performed (S111) by adding information about the autonomous driving vehicle to the information preprocessed in the preprocessing network, and a lane change determination result is outputted. An INDEPENDENT CLAIM is included for a computer-readable recording medium storing a program for determining lane change of autonomous vehicle. Method for determining lane change of autonomous vehicle using reinforcement learning in vehicle-only road environment. The method enables determining the lane change of the autonomous vehicle using reinforcement learning in an automobile-only road environment so as to ensure real-time performance and respond flexibly to movement changes of other vehicles. The method enables exhibiting better performance even in a road environment in which lane changes are performed, by adding direct characteristic information on lane changes. The drawing shows a flowchart of a method for determining a lane change of an autonomous vehicle using reinforcement learning in a vehicle-only road environment. (Drawing includes non-English language text) S101Step for receiving vehicle information of surrounding vehiclesS103Step for using received vehicle information to select important vehicleS105Step for calculating lane change probability of important vehicleS107Step for adding lane change probability to vehicle informationS109Step for performing pre-processing necessary for reinforcement learningS111Step for performing reinforcement learning by adding information about autonomous driving vehicle to information preprocessed in preprocessing network
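Claim 1 above assembles a state from per-vehicle relative distance, speed, acceleration and lane, appends the important vehicle's LSTM-estimated lane-change probability, and passes the result through a fully connected preprocessing layer before the reinforcement-learning policy. The sketch below illustrates that assembly; the closest-vehicle selection heuristic, feature names, and layer sizes are assumptions, not the patent's specification.

```python
# State assembly and a stand-in fully connected preprocessing step.
import numpy as np

def select_important_vehicle(vehicles):
    """Pick the neighbour assumed to matter most: here, simply the closest one."""
    return min(vehicles, key=lambda v: abs(v["rel_distance"]))

def build_state(vehicles, lane_change_prob, ego):
    feats = []
    for v in vehicles:
        feats += [v["rel_distance"], v["rel_speed"], v["rel_accel"], v["rel_lane"]]
    feats.append(lane_change_prob)                 # probability added to vehicle info
    feats += [ego["speed"], ego["accel"]]          # ego info appended before the RL policy
    return np.asarray(feats, dtype=np.float32)

def preprocess(state, w, b):
    """One fully connected layer with ReLU, standing in for the preprocessing network."""
    return np.maximum(0.0, w @ state + b)

vehicles = [{"rel_distance": 12.0, "rel_speed": -1.5, "rel_accel": 0.2, "rel_lane": 1},
            {"rel_distance": -8.0, "rel_speed": 2.0, "rel_accel": -0.1, "rel_lane": -1}]
important = select_important_vehicle(vehicles)
state = build_state(vehicles, lane_change_prob=0.7, ego={"speed": 27.0, "accel": 0.0})
rng = np.random.default_rng(0)
hidden = preprocess(state, rng.normal(size=(16, state.size)), np.zeros(16))
print(important, hidden.shape)
```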
Please summarize the input | Audio-visual and cooperative recognition of vehicles
PROBLEM TO BE SOLVED: To provide a cooperative audio-visual inference solution for accurately recognizing emergency vehicles in diverse geographic locations.
SOLUTION: A vehicle recognition system 100A includes a sound analysis circuit 110 to analyze captured sounds using an audio machine learning technique to identify a sound event. The system includes an image analysis circuit 107 to analyze captured images using an image machine learning technique to identify an image event, and a vehicle identification circuit 105 to identify a type of vehicle based on the image event and the sound event. The vehicle identification circuit 105 may further use V2V or V2I alerts to identify the type of vehicle and communicate a V2X or V2I alert message based on the vehicle type. In some aspects, the type of vehicle is further identified based on a light event associated with light signals detected by the vehicle recognition system.
SELECTED DRAWING: Figure 1A|1. A system for emergency vehicle recognition in a vehicle, the system comprising: a sound detection circuit for analyzing audio data using a machine learning technique to determine a voice event, the audio data generated by a source outside the vehicle and sensed by a microphone array installed in the vehicle; an image detection circuit for analyzing image data using the machine learning technique to determine an image event, the image data acquired by a camera array installed in the vehicle; a classification circuit for generating an audio-image association, the audio-image association matching, for a plurality of time instances, audio samples of the voice event with image frames of the image event, and for performing emergency vehicle recognition based on the audio-image association; and a vehicle interface for transmitting a message to a vehicle control system, the message based on the emergency vehicle recognition.
| 2. The system of claim 1, wherein the image event is a detection of a visual representation of the emergency vehicle in at least one of the image frames, and the voice event is a detection of a sound associated with the emergency vehicle in at least one of the audio samples.
| 3. The system of claim 1 or 2, wherein, to generate the audio-image association, the classification circuit further uses a sampling rate of the audio samples to normalize a frame rate of the image frames and determines, for each time instance of the plurality of time instances, an audio-samples-per-image-frame (ASPIF) parameter.
| 4. The system of claim 3, wherein the audio-image association is a data structure, and the classification circuit further stores in the data structure, for each image frame of the image frames: an identifier of the time instance of the plurality of time instances corresponding to the image frame; an identifier of the image frame; an identifier of a subset of the audio samples corresponding to the image frame based on the ASPIF parameter; a detection result associated with the image frame, the detection result being based on the image event; and a detection result associated with each audio sample of the subset of the audio samples, the detection result being based on the voice event.
| 5. The system of claim 4, wherein the detection result associated with the image frame is a type of the emergency vehicle detected in the image frame.
| 6. The system of claim 5, wherein the detection result associated with each audio sample of the subset of the audio samples is a type of the emergency vehicle detected based on the audio sample.
| 7. The system of claim 6, wherein the classification circuit further applies a clustering function to the detection results associated with the subset of the audio samples to generate a combined detection result associated with the subset of the audio samples, and performs data fusion between the detection result associated with the image frame and the combined detection result associated with the subset of the audio samples to perform the emergency vehicle recognition.
| 8. The system of any one of claims 1 to 7, further comprising a prediction generating circuit configured to generate a prediction of the type of the emergency vehicle detected during the emergency vehicle recognition and to generate the message for transmission to the vehicle control system, the message including the type of the emergency vehicle.
| 9. The system of claim 8, wherein the vehicle control system executes a response action based on the message.
| 10. The system of claim 9, wherein the response action includes autonomous vehicle steering based on the type of the emergency vehicle detected during the emergency vehicle recognition.
| 11. The system of any one of claims 1 to 10, wherein the machine learning technique includes an artificial neural network.
| 12. A program for causing a machine to perform procedures comprising: analyzing audio data using a machine learning technique to determine a voice event, the audio data sensed by a microphone array installed in a vehicle; analyzing image data using the machine learning technique to determine an image event, the image data obtained by a camera array installed in the vehicle; generating an audio-image association, the audio-image association matching, for a plurality of time instances, audio samples of the voice event with image frames of the image event; performing emergency vehicle recognition based on the audio-image association; and outputting a message to a vehicle control system of the vehicle, the message based on the emergency vehicle recognition, wherein the vehicle control system executes a response action based on the message.
| 13. The program of claim 12, further causing the machine to perform a procedure of normalizing the frame rate of the image frames using the sampling rate of the audio samples and determining, for each time instance of the plurality of time instances, an audio-samples-per-image-frame (ASPIF) parameter.
| 14. The program of claim 13, wherein the audio-image association is a data structure, the program further causing the machine to perform a procedure of storing in the data structure, for each image frame of the image frames: an identifier of the time instance of the plurality of time instances corresponding to the image frame; an identifier of the image frame; an identifier of a subset of the audio samples corresponding to the image frame based on the ASPIF parameter; a detection result associated with the image frame, the detection result being based on the image event; and a detection result associated with each audio sample of the subset of the audio samples, the detection result being based on the voice event.
| 15. The program of claim 14, wherein the detection result associated with the image frame is a type of the emergency vehicle detected in the image frame, and the detection result associated with each audio sample of the subset of the audio samples is a type of the emergency vehicle detected based on the audio sample, the program further causing the machine to perform procedures of applying a clustering function to the detection results associated with the subset of the audio samples to generate a combined detection result associated with the subset of the audio samples, and performing data fusion between the detection result associated with the image frame and the combined detection result associated with the subset of the audio samples to perform the emergency vehicle recognition.
| 16. A device comprising: means for analyzing audio data using a machine learning technique to determine a voice event, the audio data sensed by a microphone array installed in a vehicle; means for analyzing image data using the machine learning technique to determine an image event, the image data acquired by a camera array installed in the vehicle; means for generating an audio-image association, the audio-image association matching, for a plurality of time instances, audio samples of the voice event with image frames of the image event; means for performing emergency vehicle recognition based on the audio-image association; and means for outputting a message to a vehicle control system, the message based on the emergency vehicle recognition, wherein the vehicle control system performs a response action based on the message.
| 17. The device of claim 16, wherein the image event is a detection of a visual representation of the emergency vehicle in at least one of the image frames, and the voice event is a detection of a sound associated with the emergency vehicle in at least one of the audio samples.
| 18. The device of claim 16 or 17, wherein the means for generating the audio-image association includes means for normalizing the frame rate of the image frames using a sampling rate of the audio samples and determining, for each time instance of the plurality of time instances, an audio-samples-per-image-frame (ASPIF) parameter.
| 19. The device of claim 18, wherein the audio-image association is a data structure, the device further comprising means for storing in the data structure, for each image frame of the image frames: an identifier of the time instance of the plurality of time instances corresponding to the image frame; an identifier of the image frame; an identifier of a subset of the audio samples corresponding to the image frame based on the ASPIF parameter; a detection result associated with the image frame, the detection result being based on the image event; and a detection result associated with each audio sample of the subset of the audio samples, the detection result being based on the voice event.
| 20. The device of claim 19, wherein the detection result associated with the image frame is a type of the emergency vehicle detected in the image frame.
| 21. The device of claim 20, wherein the detection result associated with each audio sample of the subset of the audio samples is a type of the emergency vehicle detected based on the audio sample.
| 22. The device of claim 21, further comprising means for applying a clustering function to the detection results associated with the subset of the audio samples to generate a combined detection result associated with the subset of the audio samples, and means for performing data fusion between the detection result associated with the image frame and the combined detection result associated with the subset of the audio samples to perform the emergency vehicle recognition.
| 23. The device of any one of claims 16 to 22, further comprising means for generating a prediction of the type of the emergency vehicle detected during the emergency vehicle recognition and means for generating the message for transmission to the vehicle control system, the message including the type of the emergency vehicle.
| 24. A non-transitory machine-readable medium storing the program described in any one of claims 12 to 15. | The vehicle recognition system (100A) has a microphone arrangement (116) that is operatively mounted in a vehicle (104) to capture sounds outside of the vehicle. A sound analysis circuit (110) analyzes the captured sounds using an audio machine learning technique to identify a sound event. An image capture arrangement (115) is operatively mounted in the vehicle to capture images outside of the vehicle. An image analysis circuit analyzes the captured images using an image machine learning technique to identify an image event. A vehicle identification circuit identifies a type of vehicle based on the image event and the sound event. INDEPENDENT CLAIMS are included for the following: a method for vehicle recognition; and a non-transitory machine-readable medium storing a program for vehicle recognition. Vehicle recognition system for identifying type of vehicle such as autonomous vehicle based on image event and sound event. The safety features are designed to avoid collisions and accidents by offering technologies that alert the driver to potential problems or to avoid collisions by implementing safeguards, and taking over control of the vehicle based on such safeguards. The association between the emergency vehicle image and the emergency vehicle sound takes place to accurately recognize the emergency vehicle type based on the audio, light, and image data. The drawing shows a schematic view illustrating the system using the vehicle recognition platform to provide emergency vehicle detection based on sound data, light data, and image data.100AVehicle recognition system 104Vehicle 110Sound analysis circuit 115Image capture arrangement 116Microphone arrangement
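Claims 3-4 and 13-14 above hinge on the ASPIF bookkeeping: the audio sampling rate is divided by the camera frame rate to get the number of audio samples per image frame, and each frame is associated with the matching slice of audio samples plus per-modality detection results. The sketch below illustrates that association; the rates, labels, and dictionary layout are illustrative assumptions.

```python
# Build a toy audio-image association keyed by image frame.
def build_audio_image_association(num_frames, frame_rate_hz, sample_rate_hz,
                                  image_results, audio_results):
    aspif = int(sample_rate_hz // frame_rate_hz)   # audio samples per image frame
    association = []
    for frame_idx in range(num_frames):
        start = frame_idx * aspif
        association.append({
            "time_instance": frame_idx,
            "frame_id": frame_idx,
            "audio_sample_range": (start, start + aspif),
            "image_detection": image_results[frame_idx],          # e.g. "ambulance"
            "audio_detections": audio_results[start:start + aspif],
        })
    return association

assoc = build_audio_image_association(
    num_frames=2, frame_rate_hz=30, sample_rate_hz=48_000,
    image_results=["ambulance", "ambulance"],
    audio_results=["siren"] * 3200)
print(assoc[0]["audio_sample_range"])   # (0, 1600)
```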
Please summarize the input | AUTONOMOUS DRIVING SYSTEM - The present invention relates to an autonomous driving system. The autonomous driving system of the present invention includes a server that transmits environmental information including at least one of road conditions and traffic conditions; a V2X communication unit that receives the transmitted environmental information; a state information collection unit that collects driving state information of the vehicle; a display unit that displays at least one of a plurality of driving modes of the vehicle to occupants of the vehicle; and a control unit that determines a selectable driving mode among the plurality of driving modes based on at least one of the received environmental information and the collected driving state information and controls the display unit to display the selectable driving mode distinctly from the other driving modes.|1. An autonomous driving system comprising: a server that transmits environmental information including at least one of road conditions and traffic conditions;
A V2X communication unit that receives the transmitted environmental information;
a state information collection unit that collects driving state information of the vehicle;
a display unit that displays at least one of a plurality of driving modes of the vehicle to occupants of the vehicle; and a control unit that determines a selectable driving mode among the plurality of driving modes based on at least one of the received environmental information and the collected driving state information, and controls the display unit to display the selectable driving mode distinctly from the other driving modes.
| 2. The autonomous driving system of claim 1, wherein the plurality of driving modes include: an autonomous driving mode in which the vehicle drives on the road by itself;
A cooperative driving mode in which the vehicle and at least one other vehicle drive while maintaining a predetermined distance apart; and a normal driving mode in which the driver of the vehicle directly drives the vehicle.
| 3. The autonomous driving system of claim 2, wherein the driving state information includes at least one of speed information of the vehicle, acceleration information of the vehicle, and driving time information of the vehicle, and wherein the control unit generates driver state information based on the driving state information and determines the selectable driving mode based on the generated driver state information.
| 4. The autonomous driving system of claim 3, wherein the control unit generates the driver state information based on the acceleration information of the vehicle and determines the selectable driving mode based on the generated driver state information and the received environmental information.
| 5. The autonomous driving system of claim 4, wherein, when the cooperative driving mode is included in the selectable driving modes, at least one of the server and the control unit compares the generated driver state information with other driver state information generated in the at least one other vehicle and recommends which vehicle is to drive in the lead.
| 6. The autonomous driving system of claim 5, wherein, while the vehicle and the at least one other vehicle are operating in the cooperative driving mode, if the driver state information generated in the lead vehicle among the vehicles operating in the cooperative driving mode indicates an abnormality, at least one of the server and the control unit designates the second vehicle among the operating vehicles as an emergency lead vehicle and releases the gap control between the lead vehicle and the vehicle in the second position.
| 7. The autonomous driving system of claim 4, wherein the generated driver state information includes a degree of risk, and the process of determining the selectable driving mode by the control unit includes: generating an excess of the acceleration information with respect to a predetermined acceleration reference value;
determining the risk through an operation including at least one of a cumulative operation and an average operation based on the generated excess; and, when the risk is outside a normal range, excluding at least one of the plurality of driving modes from the selectable driving modes.
| 8. The autonomous driving system according to claim 7, wherein the predetermined acceleration reference value is generated based on the environmental information.
| 9. The autonomous driving system of claim 8, wherein the environmental information includes at least one of speed limit information on the road and weather information.
| 10. The autonomous driving system of claim 9, wherein the control unit adjusts an update cycle of the received environmental information based on the risk.
| 11. A V2X communication unit that receives environmental information including at least one of road conditions and traffic conditions from an external server;
a state information collection unit that collects driving state information of the vehicle; and a control unit that determines which of the driving modes are selectable based on the received environmental information and the collected driving state information, and controls an external display device to display the selectable mode differently from the non-selectable mode. | The system (10) has a server (200) that transmits environmental information including road conditions and traffic conditions. A V2X communication unit (110) receives the transmitted environmental information. A state information collection unit (120) collects driving state information of the vehicle. A display unit (300) displays multiple driving modes of the vehicle to occupants of the vehicle. A control unit (150) determines a selectable driving mode among the multiple driving modes based on the received environmental information and the collected driving state information. The control unit controls the display unit to display the selectable driving mode distinctly from the other driving modes. Autonomous driving system for a commercial vehicle, i.e. a passenger car. The autonomous driving system comprises a display unit that displays multiple driving modes of a vehicle to occupants of the vehicle, where a server transmits environmental information including road conditions and traffic conditions, thus ensuring smooth cooperative driving between vehicles through data transmission and reception between vehicles. The drawing shows a block diagram of an autonomous driving system for a commercial vehicle. (Drawing includes non-English language text) 10 Autonomous driving system, 100 Driving terminal, 110 V2X communication unit, 120 Collection unit, 150 Control unit, 200 Server, 300 Display unit, 400 Generation device, 500 Steering unit
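Claims 7-9 of the system above compute a degree of risk from the excess of the vehicle's acceleration over a reference value that itself depends on environmental information, and then prune the selectable driving modes. The following Python sketch is a minimal, hypothetical reading of those steps; the averaging, the 0-1 normal range, and the choice of which mode to exclude are assumptions.

```python
def acceleration_risk(accels, ref_accel, normal_range=(0.0, 1.0)):
    """Degree-of-risk calculation sketched from claims 7-8: accumulate the excess
    of each acceleration sample over the reference value and average it."""
    excesses = [max(0.0, abs(a) - ref_accel) for a in accels]
    risk = sum(excesses) / len(excesses) if excesses else 0.0
    return risk, (normal_range[0] <= risk <= normal_range[1])

def selectable_modes(risk_in_normal_range):
    """Exclude at least one mode when the risk is outside the normal range (claim 7)."""
    modes = {"autonomous", "cooperative", "normal"}
    if not risk_in_normal_range:
        modes.discard("normal")   # assumption: manual driving is withheld first
    return modes

# The reference value could itself be derived from environmental information
# such as the speed limit or weather (claims 8-9); 2.5 m/s^2 is illustrative.
risk, ok = acceleration_risk([1.2, 3.8, 0.4, 4.1], ref_accel=2.5)
print(round(risk, 3), selectable_modes(ok))
```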
Please summarize the input | HAILING A VEHICLE - Typically, an indication of the potential occupant's intention to use the autonomous vehicle is received via a user interface. In response to receiving this indication, a call request is transmitted by the signaling mode to the at least one autonomous vehicle capable of receiving the call request directly according to the signaling mode.|1. A stationary device comprising: at least one processor;
a screen; and at least one non-transitory storage medium storing instructions, wherein the instructions, when executed by the at least one processor, cause the at least one processor to:
display, on the screen, a user interface for hailing a vehicle;
receive, from an operator of the stationary device operating the user interface, an indication of a request for pick-up by a vehicle at a location proximate to the stationary device;
transmit the pickup request to a plurality of vehicles;
receive a response accepting the pickup request from a responsive vehicle among the plurality of vehicles, wherein the response is received after the responsive vehicle exchanges at least one message with another vehicle of the plurality of vehicles ensuring that multiple vehicles do not respond to the pickup request;
and cause the response to be displayed on the screen to an operator of the stationary device.
| 2. The stationary device of claim 1, wherein the indication of the pickup request includes one or more indications of occupants, number of occupants, destination location, class of service, and time of arrival.
| 3. The stationary device of claim 1, wherein the instructions causing the at least one processor to transmit the pickup request to the plurality of vehicles further cause the at least one processor to transmit the pickup request to a central system for transmission to the plurality of vehicles.
| 4. The stationary device of claim 3, wherein the instructions causing the at least one processor to transmit the pickup request to the plurality of vehicles cause the at least one processor to broadcast the pickup request directly to the plurality of vehicles.
| 5. The stationary device of claim 4, wherein the instructions causing the at least one processor to broadcast the pickup request directly to the plurality of vehicles cause the at least one processor to broadcast the pickup request using a vehicle to infrastructure (V2I) communication protocol.
| 6. The stationary device of claim 1, wherein the stationary device is a kiosk.
| 7. A method performed by a stationary device at a fixed location, comprising: displaying, on a screen of the stationary device, a user interface for hailing a vehicle;
Receiving, from an operator of the stationary device operating the user interface, an indication of a request for pick-up by a vehicle at a location proximate to the stationary device;
transmitting the pickup request to a plurality of vehicles;
receiving a response accepting the pickup request from a responding vehicle among the plurality of vehicles, wherein the response is received after the responding vehicle exchanges at least one message with another vehicle of the plurality of vehicles ensuring that multiple vehicles do not respond to the pickup request; and displaying, on the screen, the response to an operator of the stationary device.
| 8. The method of claim 7, wherein the indication of the pickup request includes one or more indications of passengers, number of passengers, destination location, class of service, and time of arrival.
| 9. The method of claim 7, wherein transmitting the pickup request to the plurality of vehicles includes broadcasting the pickup request to the plurality of vehicles.
| 10. The method of claim 9, wherein broadcasting the pickup request to the plurality of vehicles includes broadcasting the pickup request directly to the plurality of vehicles using a vehicle to infrastructure (V2I) communication protocol.
| 8. The method of claim 7, comprising transmitting the pickup request to a central system for transmission to the plurality of vehicles.
| 12. The method of claim 8, wherein the stationary device is a kiosk.
| 13. At least one non-transitory storage medium storing instructions embodied in a stationary device residing at a fixed location, wherein the instructions, when executed by at least one processor, cause the at least one processor to:
display, on a screen of the stationary device, a user interface for hailing a vehicle;
receive, from an operator of the stationary device operating the user interface, an indication of a request for pick-up by a vehicle at a location proximate to the stationary device;
transmit the pickup request to a plurality of vehicles;
receive a response accepting the pickup request from a responding vehicle among the plurality of vehicles, wherein the response is received after the responding vehicle exchanges at least one message with another vehicle of the plurality of vehicles ensuring that multiple vehicles do not respond to the pickup request; and
cause the response to be displayed on the screen to an operator of the stationary device.
| 14. The at least one non-transitory storage medium of claim 13, wherein the indication of the pickup request includes one or more indications of passengers, number of passengers, destination location, class of service, and time of arrival.
| 15. The at least one non-transitory storage medium of claim 13, wherein the instructions causing the at least one processor to transmit the pickup request to the plurality of vehicles further cause the at least one processor to transmit the pickup request to a central system for transmission to the plurality of vehicles.
| 16. The at least one non-transitory storage medium of claim 13, wherein the instructions causing the at least one processor to transmit the pickup request to the plurality of vehicles cause the at least one processor to broadcast the pickup request to the plurality of vehicles.
| 17. The at least one non-transitory storage medium of claim 16, wherein the instructions causing the at least one processor to broadcast the pickup request to the plurality of vehicles cause the at least one processor to broadcast the pickup request directly to the plurality of vehicles using a vehicle to infrastructure (V2I) communication protocol.
| 18. The at least one non-transitory storage medium of claim 13, wherein the stationary device is a kiosk.
| 19. delete
| 20. delete
| 21. delete | The method involves receiving a hailing request by a signaling mode from a receiving device of an autonomous vehicle. An intention indication of a potential rider corresponding to the hailing request is determined in the autonomous vehicle. A hailing request processing operation is performed in the autonomous vehicle. The hailing request is received by direct wireless communication from a mobile device, where the signaling mode comprises a visual mode and an audible mode, and the visual mode comprises a display of graphical elements and an image or light. An INDEPENDENT CLAIM is also included for an apparatus for hailing an autonomous vehicle. Method for hailing an autonomous vehicle (claimed), i.e. a taxicab or ride-sharing vehicle. The method allows temporal properties, such as the display duration of an image, to be modulated to encode additional information and to reduce the incidence of false detections. The method enables the hailing process of the autonomous vehicle to be performed using images or lights displayed, and sounds emitted, by the hailing device. The drawing shows a schematic block diagram of an apparatus for hailing an autonomous vehicle.
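The kiosk claims above revolve around broadcasting one pickup request and having the vehicles coordinate so that exactly one of them answers. The Python sketch below illustrates one possible shape of that flow; the JSON field names and the closest-vehicle election rule are assumptions, since the claims leave the inter-vehicle protocol unspecified.

```python
import json

def build_pickup_request(location, riders=1, destination=None):
    """Hypothetical payload a stationary kiosk might broadcast (claims 1-2)."""
    return json.dumps({
        "type": "pickup_request",
        "pickup_location": location,   # location proximate to the kiosk
        "riders": riders,
        "destination": destination,
    })

def elect_responder(vehicle_ids, distances):
    """Stand-in for the vehicle-to-vehicle exchange that ensures only one vehicle
    responds: the closest vehicle wins, ties broken by identifier."""
    return min(vehicle_ids, key=lambda v: (distances[v], v))

req = build_pickup_request(location=(42.3601, -71.0589), riders=2)
winner = elect_responder(["av-1", "av-2", "av-3"],
                         {"av-1": 350.0, "av-2": 120.0, "av-3": 120.0})
print(winner)   # -> "av-2"; only this vehicle replies to the kiosk
```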
Please summarize the input | Hailing a vehicle - In general, an indication is received through a user interface of an intention of a potential rider to use an autonomous vehicle. In response to the receipt of the indication, a hailing request is sent by a signaling mode to at least one autonomous vehicle that can receive the hailing request directly in accordance with the signaling mode. The invention claimed is:
| 1. A system comprising:
at least one processor; and
at least one non-transitory computer-readable media comprising instructions that, upon execution of the instructions by the at least one processor, are to cause the at least one processor to:
receive, by at least one sensor of a vehicle from an infrastructure access point, a wireless signal that indicates a request for transportation services, wherein the wireless signal that indicates the request for transportation services is based on a wireless signal received by the infrastructure access point directly from a hailing device of a user;
determine, using the at least one processor of the vehicle, that the vehicle can safely stop to pick up the user;
determine, using the at least one processor of the vehicle, to accept the request based, at least in part, on the determination that the vehicle can safely stop to pick up the user;
select, based on the wireless signal received from the infrastructure access point, a stopping place; and
cause the vehicle to stop at the stopping place.
| 2. The system of claim 1, wherein the wireless signal that indicates the request for transportation services is a vehicle-to-infrastructure (V2I) wireless signal.
| 3. The system of claim 1, wherein the wireless signal received from the infrastructure access point is a vehicle-to-infrastructure (V2I) wireless signal.
| 4. The system of claim 1, wherein the wireless signal that indicates the request for transportation services is generated by the infrastructure access point based on the wireless signal received from the hailing device without modification by a server that is communicatively coupled with the infrastructure access point.
| 5. The system of claim 1, wherein the hailing device is a mobile device of a user.
| 6. The system of claim 1, wherein the infrastructure access point is a WiFi access point.
| 7. A method comprising:
detecting, by at least one sensor of a vehicle, a first wireless signal that indicates a request for transportation services received from an infrastructure access point, wherein the first wireless signal is based on a second wireless signal that was previously received by the infrastructure access point directly from a hailing device of a user;
determining, using the at least one processor of the vehicle, that the vehicle can safely stop to pick up the user;
determining, using the at least one processor of the vehicle, to accept the request based, at least in part, on the determination that the vehicle can safely stop to pick up the user;
selecting, by at least one processor of the vehicle, a stopping place based on the wireless signal received from the infrastructure access point; and
causing, by the at least one processor, the vehicle to stop at the stopping place.
| 8. The method of claim 7, wherein the wireless signal that indicates the request for transportation services is a vehicle-to-infrastructure (V2I) wireless signal.
| 9. The method of claim 7, wherein the wireless signal received from the infrastructure access point is a vehicle-to-infrastructure (V2I) wireless signal.
| 10. The method of claim 7, wherein the wireless signal that indicates the request for transportation services is generated by the infrastructure access point based on the wireless signal received from the hailing device without modification by a server that is communicatively coupled with the infrastructure access point.
| 11. The method of claim 7, wherein the hailing device is a mobile device of a user.
| 12. The method of claim 7, wherein the infrastructure access point is a WiFi access point.
| 13. At least one non-transitory computer-readable media comprising instructions that, upon execution of the instructions by one or more processors of a vehicle, are to cause the vehicle to:
detect, by at least one sensor of a vehicle, a first wireless signal that indicates a request for transportation services received from an infrastructure access point, wherein the first wireless signal is based on a second wireless signal that was previously received by the infrastructure access point directly from a hailing device of a user;
determine, using the at least one processor of the vehicle, that the vehicle can safely stop to pick up the user;
determine, using the at least one processor of the vehicle, to accept the request based, at least in part, on the determination that the vehicle can safely stop to pick up the user;
select, by at least one processor of the vehicle, a stopping place based on the wireless signal received from the infrastructure access point; and
cause, by the at least one processor, the vehicle to stop at the stopping place.
| 14. The at least one non-transitory computer-readable media of claim 13, wherein the wireless signal that indicates the request for transportation services is a vehicle-to-infrastructure (V2I) wireless signal.
| 15. The at least one non-transitory computer-readable media of claim 13, wherein the wireless signal received from the infrastructure access point is a vehicle-to-infrastructure (V2I) wireless signal.
| 16. The at least one non-transitory computer-readable media of claim 13, wherein the wireless signal that indicates the request for transportation services is generated by the infrastructure access point based on the wireless signal received from the hailing device without modification by a server that is communicatively coupled with the infrastructure access point.
| 17. The at least one non-transitory computer-readable media of claim 13, wherein the hailing device is a mobile device of a user.
| 18. The at least one non-transitory computer-readable media of claim 13, wherein the infrastructure access point is a WiFi access point.
| 19. The system of claim 1, the instructions when executed are to cause the at least one processor to transmit a message indicating that the vehicle has accepted the request for transportation services to at least one other vehicle using vehicle-to-vehicle (V2V) communication.
| 20. The method of claim 7, further comprising transmitting a message indicating that the vehicle has accepted the request for transportation services to at least one other vehicle using vehicle-to-vehicle (V2V) communication.
| 21. The at least one non-transitory computer-readable media of claim 13, the instructions when executed are to cause the at least one processor to transmit a message indicating that the vehicle has accepted the request for transportation services to at least one other vehicle using vehicle-to-vehicle (V2V) communication. | The system has a processor which is configured, upon execution of the instructions, to receive, by a sensor of a vehicle, a wireless signal from an infrastructure access point that indicates a request for transportation services. The wireless signal that indicates the request for transportation services is based on a wireless signal received by the infrastructure access point directly from a hailing device (72) of a user. The processor is configured to select a stopping place based on the wireless signal received from the infrastructure access point. The processor is configured to cause the vehicle to stop at the stopping place. INDEPENDENT CLAIMS are included for the following: a method for hailing a vehicle; and a non-transitory computer-readable media storing a program for hailing a vehicle. System for hailing a vehicle such as an autonomous vehicle. The temporal properties, such as the display duration of each image, are modulated to encode additional information or to reduce the incidence of false detections. The characteristics of the emitted light are uncommon, to minimize the chance that emitted light having similar or identical characteristics is erroneously detected by sensors on an autonomous vehicle as a hailing request, thus resulting in a false detection. The appearance of a gesture or gestures is uncommon to minimize the chance that a similar or identical gesture that is not being performed for the purpose of hailing an autonomous vehicle is detected by sensors on an autonomous vehicle, thus resulting in a false detection. The temporal properties, such as the display duration of each gesture, are modulated to encode additional information or to reduce the incidence of false detections. The user interface enables the potential rider to indicate a destination location, a number of riders, a class of service, a time to arrive, and a variety of other pieces of information relevant to the hailing. The drawing shows a block diagram illustrating the system for hailing a vehicle. 70 Gesture, 72 Hailing device
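On the vehicle side of this patent family, the claimed flow is: receive the relayed V2I request, check that the vehicle can safely stop, accept, pick a stopping place, and optionally tell other vehicles over V2V that the request is taken. The sketch below is a hypothetical illustration of that decision; the distance-based choice of stopping place and the notice format are assumptions.

```python
def handle_relayed_request(can_stop_safely, stop_candidates, request_origin):
    """Vehicle-side flow sketched from claims 1, 7 and 19: accept only if the
    vehicle can safely stop, choose a stopping place near the point the request
    was relayed from, and prepare a V2V notice that the request was accepted."""
    if not can_stop_safely:
        return None, None
    def dist(p):
        return ((p[0] - request_origin[0]) ** 2 + (p[1] - request_origin[1]) ** 2) ** 0.5
    stop = min(stop_candidates, key=dist)
    v2v_notice = {"event": "request_accepted", "stop": stop}
    return stop, v2v_notice

stop, notice = handle_relayed_request(
    can_stop_safely=True,
    stop_candidates=[(10.0, 2.0), (12.0, 1.0), (30.0, 5.0)],
    request_origin=(11.0, 0.0))
print(stop, notice)
```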
Please summarize the input | WAVELENGTH BASED V2X ANTENNA - Provided are wavelength based V2X antennas, and related antenna systems and methods, which can include a first antenna having a first wavelength and a second antenna having a second wavelength. Some antenna systems control a wavelength of a signal for transmission using one of the first antenna and the second antenna. WHAT IS CLAIMED IS:
| 1. An antenna system comprising: a Vehicle to Everything (V2X) antenna comprising: a first antenna having a first wavelength; and a second antenna having a second wavelength, the second wavelength being different than the first wavelength; wherein the V2X antenna is configured to control a wavelength of a signal for transmission using one of the first antenna and the second antenna.
| 2. The antenna system of claim 1 , wherein the V2X antenna further comprises a switch communicatively coupled to the first antenna and to the second antenna, and wherein the V2X antenna is configured to use the switch for controlling the wavelength of the signal for transmission.
| 3. The antenna system of claim 2, wherein when the switch is in a first position, the V2X antenna transmits the signal in the first wavelength using the first antenna, and wherein when the switch is in a second position, the V2X antenna transmits the signal in the second wavelength using the second antenna.
| 4. The antenna system of any one of the previous claims, wherein the V2X antenna is configured to transmit the signal using one of the first antenna and the second antenna.
| 5. The antenna system of any one of the previous claims, wherein the first antenna has a first electrical length associated with the first wavelength, wherein the second antenna has a second electrical length associated with the second wavelength, and wherein the second electrical length is different from the first electrical length.
| 6. The antenna system of any one of the previous claims, wherein the V2X antenna is configured to control the wavelength of the signal for transmission by: receiving a control signal indicative of which of the first antenna or the second antenna to transmit from; and transmitting the signal using one of the first antenna and the second antenna based on the control signal.
| 7. The antenna system of any one of the previous claims, wherein: the V2X antenna is configured to transmit the signal from the first antenna as a V2V signal; and the V2X antenna is configured to transmit the signal from the second antenna as a V2I signal.
| 8. The antenna system of any one of the previous claims, wherein the V2X antenna further comprises: a compensator configured to compensate for gain degradation of the signal.
| 9. The antenna system of any one of the previous claims, wherein the V2X antenna further comprises: a matcher configured to control a center of the wavelength.
| 10. The antenna system of any one of the previous claims, wherein the V2X antenna does not include a beamformer.
| 11. The antenna system of any one of the previous claims, wherein: the antenna system is configured to determine a signal quality parameter indicative of a quality of a received signal; and the V2X antenna is configured to transmit the signal using one of the first antenna and the second antenna based on the signal quality parameter.
| 12. A method, performed by a Vehicle to Everything, V2X, antenna comprising a first antenna having a first wavelength and a second antenna having a second wavelength different than the first wavelength, wherein the method comprises: controlling a wavelength of a signal for transmission using one of the first antenna and the second antenna.
| 13. The method of claim 12, wherein the V2X antenna further comprises a switch communicatively coupled to the first antenna and to the second antenna, and wherein controlling the wavelength of the signal for transmission comprises switching the switch.
| 14. The method of claim 13, the method further comprising: when the switch is in a first position, transmitting the signal in the first wavelength using the first antenna; and when the switch is in a second position, transmitting the signal in the second wavelength using the second antenna.
| 15. The method of any one of claims 12-14, the method further comprising: receiving a control signal indicative of which of the first antenna or the second antenna to transmit from; and transmitting the signal using one of the first antenna and the second antenna based on the control signal.
| 16. The method of any one of claims 12-15, wherein controlling the wavelength of the signal for transmission comprises: transmitting the signal from the first antenna as a V2V signal; and transmitting the signal from the second antenna as a V2I signal.
| 17. The method of any one of claims 12-16, the method further comprising: controlling, using a matcher, a center of the wavelength.
| 18. The method of any one of claims 12-17, wherein the method does not comprise beamforming.
| 19. The method of any one of claims 12-18, the method further comprising: determining a signal quality parameter indicative of a quality of a received signal; and transmitting the signal using one of the first antenna and the second antenna based on the signal quality parameter.
| 20. An autonomous vehicle comprising: a Vehicle to Everything, V2X, antenna comprising: a first antenna having a first wavelength; and a second antenna having a second wavelength, the second wavelength being different than the first wavelength; wherein the V2X antenna is configured to control a wavelength of a signal for transmission using one of the first antenna and the second antenna. | The system has a Vehicle to Everything (V2X) antenna comprising a first antenna having a first wavelength and a second antenna having a second wavelength, where the second wavelength is different than the first wavelength. The V2X antenna is configured to control a wavelength of a signal for transmission using one of the first antenna and the second antenna. The V2X antenna further comprises a switch communicatively coupled to the first antenna and to the second antenna, and the V2X antenna is configured to use the switch for controlling the wavelength of the signal for transmission. The V2X antenna transmits the signal in the first wavelength using the first antenna when the switch is in a first position, and transmits the signal in the second wavelength using the second antenna when the switch is in a second position. The V2X antenna is configured to transmit the signal using one of the first antenna and the second antenna. INDEPENDENT CLAIMS are included for the following: a method performed by a Vehicle to Everything (V2X) antenna; and an autonomous vehicle. Antenna system for use in a vehicle, e.g. an autonomous vehicle (claimed) such as a car or bus. The vehicle-to-everything communication system provides robust and reliable performance of an autonomous vehicle. The system provides similar performance to conventional beamforming but uses a simple and efficient structure in the vehicle. The wavelength based V2X antenna is configured to optimize beam patterns (e.g., radiation patterns) for the different types of V2X communication, allowing for optimized and/or improved communication. The system provides improved connectivity performance while avoiding the use of complex beamforming schemes. The system improves signal quality by avoiding signal mismatch, while reducing wasted power consumption. The respective radiation patterns, due to changes in wavelength, can be different depending on the type of V2X communication, allowing flexibility for optimizing communication. The drawing shows a diagram of an example implementation of a wavelength based V2X antenna. 702 V2X system, 704 RF signal, 706 Compensator, 708 Matcher, 712 Control signal
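The antenna-selection logic recited in claims 6, 7 and 11 (explicit control signal, V2V versus V2I routing, and a received-signal quality fallback) can be summarized in a few lines. The sketch below is illustrative only; the priority order among the three criteria and the quality threshold are assumptions.

```python
def select_antenna(message_type=None, control_signal=None, signal_quality=None,
                   quality_threshold=0.6):
    """Pick the first-wavelength or second-wavelength antenna of the V2X antenna.
    Claim 6: follow an explicit control signal; claim 7: V2V -> first antenna,
    V2I -> second antenna; claim 11: fall back on a signal quality parameter."""
    if control_signal in ("first", "second"):
        return control_signal
    if message_type == "V2V":
        return "first"
    if message_type == "V2I":
        return "second"
    if signal_quality is not None:
        # assumed policy: switch wavelength when the received quality is poor
        return "first" if signal_quality >= quality_threshold else "second"
    return "first"

print(select_antenna(message_type="V2I"))       # -> "second"
print(select_antenna(signal_quality=0.3))       # -> "second"
print(select_antenna(control_signal="first"))   # -> "first"
```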
Please summarize the input | DYNAMIC ANTENNA SYSTEM - Provided are dynamic antenna systems, such as for an autonomous vehicle, which can include at least one modem and at least one antenna operatively connected with the at least one modem. Some antenna systems described are also configured to determine, using at least one processor, a performance parameter indicative of a performance of a communication between the at least one antenna and a network node, and control, using the at least one processor, based on the performance parameter, one or more of a position of the at least one antenna, and a connection of the at least one antenna to the at least one modem. WHAT IS CLAIMED IS:
| 1. An antenna system for an autonomous vehicle, the antenna system comprising: at least one modem; and at least one antenna operatively connected with the at least one modem; wherein the antenna system is configured to: determine, using at least one processor, a performance parameter indicative of a performance of a communication between the at least one antenna and a network node; and control, using the at least one processor, based on the performance parameter, one or more of: a position of the at least one antenna, and a connection of the at least one antenna to the at least one modem.
| 2. The antenna system of claim 1 , wherein the antenna system is configured to control the position of the at least one antenna relative to a position of the at least one modem.
| 3. The antenna system of any one of the preceding claims, wherein the antenna system is configured to control, using the at least one processor, based on the performance parameter, a position of the at least one antenna by controlling one or more of: an orientation of the at least one antenna, a phase of the at least one antenna, an angle of the at least one antenna, and a pose of the at least one antenna.
| 4. The antenna system of any one of the preceding claims, wherein the antenna system is configured to control, using the at least one processor, based on the performance parameter, the position of the at least one antenna by controlling a rotation of the at least one antenna.
| 5. The antenna system of any one of the preceding claims, wherein the antenna system is configured to control, using the at least one processor, based on the performance parameter, the connection of the at least one antenna to the at least one modem.
| 6. The antenna system of any one of the preceding claims, wherein the antenna system is configured to control the connection of the at least one antenna to the at least one modem using a switch coupled to the at least one antenna and the at least one modem.
| 7. The antenna system of any one of the preceding claims, wherein the antenna system is configured to control, using the at least one processor, based on the performance parameter, the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem by: determining if the performance parameter satisfies a criterion; and in response to determining that the performance parameter satisfies the criterion, controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem.
| 8. The antenna system of claim 7, wherein the performance parameter satisfies the criterion in response to the performance parameter being below a performance threshold.
| 9. The antenna system of any one of the preceding claims, wherein the at least one antenna is one or more of a cellular antenna and a V2X antenna.
| 10. The antenna system of any one of the preceding claims, wherein the antenna system comprises a plurality of antennas associated with at least two different carriers, wherein each carrier of the at least two different carriers operates on a different frequency band.
| 11. The antenna system of any one of the preceding claims, wherein the antenna system comprises a plurality of modems.
| 12. The antenna system of any one of the preceding claims, wherein the antenna system comprises a plurality of modems and a plurality of antennas, wherein each of the plurality of antennas is connected to each of the plurality of modems via a switch.
| 13. An autonomous vehicle comprising: at least one modem; and at least one antenna operatively connected with the at least one modem; wherein the autonomous vehicle is configured to: determine, using at least one processor, a performance parameter indicative of a performance of a communication between the at least one antenna and a network node; and control, using the at least one processor, based on the performance parameter, one or more of: a position of the at least one antenna, and a connection of the at least one antenna to the at least one modem.
| 14. A method comprising: determining, by at least one processor, a performance parameter indicative of a performance of a communication between at least one antenna and a network node; and controlling, using the at least one processor, based on the performance parameter, one or more of: a position of the at least one antenna, and a connection of the at least one antenna to at least one modem.
| 15. The method of claim 14, wherein controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem comprises: controlling the position of the at least one antenna relative to a position of the at least one modem.
| 16. The method of any one of claims 14-15, wherein controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem comprises: controlling one or more of: an orientation of the at least one antenna, a phase of the at least one antenna, an angle of the at least one antenna, and a pose of the at least one antenna.
| 17. The method of any one of claims 14-16, wherein controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem comprises: rotating the at least one antenna.
| 18. The method of any one of claims 14-17, wherein controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem comprises: controlling the connection of the at least one antenna to the at least one modem.
| 19. The method of any one of claims 14-18, wherein controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem comprises: determining if the performance parameter satisfies a criterion; and in response to determining that the performance parameter satisfies the criterion, controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem.
| 20. The method of claim 19, wherein determining if the performance parameter satisfies a criterion comprises: determining if the performance parameter is below a performance threshold; and in response to determining that the performance parameter is below the performance threshold, determining that the performance parameter satisfies the criterion. | The system (600) has an antenna (602) operatively connected with a modem (612A). A processor determines a performance parameter indicative of performance of communication between the antenna and a network node. The processor controls the position of the antenna based on the performance parameter and the connection of the antenna to the modem, where the system controls the position of the antenna relative to the position of the modem by controlling the orientation of the antenna, the phase of the antenna, and the angle of the antenna. A switch (614A1) is coupled to the antenna and the modem. Multiple antennas are associated with two different carriers. Each carrier operates on a different frequency band. Multiple modems and multiple antennas are provided. INDEPENDENT CLAIMS are also included for: an autonomous vehicle; and a method for operating an antenna system in an autonomous vehicle. Antenna system for an autonomous vehicle (claimed). The system is robustly implemented in the vehicle, and enables the vehicle to operate efficiently in an environment in which a large amount of wireless connectivity is required. The antenna provides 360-degree coverage around the vehicle in a cost effective manner. The drawing shows a schematic view of a dynamic antenna system. 600 Antenna system, 602A, 602B Antennas, 610 Electronic control unit, 612A, 612B Modems, 614A1-614A4, 614B1-614B4 Switches
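Claims 7-8 and 19-20 describe a simple control loop: when the performance parameter falls below a threshold, adjust the antenna's position and/or its connection to a modem. The following sketch is a hypothetical rendering of that loop; the 15-degree rotation step and the round-robin modem switch are assumptions, not the claimed policy.

```python
def adjust_antenna(performance, threshold, current_angle_deg, modem_index, num_modems,
                   step_deg=15.0):
    """If the performance parameter (e.g. a normalized link-quality metric) is below
    the threshold, rotate the antenna and re-route it to another modem via the switch;
    otherwise leave everything unchanged (criterion not satisfied)."""
    if performance >= threshold:
        return current_angle_deg, modem_index
    new_angle = (current_angle_deg + step_deg) % 360.0   # re-orient the antenna
    new_modem = (modem_index + 1) % num_modems           # reconnect through the switch
    return new_angle, new_modem

angle, modem = 90.0, 0
for measured in [0.9, 0.7, 0.4, 0.3]:     # degrading link quality (normalized)
    angle, modem = adjust_antenna(measured, threshold=0.5,
                                  current_angle_deg=angle, modem_index=modem,
                                  num_modems=2)
print(angle, modem)   # -> 120.0 0 after two corrective adjustments
```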
Please summarize the input | Trajectory planning of vehicles using route information - Abstract Title: Trajectory planning of vehicles using route information
An autonomous vehicle receives information from vehicle sensors indicating the presence of an object (e.g. bus, delivery vehicle, bicycle) nearby. The object is identified (e.g. a bus with a particular model number) and the expected route of the object is retrieved (e.g. from a database, vehicle-to-vehicle communications or a mobile device associated with the object). The current trajectory is estimated using the on-board sensors and is compared with the expected route of the object, including approximating where the object may be in 5 seconds. The information is used to plan the trajectory of the autonomous vehicle (e.g. to avoid collisions). The expected route may be a one-time or temporary route (e.g. recreational drivers, delivery vehicles, construction vehicles) or a re-occurring route (e.g. postal services, busses). A confidence level may be assigned based on an accuracy and/or reliability of the expected route of the object. WHAT IS CLAIMED IS: 1. A method comprising: receiving, by at least one processor, information indicating a presence of an object operating in an environment; determining, by the at least one processor, a trajectory of the object, the trajectory including at least a position, a speed, and a direction of travel of the object; determining, by the at least one processor, an expected route of the object, wherein the expected route is pre-planned and includes an expected future position of the object at a future time; comparing the trajectory of the object to the expected route of the object; and in accordance with the comparison that the trajectory of the object is consistent with the expected route of the object, updating the trajectory of the object based on the expected route of the object.
2. The method of claim 1, wherein determining the expected route of the object includes receiving route information from a server.
3. The method of any of the preceding claims, wherein the determination of the expected route is based on received route information from a transceiver or a mobile device associated with the object.
4. The method of any of the preceding claims, wherein the future time is at least 5 seconds in the future. 5. The method of any of the preceding claims, wherein comparing the trajectory of the object to the expected route of the object includes determining that the position of the object is an expected position along the expected route.
6. The method of any of the preceding claims, wherein comparing the trajectory of the object to the expected route of the object includes determining that a velocity of the object is an expected velocity along the expected route.
7. The method of any of the preceding claims, wherein the received information is from at least one sensor of a host vehicle.
8. The method of any of the preceding claims, wherein the at least one processor is part of a remote server.
9. The method of any of the preceding claims, wherein the received information is from a transceiver or mobile device associated with the object.
10. The method of any of the preceding claims, further comprising: determining if the received information is sufficient to determine an expected route of the object; and in accordance with a determination that the received data is not sufficient to determine the expected route of the object, receiving additional information of at least one state of the object.
11. The method of claim 10, wherein the received additional information is from at least one sensor of a host vehicle.
12. The method of claim 10 or claim 11, further comprising, in accordance with the determination that the received data is not sufficient to determine the expected route of the object, transmitting the received additional information to a machine learning module for object classification.
13. The method of any of the preceding claims, further comprising, in accordance with the comparison that the trajectory of the object is consistent with the expected route of the object, determining an uncertainty of the updated trajectory.
14. The method of any of the preceding claims, further comprising determining a reliability of the expected route.
15. The method of any of the preceding claims, further comprising transmitting the updated trajectory information of the object.
16. A non-transitory computer-readable storage medium comprising at least one program for execution by at least one processor of a first device, the at least one program including instructions which, when executed by the at least one processor, cause the first device to perform the method of any of the preceding claims.
17. A vehicle comprising: at least one sensor configured to capture information of an object; at least one transceiver configured to transmit and receive route information of the object; and at least one processor communicatively coupled to the at least one sensor and the at least one transceiver and configured to execute computer executable instructions, the execution carrying out the method of any of claims 1-15. | The method (800) involves receiving (802), by a processor, information indicating a presence of an object operating in an environment. A trajectory of the object is determined (804), the trajectory including a position, a speed, and a direction of travel of the object. An expected route of the object is determined (806) by the processor, and the expected route is pre-planned and includes an expected future position of the object at a future time. The trajectory of the object is compared (808) to the expected route of the object. The trajectory of the object is updated (810) based on the expected route of the object in accordance with the comparison that the trajectory of the object is consistent with the expected route of the object. An INDEPENDENT CLAIM is included for a non-transitory computer-readable storage medium storing a program for trajectory planning of vehicles using route information. Method for trajectory planning of a vehicle (claimed), e.g. autonomous vehicle (AV), car, drone, shuttle, train, 4-wheel-drive pickup truck, sport utility vehicle (SUV) and bus, using route information. Operation around objects in the environment is safer, since the autonomous vehicle can avoid interfering with the expected routes of the objects. The computing devices located on the AV algorithmically generate control actions based on both real-time sensor data and prior information, thus allowing the AV system to execute autonomous driving capabilities. The cloud includes cloud data centers along with the network and networking resources, thus facilitating the computing systems' access to cloud computing services. The drawing shows a flow diagram illustrating the method for trajectory planning of vehicles using route information. 800 Method for trajectory planning of vehicles using route information, 802 Step for receiving information indicating a presence of an object operating in an environment, 804 Step for determining a trajectory of the object, 806 Step for determining an expected route of the object, 808 Step for comparing the trajectory of the object to the expected route of the object, 810 Step for updating the trajectory of the object based on the expected route of the object
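The core of claim 1, refined by claims 5 and 6, is a consistency test between the observed trajectory and the pre-planned route, followed by an update that predicts the object from the route rather than from instantaneous motion. The sketch below is an illustrative reading of that test; the waypoint representation and the tolerances are assumptions.

```python
import math

def is_consistent(obj_pos, obj_speed, expected_route, pos_tol=3.0, speed_tol=2.0):
    """Claim 5/6-style check: the object is consistent with its expected route if
    its position and speed match some route waypoint within a tolerance."""
    for wp in expected_route:                  # wp: dict with keys x, y, speed
        close = math.hypot(obj_pos[0] - wp["x"], obj_pos[1] - wp["y"]) <= pos_tol
        speed_ok = abs(obj_speed - wp["speed"]) <= speed_tol
        if close and speed_ok:
            return True
    return False

def updated_trajectory(expected_route, horizon):
    """If consistent, take the object's future positions from the route itself
    (claim 1) instead of extrapolating its current velocity."""
    return expected_route[:horizon]

route = [{"x": 0.0, "y": 0.0, "speed": 10.0},
         {"x": 10.0, "y": 0.5, "speed": 10.0},
         {"x": 20.0, "y": 1.0, "speed": 8.0}]
if is_consistent(obj_pos=(9.0, 0.0), obj_speed=9.5, expected_route=route):
    print(updated_trajectory(route, horizon=3))
```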
Please summarize the input | Road surface condition guided decision making and prediction - Abstract Title: ROAD SURFACE CONDITION GUIDED DECISION MAKING AND PREDICTION
A surface detection system for an autonomous vehicle comprising at least one sensor, a computer readable medium and at least one processor. The system receives data from the sensor associated with a surface along which the vehicle is travelling (1402). Using a surface classifier to determine a classification of the surface based on the sensor data (1404). Based on this data determining the drivability of the surface (1406) and planning based on the drivability the behaviour of the vehicle on the surface (1408). Controlling the vehicle over the surface (1410). The system may also receive data from a network outside of the vehicle and from other vehicles (V2V). |
WHAT IS CLAIMED IS: A system, comprising: at least one sensor; at least one computer-readable medium storing computer-executable instructions; at least one processor configured to communicate with the at least one sensor and to execute the computer executable instructions, the execution carrying out operations including: receiving, from the at least one sensor, sensor data associated with a surface along a path to be traveled by a vehicle; using a surface classifier to determine a classification of the surface based on the sensor data; determining, based on the classification of the surface, drivability properties of the surface, planning, based on the drivability properties of the surface, a behavior of the vehicle when driving near the surface or on the surface; and controlling the vehicle based on the planned behavior.
|
2. The system of claim 1, wherein determining, based on the surface classification, drivability properties of the surface comprises: generating a surface map that includes at least one of: a list of geometric descriptions of the surface or a distribution of the drivability properties on the path of the vehicle.
|
3. The system of claim 1 or claim 2, wherein the surface classification includes a known surface, and wherein determining, based on the surface classification, drivability properties of the surface comprises: obtaining, from a database, the drivability properties associated with the known surface.
|
4. The system of any preceding claim, wherein the surface classification is an unknown surface, and wherein determining, based on the surface classification, the drivability properties of the surface comprises: determining, from a database, sensor measurements included in a label of the unknown surface, wherein the sensor measurements are historical sensor measurements associated with the unknown surface; and determining the drivability properties of the unknown surface based on the sensor measurements.
|
5. The system of claim 4, wherein the historical sensor measurements are measured by the vehicle or received from another vehicle.
|
6. The system of any preceding claim, wherein planning, based on the drivability properties of the surface, a behavior of the vehicle when driving near the surface or on the surface comprises: determining, based on the drivability properties, a vehicle motion that is associated with a safety or performance value that is greater than a current safety or performance value associated with a current vehicle motion.
|
7. The system of any preceding claim, wherein the surface is a first surface, and wherein planning, based on the drivability properties of the surface, a behavior of the vehicle when driving near the surface or on the surface comprises: determining a historical vehicle motion performed on a second surface that has properties similar to the drivability properties of the first surface.
|
8. The system of any preceding claim, wherein the vehicle is a first vehicle, and wherein planning, based on the drivability properties of the surface, a behavior of the vehicle when driving near the surface or on the surface comprises: detecting a second vehicle in proximity of the first vehicle; determining, based on the drivability properties of the surface, an expected motion of the second vehicle; and determining, based on the expected motion of the second vehicle, the behavior of the first vehicle.
|
9. The system of any preceding claim, wherein the surface classifier receives, from the at least one sensor, sensor measurements performed when the vehicle drives over the surface.
|
10. The system of any preceding claim, wherein the surface classification is a known surface classification, and wherein the operations further comprise: updating, based on the sensor measurements, a classifier associated with the surface classification.
|
11. The system of any of claims 1 to 9, wherein the surface classification is an unknown surface, and wherein the operations further comprise: adding the sensor measurements to a label associated with the unknown surface.
|
12. The system of any preceding claim, the operations further comprising: receiving from a shared dynamic database at least one of a road surface classification information or known surface property information.
|
13. The system of any preceding claim, wherein the vehicle is a first vehicle, and wherein the operations further comprise: capturing, using the at least one sensor, a motion of a second vehicle that is driving on the surface. 14. The system of any preceding claim, the operations further comprising: sending to a shared dynamic database at least one of: surface property feedback or vehicle motion feedback when the vehicle drives on the surface.
15. A method comprising: receiving, from at least one sensor of a vehicle, sensor data associated with a surface along a path to be traveled by a vehicle; using a surface classifier to determine a classification of the surface based on the sensor data; determining, based on the classification of the surface, drivability properties of the surface; planning, based on the drivability properties of the surface, a behavior of the vehicle when driving near the surface or on the surface; and controlling the vehicle based on the planned behavior.
16. A non-transitory computer-readable storage medium comprising at least one program for execution by at least one processor of a first device, the at least one program including instructions which, when executed by the at least one processor, cause the first device to perform the method of claim 15. | The system (120) has at least one sensor (121), and a computer-readable medium storing computer-executable instructions. A processor is configured to communicate with the at least one sensor and execute the computer-executable instructions. The execution carries out operations including receiving, from the sensor, sensor data associated with a surface along a path to be traveled by a vehicle (100). A surface classifier is used to determine a classification of the surface based on the sensor data. The drivability characteristics of the surface are determined based on the classification of the surface. A behavior of the vehicle when driving near the surface or on the surface is planned based on the drivability characteristics of the surface. The vehicle is controlled based on the planned behavior. INDEPENDENT CLAIMS are included for the following: a method for decision making and prediction with control based on road surface condition; and a non-transitory computer-readable storage medium storing a program for executing a method for decision making and prediction with control based on road surface condition. Autonomous vehicle system for decision making and prediction with control based on road surface condition. The vehicle's behavior is adjusted based on dynamically changing road surfaces and conditions that affect safety and drivability. The vehicle can predict the behavior of other vehicles driving on the surface and can proactively adjust its behavior accordingly, based on the drivability characteristics of the surface. The system improves vehicle safety and reliability, particularly when driving in hazardous environments. The system reduces the chances of collisions and improves vehicle reliability and safety. The automated driving systems achieve better safety. The system ensures better decision-making, obeying traffic rules and predicting future events better than humans, and reliably controls a vehicle better than a human. The movement planner determines vehicle behavior that causes the vehicle to travel in the lane of other vehicles while it is snowing. The drawing shows a schematic view of an autonomous vehicle with autonomous capability. (Drawing includes non-English language text) 100 Vehicle, 120 AV system, 121 Sensor, 122 Stereo video camera, 132 Computer peripheral
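The pipeline in claim 15 (classify the surface, look up or infer drivability properties, plan, control) maps naturally onto a small piece of code. The sketch below is purely illustrative: the reflectivity rule standing in for the trained surface classifier and the friction/speed numbers in the lookup table are assumptions, not values from the specification.

```python
# Assumed drivability properties for a few known surface classes (claim 3).
DRIVABILITY = {
    "dry_asphalt": {"friction": 0.9, "max_speed_mps": 30.0},
    "wet_asphalt": {"friction": 0.6, "max_speed_mps": 22.0},
    "snow":        {"friction": 0.3, "max_speed_mps": 12.0},
}

def classify_surface(sensor_features):
    """Stand-in for the surface classifier: a trivial rule on an assumed
    'reflectivity' feature instead of a learned model."""
    r = sensor_features.get("reflectivity", 0.0)
    if r > 0.8:
        return "snow"
    return "wet_asphalt" if r > 0.4 else "dry_asphalt"

def plan_behavior(surface_class, current_speed_mps):
    """Plan a behavior that respects the surface's drivability properties
    (claims 1 and 6): here, cap the target speed."""
    props = DRIVABILITY.get(surface_class, {"friction": 0.3, "max_speed_mps": 10.0})
    return {"surface": surface_class,
            "target_speed_mps": min(current_speed_mps, props["max_speed_mps"]),
            "friction": props["friction"]}

plan = plan_behavior(classify_surface({"reflectivity": 0.85}), current_speed_mps=25.0)
print(plan)   # the controller would then track plan["target_speed_mps"]
```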
Please summarize the input | Hailing a vehicle - In general, an indication is received through a user interface of an intention of a potential rider to use an autonomous vehicle. In response to the receipt of the indication, a hailing request is sent by a signaling mode to at least one autonomous vehicle that can receive the hailing request directly in accordance with the signaling mode. The invention claimed is:
| 1. A stationary apparatus comprising:
at least one processor;
a screen; and
at least one non-transitory storage medium storing instructions that, when executed by the at least one processor, cause the at least one processor to:
display, on the screen, a user interface for hailing a vehicle;
receive, from an operator of the stationary apparatus operating the user interface, an indication for a request for a pick-up by a vehicle at a location proximate to the stationary apparatus;
transmit the request for the pick-up to a plurality of vehicles;
receive a response from a responsive vehicle of the plurality of vehicles accepting the request for the pick-up, wherein the response is received after the responsive vehicle exchanges at least one message with another vehicle of the plurality of vehicles ensuring that multiple vehicles do not respond to the request for the pick-up; and
display, on the screen, the response to the operator of the stationary apparatus.
| 2. The stationary apparatus of claim 1, wherein the indication for the request for the pick-up comprises an indication of one or more of a rider, a number of riders, a destination location, a class of service, and a time to arrive.
| 3. The stationary apparatus of claim 1, wherein the instructions that cause the at least one processor to transmit the request for the pick-up to the plurality of vehicles cause the at least one processor to broadcast the request directly to the plurality of vehicles.
| 4. The stationary apparatus of claim 3, wherein the instructions that cause the at least one processor to broadcast the request directly to the plurality of vehicles cause the processor to broadcast the request using a vehicle to infrastructure (V2I) communications protocol.
| 5. The stationary apparatus of claim 1, wherein the stationary apparatus comprises a kiosk that resides at a fixed location.
| 6. A method performed by a stationary apparatus at a fixed location, the method comprising:
displaying, on a screen of a stationary apparatus, a user interface for hailing a vehicle;
receiving, from an operator of the stationary apparatus operating the user interface, an indication for a request for a pick-up by a vehicle at a location proximate to the stationary apparatus;
transmitting the request for the pick-up to a plurality of vehicles;
receiving a response from a responsive vehicle of the plurality of vehicles accepting the request for the pick-up, wherein the response is received after the responsive vehicle exchanges at least one message with another vehicle of the plurality of vehicles ensuring that multiple vehicles do not respond to the request for the pick-up; and
displaying, on the screen, the response to the operator of the stationary apparatus.
| 7. The method of claim 6, wherein the indication for the request for the pick-up comprises an indication of one or more of a rider, a number of riders, a destination location, a class of service, and a time to arrive.
| 8. The method of claim 6, wherein transmitting the request for the pick-up to the plurality of vehicles comprises broadcasting the request to the plurality of vehicles.
| 9. The method of claim 8, comprising broadcasting the request directly to the plurality of vehicles using a vehicle to infrastructure (V2I) communications protocol.
| 10. The method of claim 6, wherein the stationary apparatus comprises a kiosk.
| 11. At least one non-transitory storage medium storing instructions embodied in a stationary apparatus residing at a fixed location, the instructions, when executed by at least one processor, cause the at least one processor to:
display, on a screen, a user interface for hailing a vehicle;
receive, from an operator of the stationary apparatus operating the user interface, an indication for a request for a pick-up by a vehicle at a location proximate to the stationary apparatus;
transmit the request for the pick-up to a plurality of vehicles;
receive a response from a responsive vehicle of the plurality of vehicles accepting the request for the pick-up, wherein the response is received after the responsive vehicle exchanges at least one message with another vehicle of the plurality of vehicles ensuring that multiple vehicles do not respond to the request for the pick-up; and
display, on the screen, the response to the operator of the stationary apparatus.
| 12. The at least one non-transitory storage medium of claim 11, wherein the indication for the request for the pick-up comprises an indication of one or more of a rider, a number of riders, a destination location, a class of service, and a time to arrive.
| 13. The at least one non-transitory storage medium of claim 11, wherein the instructions that cause the at least one processor to transmit the request for the pick-up to the plurality of vehicles cause the at least one processor to broadcast the request to the plurality of vehicles.
| 14. The at least one non-transitory storage medium of claim 13, wherein the instructions that cause the at least one processor to broadcast the request to the plurality of vehicles cause the processor to broadcast the request directly to the plurality of vehicles using a vehicle to infrastructure (V2I) communications protocol.
| 15. The at least one non-transitory storage medium of claim 11, wherein the stationary apparatus comprises a kiosk. | The apparatus has a processor (232,280,282) for displaying a user interface (248) for hailing a vehicle (200) on a screen. The processor receives an indication for a request for a pick-up by the vehicle at a location proximate to the stationary apparatus from an operator of the stationary apparatus operating the user interface. The request for the pick-up is transmitted to a set of vehicles. A response is received from one of the set of the vehicles accepting the request for the pick-up. The response is displayed on the screen to the operator of the stationary apparatus. The indication comprises an indication of a rider, a number of riders, a destination location, a class of service, and a time to arrive. INDEPENDENT CLAIMS are included for:(1) a method performed by a stationary apparatus at a fixed location; and(2) a non-transitory storage media storing instructions. Stationary apparatus for hailing a vehicle, such as autonomous vehicle. Can also be used for a taxicab and a ride-sharing vehicle. The hailing request is received directly in accordance with the signaling mode from the device in the vicinity of the potential rider, and the pickup location is provided to the processor based on the hailing confirmation, thus allowing the user to hail the autonomous vehicle in an efficient manner. The method allows the user of the mobile device to provide hailing information to the vehicle, and allows the vehicle to provide a pickup location to autonomous driving features of the vehicle in a reliable manner. The drawing shows a block diagram of the vehicle.200Vehicle 202Stimulus detector 204Video sensor 232,280,282Processors 248User interface |
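The arbitration step in the claims above, in which vehicles exchange messages so that only one of them answers the kiosk, can be illustrated with a small sketch. Everything here is hypothetical: the bid function, the message exchange and the class names are stand-ins I introduce for illustration, and a real system would broadcast the request over a V2I/V2V protocol rather than call local Python functions.

```python
# Illustrative single-responder arbitration: each candidate vehicle computes a
# bid (here, squared distance to the kiosk), bids are exchanged among vehicles,
# and only the best bidder responds to the kiosk. Transports are stubbed.
from dataclasses import dataclass

@dataclass
class PickupRequest:
    request_id: str
    location: tuple   # (x, y) of the kiosk in a local planar frame
    riders: int

@dataclass
class Vehicle:
    vehicle_id: str
    position: tuple

    def bid(self, request: PickupRequest) -> float:
        # Toy cost: squared planar distance to the kiosk.
        dx = self.position[0] - request.location[0]
        dy = self.position[1] - request.location[1]
        return dx * dx + dy * dy

def arbitrate(request: PickupRequest, fleet: list) -> Vehicle:
    """Exchange bids among vehicles (the V2V step) and return the single responder."""
    bids = {v.vehicle_id: v.bid(request) for v in fleet}
    winner_id = min(bids, key=bids.get)
    return next(v for v in fleet if v.vehicle_id == winner_id)

kiosk_request = PickupRequest("req-1", (0.0, 0.0), riders=2)
fleet = [Vehicle("av-1", (0.5, 0.5)), Vehicle("av-2", (0.1, 0.2))]
print(arbitrate(kiosk_request, fleet).vehicle_id)   # -> av-2
```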
Please summarize the input | Permission authentication and public key-based integrity self-driving vehicle IoT firmware update generation device and method through Hyperledger Fabric blockchain module. When the firmware of a conventional self-driving car is updated through the vehicle's external communication network, hacking such as DDoS (Distributed Denial of Service) attacks, vehicle ID (identification) falsification, GPS manipulation, and information collection can be performed on the self-driving car's firmware; this can damage the entire self-driving system, for example by causing the vehicle to move to a location other than the desired destination or by spreading malicious code. In addition, updating the IoT-type firmware modules of the self-driving car can generate excessive traffic on the vehicle's external communication network, causing network bottlenecks and preventing timely responses to the firmware update requests provided by firmware suppliers. To improve these problems, the invention consists of an MKG-type IoT firmware brokerage control module (100), a firmware brokerage control module for firmware suppliers (200), and a firmware-type Hyperledger Fabric blockchain module (300). The MKG-type IoT firmware brokerage control module is connected to a 5G Wifi communication network and, through the firmware brokerage control module for firmware suppliers and the firmware-type Hyperledger Fabric blockchain module, makes it possible to generate permission authentication and public key-based integrity autonomous vehicle IoT firmware updates and to provide forensic data for proving the cause of accidents in self-driving cars. Because the MKG-type IoT firmware brokerage control module is connected to, and acts in place of, the IoT-type firmware modules of the self-driving car, the transaction overload problem is solved and management of the IoT-type firmware modules is facilitated. By using bridge peer nodes, high availability of 80% compared to the existing system can be guaranteed without a single point of failure on public networks, private networks, and the Hyperledger Fabric network. The system is designed to receive and process the token-type objects of the MKG-type IoT firmware brokerage control module and the firmware update request signals in the order of permission and authentication applications in the Hyperledger Fabric blockchain, so that bottlenecks caused by excessive traffic in the vehicle's external communication network, and in public, private, and Hyperledger Fabric networks, can be reduced to less than 70% of existing levels. Only authorized nodes can participate in the public, private, and Hyperledger Fabric networks, and since only some nodes execute the chain code of the smart contract function, multiple transactions can be processed quickly in parallel; the ledger is disclosed only to authorized nodes using channels, and the identity of network-participating nodes can be confirmed, so that responsibility can be clearly determined when a problem occurs. The purpose is to provide a permission authentication and public key-based integrity autonomous vehicle IoT firmware update generation device and method through a firmware-type Hyperledger Fabric blockchain module.
| 1. A permission authentication and public key-based integrity autonomous vehicle IoT firmware update generation device through a firmware-type Hyperledger Fabric blockchain module, the device being connected to the IoT firmware modules that perform the cognitive control, learning/judgment control, and autonomous driving control of the autonomous vehicle, receiving firmware data on the presence or absence of firmware updates and on firmware versions, together with image data from before and after an accident of the autonomous vehicle, and being formed to download firmware update files directly from a firmware supplier that supplies the firmware update files, based on additional permission certification and a public key, after receiving permission certification from the Hyperledger Fabric blockchain.
| 2. According to claim 1, wherein the permission authentication/public key-based integrity autonomous vehicle IoT firmware update generation device is connected to the IoT-type firmware module of the autonomous vehicle, replacing the IoT-type firmware module of the autonomous vehicle, A token-type object is created by consolidating the data of the car's IoT-type firmware modules into one, and after applying for permission authentication to the firmware-type Hyperledger Fabric blockchain module, the created token-type object and firmware update request signal are sent to the firmware-type Hyperledger Fabric block. MKG-type IoT firmware brokerage control that transmits the data to the chain module, receives the public key for downloading the firmware update file from the firmware-type Hyperledger Fabric blockchain module, and mediates and controls the firmware update file to be downloaded from the firmware brokerage control module for firmware suppliers. The module 100 is connected to the firmware supplier's smart device and, on behalf of the firmware supplier, applies for permission authentication to the firmware-type Hyperledger Fabric blockchain module, Checks whether the firmware supplier is registered in the Hyperledger Fabric blockchain, registers the firmware update file provided by the firmware supplier, and sends the firmware update file to the authorized MKG-type IoT firmware through the firmware-type Hyperledger Fabric blockchain module. It is located between the firmware intermediary control module 200 for firmware suppliers, which mediates and controls distribution to the intermediary control module, the firmware intermediary control module for firmware suppliers, and the MKG-type IoT firmware intermediary control module, and creates a chain code according to automatic agreement. After forming the (Chain Code), register and distribute the firmware update file sent to the firmware brokerage control module for the firmware supplier, and block the token-type object and firmware update request sent from the MKG-type IoT firmware brokerage control module to the node. Create an ordering service node with a structure, Permission through a firmware-type Hyperledger Fabric blockchain module, which consists of a firmware-type Hyperledger Fabric blockchain module 300 that creates a permission-type blockchain that allows only authorized nodes to participate in the Hyperledger Fabric network. Authentication/public key-based integrity autonomous vehicle IoT firmware update generation device.
| 3. According to claim 2, the MKG-type IoT firmware mediation control module 100 is connected to the IoT-type firmware module of the self-driving car, and provides firmware data regarding the presence or absence of firmware update and version sensed by the IoT-type firmware module of the self-driving car. Forms an interface that receives video data before and after the accident of an autonomous vehicle, video and audio data inside and outside the vehicle, and traffic data from internal and external network communication through V2X communication or IVN (In-Vehicle Network). The RS-232 interface unit 110 and the UART signal and SPI signal are connected to IEEE802. A 5G Wifi communication forming unit 120 for IoT firmware that converts to the 11b/g/n wireless LAN protocol and forms a firmware brokerage control module for firmware suppliers, a firmware-type Hyperledger Fabric blockchain module, and a 5G Wifi communication network, and RS -232 From the interface unit, the presence or absence of a firmware update, which is the sensing data of IoT-type firmware modules of the self-driving car, firmware data regarding the version, video data before and after the accident of the self-driving car, and video and audio data inside and outside the vehicle, a firmware data token-type object creation control unit 130 that receives internal and external network communication traffic data through V2X communication or IVN (In-Vehicle Network) and controls it to generate a token-type object by consolidating it into one, and an autonomous vehicle Receives firmware version data from the IoT-type firmware module, compares and analyzes it with the latest version value, and if it is a lower version, A firmware update request control unit 140 that receives permission from the certificate-based Hyperledger Fabric blockchain through a CA (certification authority) and controls a firmware update request, a token-type object created by the firmware data token-type object creation control unit, and, a permission authentication application control unit 150 for IoT firmware that controls whether the firmware update request signal from the firmware update request control unit can be transmitted to the firmware type Hyperledger Fabric blockchain module to request permission certification, and a firmware type Hyperledger Fabric block. Receive the public key and metadata for downloading the firmware update file from the chain module, Permission authentication through a firmware-type Hyperledger Fabric blockchain module, characterized in that it consists of a Check Firmware Download Algorithm engine unit 160 that forms a firmware update file to be downloaded from the firmware brokerage control module for the firmware supplier. ·Public key-based integrity autonomous vehicle IoT firmware update generation device.
| 4. The method of claim 2, wherein the firmware intermediary control module 200 for the firmware supplier transmits the UART signal and the SPI signal to IEEE802. A 5G Wifi communication formation unit 210 for firmware suppliers that converts to 11b/g/n wireless LAN protocol to form a firmware-type Hyperledger Fabric blockchain module, MKG-type IoT firmware intermediary control module, and 5G Wifi communication network, and firmware Controls whether the supplier is registered in the Hyperledger Fabric blockchain and whether the firmware update file provided by the firmware supplier can be registered in the Hyperledger Fabric blockchain by sending it to the Hyperledger Fabric blockchain to request permission authentication. After checking whether the firmware supplier is registered in the Hyperledger Fabric blockchain, the permission authentication application control unit 220 for the firmware supplier, and a set vendor algorithm (A SetVendor Algorithm engine unit 230, a firmware update algorithm engine unit 240 that updates metadata of the firmware update file in the Hyperledger Fabric blockchain and creates a transaction, Firmware-type Hyperledger, characterized in that it consists of a firmware update file distribution control unit 250 that controls the distribution of the firmware update file directly to the authorized MKG-type IoT firmware mediation control module through the firmware-type Hyperledger Fabric blockchain module. Permission authentication and public key-based integrity autonomous vehicle IoT firmware update generation device through fabric blockchain module.
| 5. According to claim 2, the firmware-type Hyperledger Fabric blockchain module 300 registers and distributes firmware update files of firmware mediation control modules for firmware suppliers to be shared, and firmware of IoT-type firmware modules of autonomous vehicles. Firmware data regarding update status and version, video data before and after the accident of an autonomous vehicle, video and audio data inside and outside the vehicle, and internal and external network communication through V2X communication or IVN (In-Vehicle Network). A node structure is created by blocking the Hyperledger Fabric-type distributed ledger unit 310, which records all changes in traffic data, and the Hyperledger Fabric-type distributed ledger unit 310, which records all changes in traffic data. An ordering service node 320 that creates an ordering block, determines the order of transactions in the ordering block, and hosts an ordering service that is delivered to connected nodes, a firmware brokerage control module for firmware suppliers, It forms a chain code based on automatic agreement between MKG-type IoT firmware mediation control modules, processes business logic agreed upon by nodes participating in the Hyperledger Fabric network, and creates a new distributed ledger unit in the Hyperledger Fabric type. The chain code unit 330, which is responsible for updating content or reading existing content, manages the Hyperledger Fabric-type distributed ledger and chain code on the Hyperledger Fabric network, and operates at the ordering service node. Permission authentication through a firmware-type Hyperledger Fabric blockchain module, which consists of a peer node 340 that verifies the created block and stores a Hyperledger Fabric-type distributed ledger based on the block. ·Public key-based integrity autonomous vehicle IoT firmware update generation device.
| 6. According to claim 5, the peer node 340 is an endorsing peer node 341 that determines whether a transaction is appropriate through chain code simulation and performs verification of the latest block. A committing peer node 342, which plays the role of communicating with a Hyperledger Fabric network located in another organization, and an anchor peer node, which plays a role of communicating with a peer node located in another organization. peer node) (343), a leader peer node (344) that is connected to the ordering service node and receives the latest block and transmits it to other peer nodes in the organization, Hyperledger Fabric blockchain, Hyperledger Fabric Blockchain, which consists of devices that connect public blockchains and private blockchains, Based on permission authentication and public key through a firmware-type Hyperledger Fabric blockchain module, which consists of a bridge peer node (345) that forms one or two of the public blockchain and private blockchain to participate in the selected blockchain. Type integrity autonomous vehicle IoT firmware update generation device.
| 7. The MKG-type IoT firmware brokerage control module is connected to the IoT-type firmware module of the self-driving car, and instead of the IoT-type firmware module of the self-driving car, the data of the IoT-type firmware modules of the self-driving car is integrated into one to create a token-type object. A step of creating (S10), a step of applying for permission authentication to the firmware-type Hyperledger Fabric blockchain module through the MKG-type IoT firmware brokerage control module (S20), and permission authentication of the firmware-type Hyperledger Fabric blockchain module. Then, a step (S30) of transmitting the token-type object and firmware update request signal generated by the MKG-type IoT firmware brokerage control module to the firmware-type Hyperledger Fabric blockchain module, and the firmware brokerage control module for the firmware supplier is sent to the firmware supplier. A step of connecting to a smart device and applying for permission authentication to the firmware-type Hyperledger Fabric blockchain module on behalf of the firmware supplier (S40), and when permission certification of the firmware-type Hyperledger Fabric blockchain module is achieved, A step (S50) of checking whether the firmware supplier is registered in the Hyperledger Fabric blockchain in the firmware-type Hyperledger Fabric blockchain module and registering the firmware update file provided by the firmware supplier (S50), and the firmware-type Hyperledger Fabric block A step (S60) of sending the public key and metadata for downloading the firmware update file to the MKG-type IoT firmware mediation control module that sent the firmware update request signal from the chain module, and sending the firmware update file from the firmware supplier firmware mediation control module to the firmware-type When distributed to the authorized MKG-type IoT firmware brokerage control module through the Hyperledger Fabric blockchain module, the firmware update file is downloaded based on the public key and metadata for downloading the firmware update file from the MKG-type IoT firmware brokerage control module., Creation of permission authentication/public key-based integrity self-driving car IoT firmware update through a firmware-type Hyperledger Fabric blockchain module, which consists of installing the firmware update file of the IoT-type firmware module of the self-driving car (S70). method. | The device has a firmware-type Hyperledger (RTM: object-oriented programming language) fabric block that is connected to an Internet of things (IoT) firmware module to download firmware update files directly from a firmware supplier. The Fabric block that is formed to download firmware update files directly from a firmware supplier that supplies firmware update files based on additional permission certification and a public key after receiving permission certification from the Ledger Fabric blockchain. A token-type object is created by consolidating the data of the IoT-type firmware modules. The brokerage control transmits the data to the chain module, receives public key for downloading the firmware update file from firmware-type Hyperledger Fabric blockchain module. An INDEPENDENT CLAIM is included for a method for generating integrity of autonomous vehicle Internet of Things firmware update based on permission authentication and public key using Hyperledger fabric blockchain module. 
Device for generating integrity of autonomous vehicle Internet of Things (IoT) firmware update based on permission authentication and public key using Hyperledger fabric blockchain module. The device generates firmware updates and provides forensic data for proving the cause of accidents in self-driving cars. The ledger is disclosed only to authorized nodes, and the identity of nodes participating in the network can be verified, so that responsibility is clearly identified in the event of a problem. Bottlenecks occurring in public networks, private networks and Hyperledger Fabric networks are reduced. The drawing shows a block diagram of a device generating integrity of autonomous vehicle Internet of Things firmware update. (Drawing includes non-English language text) 1Device for generating integrity of autonomous vehicle100Brokerage control module200Control module300Chain Module |
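Claims 2-3 of the record above describe consolidating the vehicle's firmware-module data into a single token-type object and requesting an update only when the installed version is older than the latest one. The sketch below illustrates that bookkeeping only; the Hyperledger Fabric permission-authentication and chaincode calls are deliberately stubbed out, and all names, fields and version formats are assumptions for the example.

```python
# Hedged sketch of the brokerage-control flow: consolidate per-module firmware
# data into one token-type object, compare versions, and request an update
# only when the installed firmware is stale. Fabric SDK calls are not shown.
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class FirmwareStatus:
    module: str        # e.g. "cognitive_control"
    version: str       # installed version
    update_pending: bool

def build_token_object(vehicle_id: str, statuses: list) -> dict:
    """Consolidate per-module firmware data into a single token-type object."""
    payload = {"vehicle_id": vehicle_id,
               "modules": [asdict(s) for s in statuses]}
    digest = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return {"payload": payload, "token": digest}

def needs_update(installed: str, latest: str) -> bool:
    """Compare dotted version strings, e.g. '1.4.2' < '1.5.0'."""
    return tuple(map(int, installed.split("."))) < tuple(map(int, latest.split(".")))

statuses = [FirmwareStatus("cognitive_control", "1.4.2", False),
            FirmwareStatus("autonomous_driving_control", "2.0.0", False)]
token = build_token_object("AV-0042", statuses)
if needs_update("1.4.2", "1.5.0"):
    # Here the module would apply for permission authentication and send
    # `token` plus the update request to the Fabric network.
    print("update request:", token["token"][:16], "...")
```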
Please summarize the input | V2X communication system using OTP (one time password). The present invention relates to a V2X communication system using an OTP, and more particularly, to a V2X communication system using an OTP that is capable of not only preventing the malfunction of an electronic component mounted on a vehicle due to hacking but also further improving driving safety in an autonomous driving operation, by encrypting transmission data using the vehicle's own OTP and decrypting reception data using the counterpart's OTP, after receiving a secret key for the generation of the counterpart's unique OTP through a security relay center when the vehicle communicates with an external device.
REPRESENTATIVE DRAWING - Reference numerals: (10) Vehicle; (20) External device; (30) Security relay center; (AA) Request a secret key 2 used by a communication target external device; (BB) Request a secret key 1 used by a communication target vehicle; (CC) Transmit the secret key 2; (DD) Transmit the secret key 1; (EE) Encrypt the transmission data 1 using the OTP 1; (FF) Encrypt the transmission data 2 using the OTP 2; (GG) Generate transmission data 1 and generate an OTP 1; (HH) Receive the transmission data 2 and generate the OTP 2; (II) Decrypt the transmission data 2 using the OTP 2; (JJ) Receive the transmission data 1 and generate the OTP 1; (KK) Decrypt the transmission data 1 using the OTP 1; (LL) Generate transmission data 2 and generate an OTP 2
| 1. A V2X communication system using OTP, comprising: a vehicle (10) which includes an ECU (11) and a wireless communication module (12), encrypts data transmitted to an external device (20) with the vehicle's unique OTP1, and decrypts data received from the external device (20) with the external device's unique OTP2; and the external device (20), which includes a control unit (21) controlling the communication behavior with the vehicle (10) and a wireless communication module (22), encrypts data transmitted to the vehicle (10) with the external device's unique OTP2, and decrypts data received from the vehicle (10) with the vehicle's unique OTP1, wherein the ECU (11) controls the communication behavior with the external device (20).
| 2. The V2X communication system using OTP of claim 1, wherein the communication system further includes a security relay center (30) which stores the secret keys used by the vehicle (10) and the external device (20) to generate the OTP1 and the OTP2, and which transmits the corresponding secret key when it receives a request for the counterpart's secret key from the vehicle (10) or the external device (20), and wherein the security relay center (30) comprises a secret key store (32) storing all secret keys used by the vehicle (10) and the external device (20), and a communication unit (31) which transmits the corresponding secret key stored in the secret key store (32) to the vehicle (10) or the external device (20) when a request for transmission of the secret key is received.
| 3. The V2X communication system using OTP of claim 1, wherein the vehicle (10) comprises, in the ECU (11): an OTP module (13) which receives the secret key 2 of the external device (20) from the security relay center (30) for the V2X communication with the external device (20), generates the OTP1 at data transmission time by using the secret key 1 unique to the vehicle and stored in the OTP module (13), and generates the OTP2 at data reception time by using the secret key 2; an encryption unit (14) which encrypts data transmitted to the external device (20) with the OTP1 generated in the OTP module (13); and a decoder (15) which decrypts data received from the external device (20) with the OTP2 generated in the OTP module (13).
| 4. The V2X communication system using OTP of claim 1, wherein the external device (20) comprises, in the control unit (21): an OTP module (23) which receives the secret key 1 of the vehicle (10) from the security relay center (30) for the V2X communication with the vehicle (10), generates the OTP2 at data transmission time by using the secret key 2 unique to the external device and stored in the OTP module (23), and generates the OTP1 at data reception time by using the secret key 1; an encryption unit (24) which encrypts data transmitted to the vehicle (10) with the OTP2 generated in the OTP module (23); and a decoder (25) which decrypts data received from the vehicle (10) with the OTP1 generated in the OTP module (23).
| The system has an electronic control unit (ECU) for controlling communication behavior with an external device (20). A one-time password (OTP) module produces a first OTP using the vehicle's own secret key at data transmission time and produces a second OTP using the counterpart's secret key at data reception time. A wireless communication module transmits data to the external device, where the transmitted data is encrypted using the first OTP and the data received from the external device is decrypted using the second OTP. A control unit controls communication behavior with a vehicle (10). Vehicle-to-everything (V2X) communication system. The system prevents malfunction of electronic components mounted in the vehicle due to hacking, improves running stability during autonomous driving operation and provides security. The drawing shows a flow diagram illustrating operation of a V2X communication system. (Drawing includes non-English language text) AAStep for receiving transmission dataBBStep for decoding transmission data10Vehicle20External device30Security relay center |
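The claims above have each party encrypt outgoing data with an OTP derived from its own secret key and decrypt incoming data with an OTP derived from the counterpart's secret key obtained via the security relay center. The sketch below shows one way such an OTP-keyed transform could look, using an HMAC-derived keystream XORed over the payload; this is an illustration under assumed details, not the patent's actual algorithm, and it is not production-grade cryptography.

```python
# Illustration only: derive a time-window key from a shared secret with
# HMAC-SHA256 and XOR it over the payload. It mirrors the idea in the claims
# (encrypt with an OTP from one secret key, decrypt with an OTP from the
# counterpart's key fetched via the relay center), but is not the patent's
# actual scheme.
import hashlib
import hmac
import time

def otp_keystream(secret_key: bytes, counter: int, length: int) -> bytes:
    """Expand an HMAC over a shared time-window counter into `length` bytes."""
    stream = b""
    block = 0
    while len(stream) < length:
        msg = counter.to_bytes(8, "big") + block.to_bytes(4, "big")
        stream += hmac.new(secret_key, msg, hashlib.sha256).digest()
        block += 1
    return stream[:length]

def xor_crypt(data: bytes, secret_key: bytes, counter: int) -> bytes:
    ks = otp_keystream(secret_key, counter, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

secret_key_1 = b"vehicle-10-secret-key"        # vehicle's own secret (illustrative)
counter = int(time.time() // 30)               # both sides share the 30 s window
message = b"brake status: OK"

cipher = xor_crypt(message, secret_key_1, counter)   # vehicle encrypts with OTP1
plain = xor_crypt(cipher, secret_key_1, counter)     # peer decrypts with OTP1
assert plain == message
```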
Please summarize the input | SYSTEM AND METHODS TO APPLY ROBUST PREDICTIVE TRAFFIC LOAD BALANCING CONTROL AND ROBUST COOPERATIVE SAFE DRIVING FOR SMART CITIES. Apparatuses, systems and methods applying an anonymous-car, navigation-driven, traffic model predictive control combining predictive load balancing on road networks, which dynamically assigns efficient sets of routes to car-related navigation aids, where the navigation aids may refer to in-dash navigation or to smart phone navigation applications. The systems and methods may enable, for example, upgrading or substituting commercial navigation service solutions, applying under such upgrade or substitution a new highly efficient proactive traffic control for city-size traffic.|1. A method enabling, according to a predetermined procedure, an in-vehicle apparatus to perform a privileged tolling transaction with a toll charging center, while not exposing trip details, and transmitting position related data to a path control system, the method comprising: a. Receiving by an in-vehicle apparatus data associated with time related varying positions of a path which should be developed according to dynamic updates to an in-vehicle driving navigation aid, b. Tracking and storing positions along a trip by said in-vehicle apparatus, c. Comparing by said in-vehicle apparatus said tracked time related positions with time related positions associated with said path that should be developed according to updates to the driving navigation aid, d. Determining by said in-vehicle apparatus, according to a level of a match, privilege related toll charging data, e. Transmitting by said in-vehicle apparatus, using an IP address associated with the in-vehicle apparatus, a toll charging related data message which is vehicle identifying but not trip identifying, wherein the IP address differs from an IP address that is associated with the in-vehicle apparatus while in-vehicle positioning related data is transmitted anonymously.
| 2. An in-vehicle apparatus which according to 1 comprises: a. Mobile internet transceiver, b. GNSS positioning receiver, c. Processor and memory, d. Communication apparatus to communicate with an in-vehicle driving navigation aid.
| 3. A method and a system according to which conditions that improve traffic flow on a road network are encouraged by encouraging, directly or indirectly, usage of vehicles having in-vehicle driving navigation aids which interact with drivers, or with driving control means of autonomous vehicles, to guide trips of vehicles according to path controlled trips, the method comprises: a) receiving by an in-vehicle driving navigation aid data for dynamic path assignments, b) tracking by in-vehicle apparatus the actual path of the trip, c) comparing by in-vehicle apparatus the tracked path with the path complying with the dynamic path assignments along a trip, d) determining by in-vehicle apparatus the privilege entitling usage of the assigned path according to predetermined criteria for the level of the match determined by the comparison, e) transmitting by in-vehicle apparatus privilege related transaction data which expose no trip details.
| 4. A method and a system according to which traffic flow improvement conditions on a road network are encouraged by encouraging, directly or indirectly, usage of vehicles having in-vehicle driving navigation aids which interact with drivers, or with driving control means of autonomous vehicles, to guide trips of vehicles according to path controlled trips, the method comprises: a) receiving by an in-vehicle driving navigation aid data for dynamic path assignments, b) tracking by in-vehicle apparatus the actual path of the trip, c) comparing by in-vehicle apparatus the tracked path with the assigned path complying with the dynamic path assignments along a trip, d) determining by in-vehicle apparatus the privilege entitling usage of the assigned path according to predetermined criteria for the level of the match determined by the comparison.
| 5. A method according to 3 or 4, wherein said privilege is free of charge road toll.
| 6. A method according to 3 or 4, wherein said privilege is a discount in charged road toll.
| 7. A method according to 5 or 6, wherein an entitlement for privilege includes a criterion according to which travel on certain predetermined links requires that a trip will be stopped for a minimum predetermined time.
| 8. A method according to 7, wherein said predetermined links are links on which traffic is diluted.
| 9. A method according to 3 or 4, wherein a said vehicle is an autonomous vehicle classified as level 4 according to the Society of Automotive Engineers.
| 10. A method according to 3 or 4, wherein a said vehicle is an autonomous vehicle classified as level 5 according to the Society of Automotive Engineers.
| 11. A method according to 3 or 4, wherein path controlled trips tend to be coordinated by dynamic assignment of paths performed by coordinating path control.
| 12. A method according to 11, wherein traffic on the network tends to converge to traffic load balance.
| 13. A method according to 11, wherein a DTA simulator is used with traffic predictions of coordinating path control.
| 14. A method according to 13, wherein the DTA simulator includes models of motion of autonomous vehicles on roads and interactions of autonomous vehicles with other vehicles on roads.
| 15. A method according to 11, wherein gradual coordination is applied by determining current highest priority links which negatively contribute to traffic load balance subject to a given computation power.
| 16. A method according to 11, wherein dynamic assignments of paths use processes of coordination control iterations.
| 17. A method according to 16, wherein coordination control iterations apply fairness related processes.
| 18. A method according to 17, wherein processes of iterations of coordination control are also used.
| 19. A method according to 12, 13, 14, 16, 17 and 18, wherein paths are assigned to fictitious destinations on a fictitious road map which expands a real part of a road map with evacuation of traffic from a part of a network.
| 20. A method according to 12, 13, 14, 16, 17 and 18, wherein paths are assigned to fictitious destinations on a fictitious road map which expands a real part of a road map with traffic dilution of a part of a network.
| 21. A method and system according to which improved safe driving on a road network is encouraged by encouraging usage of safety aids, the method comprises: a) tracking by in-vehicle apparatus the actual use of said safety aid along the trip, b) determining by in-vehicle apparatus privilege related data for usage of said safety aid according to predetermined criteria entitling privilege for the level of usage, c) transmitting by in-vehicle apparatus privilege related transaction data which expose no trip details.
| 22. A method according to 21, wherein said privilege applies free of charge road toll.
| 23. A method according to 21, wherein said privilege is a discount in charged road toll.
| 24. A method according to 21, wherein safety aids are cooperative safety driving aids enabling to improve a single in-vehicle measurement of a safety driving aid by in-vehicle fusion of the measurement with one or more respective measurements performed by other one or more vehicles and received by a fusion apparatus through vehicle to vehicle communication.
| 25. A method according to 3,4 and 21, wherein privilege for usage refers to usage of both safety driving aids and path controlled trips.
| 26. A method according to which a path control system assigns paths to path controlled trips according to coordination control processes, wherein coordination control processes comprise iterative mitigation processes, and wherein an iteration of mitigation processes comprises determination of relatively loaded links, and wherein determination of relatively loaded links is associated with processes to determine time dependent traffic volumes to capacity ratios related data on network links along predicted time horizon which include feeding a calibrated Dynamic Traffic assignment (DTA) simulation by: a. non-mitigated pending paths, b. current and predicted assigned paths associated with path controlled trips, which are not associated with non mitigated pending paths.
| 27. A method according to 26, wherein paths fed to the DTA simulation include current and predicted non path controlled trips.
| 28. A method according to 26, wherein paths fed to the DTA simulation include current and predicted non coordinating path controlled trips.
| 29. A method according to 26, wherein determination of relatively loaded links further include determination of reference time dependent traffic volume to capacity ratios related data which include feeding a calibrated Dynamic Traffic assignment (DTA) simulation by: a. current and predicted assigned paths associated with path controlled trips, b. current and predicted non path controlled trips.
| 30. A method according to 29, wherein paths fed to the DTA simulation include current and predicted non coordinating path controlled trips.
| 31. A method according to 26, wherein an iteration of mitigation processes includes searching for new alternative paths to yet non-mitigated pending alternative paths, preferably by substantially simultaneous search processes, wherein time dependent travel times that are associated with a search are determined by synthesis of DTA based traffic prediction fed by paths which include current and predicted assigned paths associated with path controlled trips which include paths that are associated with mitigated paths up to the current iteration in the current cycle.
| 32. A method according to 31, wherein paths fed to the DTA simulation include current and predicted non path controlled trips.
| 33. A method according to 31, wherein paths fed to the DTA simulation include current and predicted non coordinating path controlled trips.
| 34. A method according to 31, wherein an iteration of mitigation processes includes: a. Determining a threshold related acceptance criterion to accept new alternative paths as a substitution to assigned path controlled trips, wherein the threshold is adaptively determined in order to enable controllable mitigation by the current iteration in perspective of one or more prior iterations, b. Accepting new alternative paths or pending alternative paths according to a predetermined acceptance procedure which may but not be limited to a threshold which enables to put a limit on acceptance of said new alternative paths, according to results from said search.
| 35. A method according to 34, wherein a threshold puts a limit on the maximum accepted reduction in potential travel time improvement in comparison to the potential travel time improvement that was assumed to be gained by said search for a path which became a non mitigated pending path.
| 36. An apparatus comprising means for performing the method of any one of claims 1 and 3-35. | The method involves receiving data associated with time related varying positions of a path which should be developed according to dynamic updates to an in-vehicle driving navigation aid, and tracking and storing positions along a trip. The tracked time related positions are compared with time related positions associated with the path that should be developed according to updates to the driving navigation aid. According to the comparison result, privilege related toll charging data are determined. By using an Internet Protocol (IP) address associated with the in-vehicle apparatus, a message which is characterized by being vehicle identifying and not trip identifying toll charging related data message is transmitted. The IP address differs from an IP address that is associated with the in-vehicle apparatus while in-vehicle positioning related data is transmitted anonymously. INDEPENDENT CLAIMS are also included for the following:an in-vehicle apparatus; anda method of improving traffic flow on a road network. Method, by in-vehicle apparatus (claimed), for performing privileged tolling transaction with toll charging center. Robust privacy preservation eliminates, or at least minimizes, possible negativism to conditional tolling, since with robust privacy preservation the non-exposure of trip details can be guaranteed or at least an exposure can be under control of the owner of the vehicle. Privacy preserving path control, supported by privacy preserving free of charge toll or toll discount, may reduce reluctance to apply and use path controlled trips usage and may thus enable to generate high usage of path controlled trips, which with improved traffic mapping and traffic prediction provide good conditions for high performance traffic load balancing. The drawing is a schematic figure which illustrates the coordination of path controlled trips preferably applied with a basic paths planning layer. 210bCoordinated paths transmission211Basic paths planning layer212Requested paths input213Traffic prediction travel time costs |
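Claims 1 and 3-4 of the record above determine a toll privilege from the level of match between the tracked trip and the dynamically assigned path. A minimal sketch of that comparison follows; the distance tolerance, the privilege thresholds and the planar-coordinate simplification are assumptions made only for illustration, not values from the patent.

```python
# Sketch: grade how closely the tracked trip followed the assigned path and
# map the match level to a toll privilege. Thresholds are illustrative.
import math

def _dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def match_level(tracked: list, assigned: list, tolerance: float = 25.0) -> float:
    """Fraction of tracked points lying within `tolerance` (metres, local planar
    frame) of the nearest assigned-path point."""
    if not tracked:
        return 0.0
    hits = sum(1 for p in tracked if min(_dist(p, q) for q in assigned) <= tolerance)
    return hits / len(tracked)

def toll_privilege(level: float) -> str:
    if level >= 0.95:
        return "free_of_charge"       # claim 5-style privilege
    if level >= 0.80:
        return "discounted_toll"      # claim 6-style privilege
    return "standard_toll"

assigned_path = [(0, 0), (0, 100), (0, 200), (100, 200)]
tracked_trip = [(2, 5), (1, 95), (3, 198), (96, 201)]
print(toll_privilege(match_level(tracked_trip, assigned_path)))  # free_of_charge
```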
Please summarize the input | MAP UPDATE USING IMAGES. Methods and apparatuses associated with updating a map using images are described. An apparatus can include a processing resource and a memory resource having instructions executable to a processing resource to monitor a map including a plurality of locations, receive, at the processing resource, the memory resource, or both, and from a first source, image data associated with a first location, identify the image data as being associated with a missing portion, an outdated portion, or both, of the map, and update the missing portion, the outdated portion, or both, of the map with the image data. What is claimed is:
| 1. An apparatus, comprising:
a processing resource; and
a memory resource in communication with the processing resource having instructions executable to:
monitor a map including a plurality of locations;
receive, at the processing resource, the memory resource, or both, and from a first source, image data associated with a first location;
identify the image data as being associated with a missing portion, an outdated portion, or both, of the map; and
update the missing portion, the outdated portion, or both, of the map with the image data.
| 2. The apparatus of claim 1, wherein the image data associated with the first location includes metadata identifying a physical location and viewing direction of the image data.
| 3. The apparatus of claim 2, wherein the instructions are executable to identify the image data as being associated with the missing portion, the outdated portion, or both, by matching the metadata associated with the image data with location and viewing direction data associated with the missing portion, the outdated portion, or both.
| 4. The apparatus of claim 1, further comprising the instructions executable to update a machine learning model associated with the map in response to receipt of the image data associated with the first location.
| 5. The apparatus of claim 1, wherein the first source is a sensor in communication with an autonomous vehicle.
| 6. The apparatus of claim 1, further comprising instructions executable to update the map as new image data associated with the first location is received.
| 7. The apparatus of claim 6, further comprising instructions executable to detect changes to the first location based on the map update and the new image data received.
| 8. The apparatus of claim 1, further comprising instructions executable to update the map and create a time-lapse version of the map as new image data associated with the particular location and the plurality of locations is received.
| 9. A non-transitory machine-readable medium comprising a processing resource in communication with a memory resource having instructions executable to:
receive, at the processing resource, the memory resource, or both, a first plurality of images from a plurality of sources;
determine, at the processing resource, the memory resource, or both, a second plurality of images of a map monitored by the processing resource, the memory resource, or both, that have not been updated within a particular time period;
search the first plurality of images and a database of previously received images for the second plurality of images of the map;
in response to finding an image matching one of the second plurality of images, insert the one of the second plurality of images into the map; and
in response to not finding a matching image, request, from the plurality of sources, the image matching one of the second plurality of images.
| 10. The medium of claim 9, further comprising the instructions executable to determine a change between the matching image and the one of the second plurality of images.
| 11. The medium of claim 9, further comprising the instructions executable to alert the plurality of sources of the change.
| 12. The medium of claim 11, further comprising the instructions executable to instruct the plurality of sources to share the change with different sources in which the plurality of sources is in communication.
| 13. The medium of claim 10, further comprising the instructions executable to alert a party outside of the plurality of sources of the change.
| 14. A method, comprising:
receiving, at a processing resource, a memory resource, or both, a plurality of images including location and time metadata from sensors associated with a plurality of vehicles that utilize vehicle-to-everything (V2X) communication;
detecting, at the processing resource, areas of a map having an outdated image, a missing image, or both;
determining whether one of the plurality of received images addresses the outdated image, the missing image, or both based on the metadata;
in response to determining the one of the plurality of received images addresses the outdated image, the missing image, or both, updating the map using the one of the plurality of images;
in response to not finding an image to address the outdated image, the missing image, or both, requesting, from the plurality of sources, a matching one of the second plurality of images;
detecting and classifying, by the processing resource and based on the updated map, an issue associated with a particular location on the map; and
sending a notification to the sensors of the plurality of vehicles and additional vehicles based on the detected and classified issue.
| 15. The method of claim 14, further comprising receiving Decentralized Environmental Notification Message (DENM) signals from the sensors of the plurality of sources and classifying the DENM signals.
| 16. The method of claim 15, further comprising providing DENM alerts to an Intelligent Transport System (ITS) based on the classified DENM signals.
| 17. The method of claim 14, further comprising:
determining periodic updates associated with the particular location; and
storing to the memory resource, a database, or both, the periodic updates.
| 18. The method of claim 14, further comprising requesting, from a database of images uploaded from a plurality of autonomous vehicles, the matching one of the second plurality of images in response to not finding an image to address the outdated image, the missing image, or both.
| 19. The method of claim 14, wherein detecting the issue comprises detecting a change in a structure associated with the particular location.
| 20. The method of claim 14, wherein detecting the issue comprises detecting a road condition change associated with the particular location. | The apparatus (100) has a memory resource (104) in communication with a processing resource (102) having instructions executable to monitor a map containing a set of locations. An image data associated with a first location is received at the processing resource, the memory resource, or both, and from a first source. The image data is identified as being associated with a missing portion, an outdated portion, or both, of the map. The missing portion, the outdated portion, or both, of the map are updated with the image data. The image data associated with the first location contains metadata identifying a physical location and viewing direction of the image data. The instructions are executable to identify the image data as being associated with the missing portion, the outdated portion, or both, by matching the metadata associated with the image data with location and viewing direction data associated with the missing portion, the outdated portion, or both. INDEPENDENT CLAIMS are included for:1) a non-transitory machine-readable medium; and2) a method for updating a map using images. Apparatus for updating a map using images. Uses include but are not limited to a mobile phone, smartphone, tablet, phablet, computing device, implantable device, vehicle, home appliance, smart home device, monitoring device, wearable device and an intelligent shopping system. The method enables utilizing a computing device to transmit information to users through a display to view images and/or text, speakers to emit sound, and a sensor to collect data in an efficient manner. The drawing shows a block diagram of the apparatus for updating a map using images.100Apparatus for updating a map using images 102Processing resource 104Memory resource 106Monitoring 108Receiving |
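Claims 2-3 of the record above match an incoming image to a missing or outdated map portion by comparing the image's location and viewing-direction metadata with the portion's expected view. The sketch below illustrates that matching step under assumed thresholds and data shapes; it is not the patent's implementation, and every field name here is a stand-in.

```python
# Sketch: match an image's position/heading metadata to stale map portions and
# return the portions the image can refresh. Thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class ImageMeta:
    lat: float
    lon: float
    heading_deg: float
    timestamp: float

@dataclass
class MapPortion:
    portion_id: str
    lat: float
    lon: float
    heading_deg: float
    last_updated: float   # epoch seconds; 0.0 means "missing"

def _angle_diff(a: float, b: float) -> float:
    return abs((a - b + 180.0) % 360.0 - 180.0)

def matches(img: ImageMeta, portion: MapPortion,
            max_offset_deg: float = 0.0005, max_heading_diff: float = 20.0) -> bool:
    close = (abs(img.lat - portion.lat) <= max_offset_deg and
             abs(img.lon - portion.lon) <= max_offset_deg)
    return close and _angle_diff(img.heading_deg, portion.heading_deg) <= max_heading_diff

def update_map(img: ImageMeta, stale_portions: list) -> list:
    """Return the portions this image can refresh (missing or older than it)."""
    return [p for p in stale_portions
            if matches(img, p) and img.timestamp > p.last_updated]

stale = [MapPortion("tile-42", 40.7128, -74.0060, heading_deg=90.0, last_updated=0.0)]
img = ImageMeta(40.71282, -74.00603, heading_deg=95.0, timestamp=1_700_000_000.0)
print([p.portion_id for p in update_map(img, stale)])  # ['tile-42']
```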
Please summarize the input | VEHICLE-TO-EVERYTHING (V2X) COMMUNICATION BASED ON USER INPUT. In some implementations, a device associated with a vehicle may receive, based on user input to an interface of the vehicle, information indicating an incident associated with the vehicle, wherein the user input indicates at least one of: whether the incident is associated with the user of the vehicle, or whether the incident is associated with an event outside of the vehicle. The device may transmit, to a system and via a transceiver of the vehicle, a message indicating the incident associated with the vehicle. The device may receive, from the system and via the transceiver of the vehicle, an acknowledgement of the message, wherein the acknowledgement indicates a classification of the incident. The device may cause the vehicle to perform one or more actions based on the incident associated with the vehicle and based on the classification of the incident. What is claimed is:
| 1. A device associated with a vehicle, comprising:
an interface configured to receive user input from a user of the vehicle;
a transceiver;
a memory; and
one or more processors, coupled to the memory, configured to:
receive, based on the user input to the interface, information indicating an incident associated with the vehicle, wherein the user input indicates at least one of:
whether the incident is associated with the user of the vehicle, or
whether the incident is associated with an event outside of the vehicle;
transmit, to a system and via the transceiver of the device, a message indicating the incident associated with the vehicle;
receive, from the system and via the transceiver of the device, an acknowledgement of the message, wherein the acknowledgement indicates a classification of the incident; and
cause the vehicle to perform one or more actions based on the incident associated with the vehicle and based on the classification of the incident.
| 2. The device of claim 1, wherein the interface is configured to receive the user input as a voice input.
| 3. The device of claim 1, wherein the one or more processors, to cause the vehicle to perform the one or more actions, are configured to:
provide one or more instructions to autonomously drive the vehicle to a facility based on:
the user input indicating that the incident is associated with the user of the vehicle,
the classification of the incident, and
a capability of the vehicle.
| 4. The device of claim 3, wherein the one or more processors are further configured to:
transmit, via the transceiver, a notification to a facility system associated with the facility, wherein the notification indicates the incident.
| 5. The device of claim 1, wherein the one or more processors, to cause the vehicle to perform the one or more actions, are configured to:
provide one or more instructions to autonomously park the vehicle based on:
the user input indicating that the incident is associated with the user of the vehicle,
the classification of the incident, and
a capability of the vehicle.
| 6. The device of claim 5, wherein the one or more processors are further configured to:
determine a location at which the vehicle is parked or is to be parked;
initiate, with an emergency dispatch system, an emergency call based on the user input indicating that the incident is associated with the user of the vehicle; and
transmit, to the emergency dispatch system and via the transceiver, an indication of the location at which the vehicle is parked or is to be parked.
| 7. The device of claim 1, wherein, based on the user input indicating that the incident is associated with the event outside the vehicle, the message indicating the incident associated with the vehicle includes one or more of:
an image associated with the incident,
a video associated with the incident, or
sensor information detected by one or more sensors associated with the vehicle.
| 8. The device of claim 1, wherein the one or more processors, to cause the vehicle to perform the one or more actions, are configured to:
provide one or more instructions to autonomously drive the vehicle along a route that bypasses the incident based on the classification of the incident and a capability of the vehicle.
| 9. The device of claim 1, wherein the message indicating the incident associated with the vehicle further indicates a location associated with the incident.
| 10. The device of claim 1, wherein the classification of the incident is associated with a severity level, and wherein the severity level is based on whether the classification corresponds to an emergency classification or a non-emergency classification.
| 11. A system, comprising:
a memory; and
one or more processors, coupled to the memory, configured to:
receive, from a vehicle, a message indicating an incident associated with the vehicle, wherein the message indicates:
whether the incident is associated with a user of the vehicle, or
whether the incident is associated with an event outside of the vehicle;
generate a classification of the incident based on the incident associated with the vehicle, wherein the classification is an emergency classification or a non-emergency classification depending on the incident associated with the vehicle;
transmit, to the vehicle, an acknowledgement of the message, wherein the acknowledgement indicates the classification of the incident; and
perform one or more actions based on the classification of the incident.
| 12. The system of claim 11, wherein the one or more processors, to perform the one or more actions, are configured to:
identify a plurality of other vehicles that are within a defined range from the vehicle or that are along a route associated with the vehicle; and
transmit, to the plurality of other vehicles and based on the classification of the incident as the emergency classification, one or more messages that indicate one or more vehicle actions to be performed by the plurality of other vehicles based on the incident associated with the vehicle.
| 13. The system of claim 11, wherein the one or more processors, to perform the one or more actions, are configured to:
refrain from transmitting one or more messages associated with the incident to a plurality of other vehicles based on the classification being the non-emergency classification.
| 14. The system of claim 11, wherein the one or more processors are configured to generate the classification based on other messages received from a plurality of other vehicles, wherein the other messages provide additional information that corroborates the message indicating the incident associated with the vehicle.
| 15. The system of claim 11, wherein the system is a vehicle-to-everything (V2X) system that is co-located with an access point that serves a geographic location associated with the vehicle or a geographic location associated with the incident.
| 16. The system of claim 11, wherein the system is a vehicle-to-everything (V2X) system that is configured to communicate with a plurality of access points, wherein an access point in the plurality of access points serves a geographic location associated with the vehicle or a geographic location associated with the incident.
| 17. A method, comprising:
receiving, based on user input to an interface of a vehicle, information indicating an incident associated with the vehicle, wherein the user input indicates at least one of:
whether the incident is associated with a user of the vehicle, or
whether the incident is associated with an event outside of the vehicle;
transmitting, to a system and via a transceiver of the vehicle, a message indicating the incident associated with the vehicle;
receiving, from the system and via the transceiver of the vehicle, an acknowledgement of the message, wherein the acknowledgement indicates a classification of the incident; and
causing the vehicle to perform one or more actions based on the incident associated with the vehicle and based on the classification of the incident.
| 18. The method of claim 17, wherein causing the vehicle to perform the one or more actions comprises:
providing one or more instructions to autonomously drive the vehicle to a facility based on:
the user input indicating that the incident is associated with the user of the vehicle,
the classification of the incident, and
a capability of the vehicle.
| 19. The method of claim 17, wherein causing the vehicle to perform the one or more actions comprises:
providing one or more instructions to autonomously park the vehicle based on:
the user input indicating that the incident is associated with the user of the vehicle,
the classification of the incident, and
a capability of the vehicle.
| 20. The method of claim 17, wherein causing the vehicle to perform the one or more actions comprises:
providing one or more instructions to autonomously drive the vehicle along a route that bypasses the incident based on the classification of the incident and a capability of the vehicle.
| 21. The method of claim 17, wherein the classification of the incident is associated with a severity level, and wherein the severity level is based on whether the classification corresponds to an emergency classification or a non-emergency classification.
| 22. A method, comprising:
receiving, at a system from a vehicle, a message indicating an incident associated with the vehicle, wherein the message indicates:
whether the incident is associated with a user of the vehicle, or
whether the incident is associated with an event outside of the vehicle;
generating a classification of the incident based on the incident associated with the vehicle, wherein the classification is an emergency classification or a non-emergency classification depending on the incident associated with the vehicle;
transmitting, to the vehicle, an acknowledgement of the message, wherein the acknowledgement indicates the classification of the incident; and
performing one or more actions based on the classification of the incident.
| 23. The method of claim 22, wherein performing the one or more actions comprises:
identifying a plurality of other vehicles that are within a defined range from the vehicle or that are along a route associated with the vehicle; and
transmitting, to the plurality of other vehicles and based on the classification of the incident as the emergency classification, one or more messages that indicate one or more vehicle actions to be performed by the plurality of other vehicles based on the incident associated with the vehicle.
| 24. The method of claim 22, wherein performing the one or more actions comprises:
refraining from transmitting one or more messages associated with the incident to a plurality of other vehicles based on the classification being the non-emergency classification.
| 25. The method of claim 22, wherein generating the classification is based on other messages received from a plurality of other vehicles, wherein the other messages provide additional information that corroborates the message indicating the incident associated with the vehicle. | The device (900) has a processor (920) for receiving information indicating an incident associated with a vehicle based on user input to an interface, where the user input indicates one of whether the incident is associated with a user of the vehicle or whether the incident is associated with an event outside of the vehicle. The processor transmits a message indicating the incident associated with the vehicle to a system through a transceiver of the device, and receives an acknowledgement of the message from the system through the transceiver of the device, where the acknowledgement indicates a classification of the incident. The processor causes the vehicle to perform actions based on the incident associated with the vehicle and based on the classification of the incident. INDEPENDENT CLAIMS are included for: (1) a system for facilitating vehicle-to-everything communication between an entity and a vehicle for detecting an event based on a user input; and (2) a method for facilitating vehicle-to-everything communication between an entity and a vehicle for detecting an event based on a user input. Device for facilitating vehicle-to-everything communication between an entity and a vehicle for detecting an event, e.g. a traffic event, a vehicle accident, or poor road conditions such as icy roads, based on a user input. Uses include but are not limited to an automobile, a motorcycle, a bus, a train, a scooter and a truck. The device facilitates vehicle-to-everything communication to improve road safety and traffic efficiency and realize energy savings. The drawing shows a block diagram of components of a device for facilitating vehicle-to-everything communication between an entity and a vehicle for detecting an event based on a user input. 900 Device for facilitating vehicle-to-everything communication between entity and vehicle for detecting event based on user input; 920 Processor; 940 Input component; 950 Output component; 960 Communication component
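The device-side flow claimed above (report an incident from user input, receive a classification in the acknowledgement, then act on it) can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the patent's implementation: the names IncidentReport, Acknowledgement, classify_incident and VehicleDevice, the toy classification policy, and the chosen actions are all hypothetical.

```python
# Hypothetical sketch of the claimed incident-report flow; none of these
# names or policies come from the patent text itself.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Classification(Enum):
    EMERGENCY = "emergency"
    NON_EMERGENCY = "non-emergency"


@dataclass
class IncidentReport:
    user_related: bool              # incident concerns the user of the vehicle
    external_event: bool            # incident concerns an event outside the vehicle
    location: Optional[str] = None
    media: Optional[bytes] = None   # image/video/sensor data for external events


@dataclass
class Acknowledgement:
    classification: Classification


def classify_incident(report: IncidentReport) -> Acknowledgement:
    """Placeholder for the V2X system side: classify and acknowledge."""
    severe = report.user_related or report.external_event   # toy policy only
    cls = Classification.EMERGENCY if severe else Classification.NON_EMERGENCY
    return Acknowledgement(classification=cls)


class VehicleDevice:
    """Device role: collect user input, report the incident, act on the ack."""

    def __init__(self, autonomous_capable: bool) -> None:
        self.autonomous_capable = autonomous_capable

    def handle_user_input(self, report: IncidentReport) -> str:
        ack = classify_incident(report)      # stands in for transmit + receive ack
        return self._perform_actions(report, ack)

    def _perform_actions(self, report: IncidentReport, ack: Acknowledgement) -> str:
        if ack.classification is Classification.EMERGENCY and report.user_related:
            if self.autonomous_capable:
                return "autonomously drive to the nearest facility and notify it"
            return "autonomously park, then place an emergency call with the location"
        if ack.classification is Classification.EMERGENCY and report.external_event:
            return "reroute around the incident and forward media to the system"
        return "log the incident; no action required for a non-emergency"


if __name__ == "__main__":
    device = VehicleDevice(autonomous_capable=True)
    print(device.handle_user_input(IncidentReport(user_related=True, external_event=False)))
```

In a deployed system the classification step would run on the remote V2X system rather than locally; it is inlined here only to keep the sketch self-contained and runnable.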
Please summarize the input | A motor vehicle, the program for motor vehicles, the navigation apparatus for motor vehicles, and the program for navigation apparatuses for motor vehicles|1. An automobile comprising: a radio communication part having a transmission/reception function for performing radio communication between vehicles; a car navigation function part for forming route guide data that includes trigger information for inter-vehicle communication corresponding to points on a guided route which require execution of other vehicle influence behavior, i.e. behavior that affects the behavior of other vehicles in the periphery; a vehicle-to-vehicle communication request means for requesting, through the radio communication part, communication with surrounding other vehicles at a point on the route at which the trigger information is detected while route guidance using the route guide data is being executed by the car navigation function part; a communication channel generation means for generating a communication path between the vehicle and another vehicle that responds to the inter-vehicle communication request; and a behavior schedule information transmitting means for transmitting to the other vehicle, through the communication path generated by the communication channel generation means, behavior schedule information for informing the other vehicle of the other vehicle influence behavior scheduled to be executed.
| 2. The trigger information is included in a claim 1 on the route guidance route, and is included in the same spot as the point where the other vehicle influence behavior should be performed or at a predetermined point in front of the same point.
| 3. The automobile is provided with an automatic operation mode for autonomously performing the behavior of the own vehicle, and with a means for blinking or lighting control of indicators including a turn indicator, a hazard lamp, a backlight and a brake light; for the automatic operation mode, the route guidance route of the route guide data formed by the car navigation function part includes information on a blinking indication point and a lighting indication point for blinking or lighting control of the indicators; and the point on the route including the trigger information is made the same point as the blinking indication point or the lighting indication point, or a point before it, as described in claim 1 or claim 2.
| 4. The trigger information consists of other vehicle influence behavior generation information including the information of the other vehicle influence behavior to be generated, and is described in either of the claims 1 to claim 3.
| 5. As the other vehicle influence behavior, a behavior including at least one of a behavior of turning right or left, a behavior of changing lanes, a behavior of merging onto a general road or a highway, a behavior of entering a line of traffic in congestion, a behavior of entering an intersection, and a behavior of entering the rotary of a station is stored; and the automobile described in claim 4 is characterized in that the information of the other vehicle influence behavior included in the other vehicle influence behavior generation information is information of a behavior selected from the stored behaviors.
| 6. The other vehicle influence behavior generation information includes information on a point where the other vehicle influence behavior is performed, and is described in a claim 4 or a claim 5.
| 7. The behavior schedule information includes the information of the other vehicle influence behavior scheduled to be executed, and the information of the other vehicle influence behavior to be executed is acquired from the other vehicle influence behavior generation information, and is described in either of the claim 4 to the claim 6.
| 8. The behavior schedule information includes information about a traveling route related to the present position information of the own vehicle and the other vehicle influence behavior, and is described in one of the claim 1 to claim 7.
| 9. A means for receiving, through the radio communication part, reply information sent from the other vehicle in response to the behavior schedule information, and a reply information analysis means for analyzing the received reply information and analyzing the behavior of the other vehicle with respect to the other vehicle influence behavior, are provided; and the automobile is further provided with a behavior determination means for determining the behavior of the own vehicle based on an analysis result by the reply information analysis means, and a behavior execution means for executing the behavior of the own vehicle determined by the behavior determination means.
| 10. The automobile is provided with a confirmation notice means for notifying the other vehicle, through the communication path, of a confirmation notice indicating that the behavior of the other vehicle determined corresponding to the behavior schedule information has been confirmed, when it is determined from the analysis result by the reply information analysis means that the other vehicle influence behavior can be safely executed.
| 11. The behavior of the other vehicle involved in the other vehicle influence behavior of the own vehicle is specified from among the behaviors of other vehicles detected in the analysis result by the reply information analysis means; and the automobile is provided with a confirmation notice means for notifying the other vehicle, through the communication path, of a confirmation notice indicating that the behavior of the other vehicle determined corresponding to the behavior schedule information has been confirmed, when it is determined that the other vehicle influence behavior can be safely executed given the specified other vehicle behavior.
| 12. The behavior determined by the behavior determination means is an automobile described in either of claims 9 to 11, which is characterized in that the behavior of the other vehicle is notified by the result of analysis by the reply information analysis means.
| 13. In the self-driving vehicle capable of autonomous traveling, the behavior determined by the behavior determining means is the behavior in which the behavior of the other vehicle in the analysis result by the reply information analyzing means is executed safely in the automatic operation, and is described in either of the claims 9 to the claim 11. ?
| 14. The automobile is provided with a manual operation mode and an automatic operation mode in which autonomous traveling is possible, and with an operation mode discriminating means for discriminating between the manual operation mode and the automatic operation mode; the behavior determined by the behavior determination means is, when the operation mode discriminating means determines that the manual operation mode is selected, a behavior of notifying the behavior of the other vehicle obtained as the result of the analysis by the reply information analysis means, and, when the operation mode discriminating means determines that the automatic operation mode is selected, a behavior for safely executing the other vehicle influence behavior in the automatic operation in view of the behavior of the other vehicle in the result of the analysis by the reply information analysis means; and the automobile is described in any one of claims 9 to 11.
| 15. During the execution of the behavior determined by the behavior determination means, the automobile is described in a claim 14 which is characterized in that it is prohibited from switching to the manual operation mode from the automatic operation mode.
| 16. During the communication between the other vehicle and the vehicle, the automobile is described in a claim 14 or claim 15 characterized in that it is prohibited from switching from the automatic operation mode to the manual operation mode.
| 17. The behavior determination means acquires information on the traveling speed of the other vehicle in the inter-vehicle communication with the other vehicle, and based on the traveling speed of the other vehicle obtained from the acquired information and the traveling speed of the own vehicle. The behavior of the own vehicle is determined, and the behavior of the vehicle is described in either of claims 9 to 16.
| 18. This vehicle is provided with a camera for photographing the periphery of the own vehicle; the behavior determination means recognizes a traffic sign and/or a traffic signal from a photographed image of the camera, determines the traffic regulations around the own vehicle based on the recognition result, and determines the behavior of the own vehicle in consideration of the discriminated traffic regulations.
| 19. The automobile is provided with a camera for photographing the periphery of the own vehicle, and the behavior determination means determines the peripheral situation of the own vehicle from the photographed image of the camera, and determines the behavior of the own vehicle in consideration of the discriminated peripheral situation.
| 20. The behavior schedule information includes information for specifying the own vehicle to the other vehicle, and the reply information from the other vehicle includes information for the own vehicle to specify the other vehicle, and the automobile is described in either of the claims 9 to 19.
| 21. This system is provided with a camera for photographing the periphery of one's own vehicle and/or a microphone for collecting sound around the own vehicle, and a means for detecting an emergency vehicle from a photographed image around one's own vehicle photographed by the camera and/or voice around the own vehicle collected by the microphone. In the behavior determination means, the automobile is described in either of claims 9 to 20, which determines the behavior of the self-vehicle with the priority of the emergency vehicle.
| 22. The information for specifying the own vehicle to the other vehicle includes: the present position information of the own vehicle; and feature information capable of specifying the own vehicle from the photographed image of the camera; and the present position information of the other vehicle is included in the information for specifying the other vehicle by the own vehicle. The automobile is characterized in that it includes feature information capable of specifying the other vehicle from a photographed image of the camera.
| 23. The behavior schedule information includes information for specifying the own vehicle to the other vehicle, and the automobile is described in either of the claims 1 to the claim 22.
| 24. The information for specifying the own vehicle to the other vehicle includes: the present position information of the own vehicle; and the feature information capable of specifying the own vehicle from the photographed image of the camera.
| 25. A means is provided for receiving, through the radio communication part, response information sent from the other vehicle in response to the inter-vehicle communication request, the response information including information indicating whether the other vehicle is an automatic driving vehicle that autonomously performs the behavior of its own vehicle or is in a state of an automatic operation mode for autonomously performing the behavior of its own vehicle; a discrimination means is provided for discriminating, from the received response information, whether the other vehicle is such an automatic driving vehicle or is in the state of the automatic operation mode; the communication path generation means generates, based on the discrimination result of the discrimination means, a communication path between the automobile and the other vehicle discriminated to be an automatic driving vehicle that autonomously performs the behavior of its own vehicle, or the other vehicle discriminated to be in the state of the automatic operation mode for autonomously performing the behavior of its own vehicle; and the automobile is described in any one of claims 1 to 24.
| 26. The automobile is provided with a vehicle-to-vehicle communication request button, and when the inter-vehicle communication request button is operated, the inter-vehicle communication request means transmits a vehicle-to-vehicle communication request to the other vehicle in the periphery through the radio communication part.
| 27. When the disconnection request of the communication path from the other vehicle transmitted through the communication path generated by the communication path generation means is received, the communication path between the other vehicle and the other vehicle is cut, and the automobile is described in either of the claims 1 to 26.
| 28. The device is provided with a means for discriminating whether or not the execution of the other vehicle influence behavior has been completed; and when it is determined that the execution of the other vehicle influence behavior has been completed, the execution end notification is transmitted to the other vehicle through the communication path, and thereafter, the execution of the other vehicle influence behavior is discriminated. The automobile is described in one of claims 1 to 27, which is characterized by cutting off the communication path.
| 29. The other vehicle influence behavior is different according to whether the vehicle is on the left side or on the right side, and the automobile is described in either of the claims 1 to the claim 28.
| 30. The automobile is provided with a present position detecting means, determines whether the vehicle is on the left side or right side traffic according to the country or region specified based on the present position detected by the present position detecting means; and the other vehicle influence behavior is set on the basis of the determined result.
| 31. The other vehicle influence behavior is set on the basis of the presence/absence of the traffic signal, and the automobile is described in one of the claims 1 to the claim 30.
| 32. The automobile is provided with a camera for photographing the periphery of one's own vehicle, and the presence or absence of the signal is discriminated from the photographed image of the camera.
| 33. The automobile is provided with a communication means for receiving a radio wave from a traffic signal, and the presence or absence of the signal is discriminated from the information of the radio wave received by the communication means.
| 34. The other vehicle influence behavior is set on the basis of the priority relation between the vehicles, and the automobile is described in either of the claims 1 to the claim 33.
| 35. The automobile is provided with a camera for photographing the periphery of one's own vehicle, and the priority relation between the vehicles is discriminated from the photographed image of the camera.
| 36. The automobile is provided with a present position detecting means; the priority relation between vehicles is determined according to the country or region specified based on the present position detected by the present position detection means; and the other vehicle influence behavior is set based on the determination result.
| 37. The automobile is provided with a camera for photographing the periphery of the own vehicle, and the vehicle-to-vehicle communication request means performs inter-vehicle communication request to the other vehicle in the periphery when detecting the recognition of a prescribed situation from the photographed image of the camera.
| 38. The prescribed situation is the automobile described in a claim 37 characterized in that it includes the time of entering the intersection, the approach of the highway, or the entering of the rotary of the station.
| 39. The inter-vehicle communication requesting means is an automobile described in one of claims 1 to 38 which are characterized in that when it is detected that radio wave information from a beacon installed on a road is received, the communication request means to the other vehicle in the vicinity of the vehicle.
| 40. The inter-vehicle communication requesting means is an automobile described in either of claims 1 to 39, which is characterized in that when a backlight lighting is detected, a vehicle-to-vehicle communication request is made to another vehicle in the periphery.
| 41. The inter-vehicle communication request means is an automobile described in either of claims 1 to 40, which is characterized in that when a sudden accelerator is detected, a vehicle-to-vehicle communication request is made to the other vehicle around.
| 42. The inter-vehicle communication requesting means is an automobile described in one of claims 1 to 41 characterized in that when the lighting of the brake light is detected in the case of a sudden braking, the communication request is made to the other vehicle around the vehicle.
| 43. The automobile is provided with a communication monitoring part for transmitting the generation notification of the transmission trigger to the inter-vehicle communication request means when the trigger information is detected.
| 44. The automobile is provided with a reception processing part for the inter-vehicle communication request, and the communication monitoring part starts the reception processing part when the reception of the inter-vehicle communication request from the other vehicle is detected.
| 45. The communication monitoring part is characterized in that after the processing of the inter-vehicle communication request is completed, processing for detecting the trigger information is resumed.
| 46. The communication monitoring part, after finishing the processing of the inter-vehicle communication request, and after finishing the processing in the reception processing part, resumes processing for detecting the trigger information, and at the same time, it is also provided. To resume processing for monitoring the reception of a vehicle-to-vehicle communication request from another vehicle.
| 47. A program for an automobile which causes a computer of an automobile, the automobile having a transmission/reception function and a radio communication part for wireless communication between vehicles, to function as: a car navigation function part for forming route guide data including trigger information for inter-vehicle communication corresponding to points on a guided route which require execution of other vehicle influence behavior that affects the behavior of other vehicles in the periphery; a vehicle-to-vehicle communication request means for requesting, through the radio communication part, communication with surrounding vehicles at a point on the route at which the trigger information is detected during execution of route guidance using the route guide data by the car navigation function part; a communication channel generating means for generating a communication path between the vehicle and another vehicle responding to the inter-vehicle communication request; and a behavior schedule information transmitting means for transmitting to the other vehicle, through the communication path generated by the communication channel generation means, behavior schedule information for informing the other vehicle of the other vehicle influence behavior scheduled to be executed.
| 48. The navigation device for an automobile includes a transmission/reception function, a radio communication part for performing radio communication between vehicles, and a vehicle-to-vehicle communication request means for making a communication request between vehicles in the surrounding other vehicles through the radio communication part based on the trigger information. A means for forming route guide data including the trigger information corresponding to a point on a route guide route which requires execution of other vehicle influence behavior affecting the behavior of the other vehicle in the periphery, and a route guide using the route guide data are executed. The navigation device for the automobile is provided with a means for notifying the occurrence of the trigger information to the inter-vehicle communication request means at a point on the route where the trigger information is detected.
| 49. A communication path generating means for generating a communication path between the vehicle and the other vehicle responding in response to the inter-vehicle communication request, and the communication path generated by the communication channel generation means are provided. This navigation device for the automobile is provided with a behavior schedule information transmission means for transmitting the behavior schedule information for informing the other vehicle influence behavior of the execution schedule to the other vehicle.
| 50. The trigger information is included in a point of claim 48 or a claim 49 which is included in the same spot as the point where the other vehicle influence behavior is to be performed or at a predetermined point on this side of the route guidance route.
| 51. The automobile is provided with an automatic operation mode for autonomously performing the behavior of the own vehicle, and is provided with a means for blinking or lighting control of an indicator including a turn indicator, a hazard lamp, a back light, and a brake light; and the automobile navigation device is provided. In the automatic operation mode, on the route of the route guidance of the route guide data, information of a flashing indication point and a lighting indication point for blinking or lighting control of the indicator is included, and the point on the route including the trigger information is provided. The navigation device for the automobile described in either the same point as the blinking indication point or the same point as the lighting indication point or the same point than the lighting indication point is described in either of the claim 48 and the claim 50.
| 52. The trigger information is composed of other vehicle influence behavior generation information including information of the other vehicle influence behavior to be generated, and is included in either of the claim 48 to the claim 51.
| 53. As the other vehicle influence behavior, the behavior of turning to the right or left, the behavior of changing the lane, the behavior of merging on the general road or the highway, the behavior of entering into a traffic line in traffic, the behavior of entering the intersection. The behavior including at least one of the behavior of entering the rotary of the station is stored; and the information of the other vehicle influence behavior included in the other vehicle influence behavior generation information is provided. The navigation device for the automobile described in claim 52 is characterized in that it is the information of the behavior selected from the stored behavior.
| 54. The other vehicle influence behavior occurrence information includes information of a point where the other vehicle influence behavior is performed, and is included in a claim 52 or a claim 53.
| 55. The other vehicle influence behavior included in the other vehicle influence behavior generation information is differed depending on whether the vehicle is on the left side or on the right side, and the other vehicle navigation device is characterized in either of the claim 52 to the claim 54.
| 56. The automobile navigation device determines whether the vehicle is on the left side or the right road according to the country or region specified based on the present position detected by the present position detection means; and the other vehicle influence behavior is set based on the determination result.
| 57. The other vehicle influence behavior included in the other vehicle influence behavior generation information is set on the basis of the presence or absence of the traffic signal, and the vehicle navigation device is described in either of the claim 52 to the claim 56.
| 58. The other vehicle influence behavior included in the other vehicle influence behavior generation information is set on the basis of the priority relation between the vehicles, and the vehicle navigation device is described in either of the claim 52 to the claim 57.
| 59. A computer provided with a navigation device for an automobile includes a transmission/reception function, a radio communication part for performing radio communication between vehicles, and a vehicle-to-vehicle communication request means for performing inter-vehicle communication ... | The motor vehicle comprises a vehicle-to-vehicle communication request unit (201), which transmits a vehicle-to-vehicle communication request to surrounding other vehicles through a wireless communication unit. A communication channel generation unit generates a communication channel with the other vehicles that respond to the vehicle-to-vehicle communication request. A behavior plan information transmission unit transmits, through the communication channel generated by the generation unit, behavior plan data that notifies the other vehicle of the other vehicle influence behavior that is due to be performed. The trigger information is included at the same point as the point at which the other vehicle influence behavior should be carried out on the route of the route guidance. INDEPENDENT CLAIMS are included for the following: a program for motor vehicles; a navigation apparatus for motor vehicles; a program for navigation apparatuses; and route guidance data for the navigation apparatuses. Motor vehicle e.g. The behavior plan information transmission unit transmits the behavior plan that notifies the other vehicle of the other vehicle influence behavior to be performed through the communication channel, so that traffic accidents can be avoided in a reliable manner, thus ensuring the safety of the vehicle. The drawing shows a block diagram of the motor vehicle. (Drawing includes non-English language text). 201 Vehicle-to-vehicle communication request unit; 204 Present position detection unit; 205 Confirmation unit; 206 Determination unit; 300 Process unit; 301 Generation management unit
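The route-guidance claims above hinge on trigger information embedded at specific route points: when guidance reaches such a point, the vehicle requests inter-vehicle communication, opens a channel with each responder, and announces its behavior schedule. The sketch below is an illustration under assumptions, not the patent's interfaces: RoutePoint, BehaviorSchedule, V2VLink and the printed "transport" are all placeholders.

```python
# Hypothetical sketch of trigger-driven inter-vehicle requests along a guided route.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class RoutePoint:
    position: Tuple[float, float]        # (lat, lon) along the guidance route
    trigger: Optional[str] = None        # e.g. "right_turn", "merge", "lane_change"


@dataclass
class BehaviorSchedule:
    vehicle_id: str
    behavior: str                        # the behavior that affects other vehicles
    position: Tuple[float, float]
    planned_path: List[Tuple[float, float]] = field(default_factory=list)


class V2VLink:
    """Placeholder transport; a real system would sit on DSRC/C-V2X radios."""

    def request_nearby(self) -> List[str]:
        return ["vehicle-B"]             # ids of vehicles that responded

    def open_channel(self, peer: str) -> str:
        return f"channel:{peer}"

    def send(self, channel: str, schedule: BehaviorSchedule) -> None:
        print(f"{channel} <- announce {schedule.behavior} at {schedule.position}")


def guide(route: List[RoutePoint], link: V2VLink, vehicle_id: str = "vehicle-A") -> None:
    """Walk the guided route; at every trigger point, request inter-vehicle
    communication and send the behavior schedule to each responding vehicle."""
    for point in route:
        if point.trigger is None:
            continue
        schedule = BehaviorSchedule(vehicle_id, point.trigger, point.position,
                                    planned_path=[point.position])
        for peer in link.request_nearby():
            channel = link.open_channel(peer)
            link.send(channel, schedule)


if __name__ == "__main__":
    route = [RoutePoint((35.000, 139.000)),
             RoutePoint((35.001, 139.001), trigger="right_turn")]
    guide(route, V2VLink())
```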
Please summarize the input | Automobile and automotive program|1. The automobile is provided with a manual operation mode and an automatic operation mode capable of autonomous traveling, has a transmission/reception function, and is provided with a radio communication part for performing radio communication between vehicles and a monitoring means for monitoring the reception of an inter-vehicle communication request from another vehicle through the radio communication part. When the monitoring means receives a vehicle-to-vehicle communication request from the other vehicle, it determines whether or not the own vehicle is in the manual operation mode, and a switching means is provided which switches the vehicle to the automatic operation mode when it is in the manual operation mode. The automobile is further provided with a communication path generation means for generating, in the state of the automatic operation mode switched to by the switching means, a communication path between the vehicle and the other vehicle which has transmitted the inter-vehicle communication request.
| 2. The switching means is a vehicle described in a claim 1 which is characterized in that it is forcibly switched to the automatic operation mode when the mode is in the manual operation mode.
| 3. The switching means, when the driver is in the manual operation mode, inquires of the driver of the permission of switching to the automatic operation mode, and when the driver obtains the answer of the permission of the changeover, the vehicle is switched to the automatic operation mode.
| 4. During the communication between the other vehicle and the vehicle, the automobile is described in one of claims 1 to 3 which are characterized in that it is made impossible to switch from the automatic operation mode to the manual operation mode.
| 5. A behavior schedule information analysis means for analyzing behavior schedule information for notifying other vehicle influence behavior which is sent from the other vehicle through a communication path generated by the communication path generation means and which affects the behavior of the other vehicle in the periphery is provided. On the basis of an analysis result by the behavior schedule information analysis means, a behavior determination means for determining the behavior of the own vehicle corresponding to the behavior schedule information is provided. The automobile is described in one of claims 1 to 4, which includes: reply information including information on the behavior of the own vehicle determined by the behavior determination means; and a reply information transmission means for sending the information to the other vehicle through the communication path.
| 6. The behavior schedule information includes the current position information of the other vehicle and information about a travel route related to the other vehicle influence behavior. The behavior schedule information analysis means determines, from the current position information of the other vehicle and the travel route related to the other vehicle influence behavior included in the behavior schedule information, whether or not the traveling of the own vehicle is affected; and when it is determined that there is no influence, the automobile is characterized in that the communication path generated by the communication path generation means is cut off.
| 7. The behavior determination means acquires information on the traveling speed of the other vehicle in the inter-vehicle communication with the other vehicle, and determines the behavior of the own vehicle based on the traveling speed of the other vehicle detected from the acquired information and the traveling speed of the own vehicle. The automobile is characterized by a claim 5 or a claim 6.
| 8. This system is provided with a camera for photographing the periphery of one's own vehicle; the behavior determination means recognizes a traffic sign and/or a traffic sign from a photographed image of the camera; and discriminates traffic regulations around one's own vehicle based on the recognition result; and determines the behavior of the own vehicle in consideration of the discriminated traffic regulation. The automobile is described in one of claims 5 to 7.
| 9. The automobile is provided with a camera for photographing the periphery of the own vehicle, and the behavior determination means determines the peripheral situation of the own vehicle from the photographed image of the camera, and determines the behavior of the own vehicle in consideration of the discriminated peripheral situation.
| 10. The behavior schedule information includes information for specifying the other vehicle, and the reply information includes information for specifying one's own vehicle, and is described in either of the claims 5 to 9.
| 11. The other vehicle influence behavior is the automobile described in either the claim 5 to the claim 10 characterized in that the vehicle is different depending on whether the vehicle is on the left side or on the right side traffic.
| 12. The other vehicle influence behavior is set on the basis of the presence/absence of the traffic signal, and the automobile is described in one of the claim 5 to claim 11.
| 13. The other vehicle influence behavior is set on the basis of the priority relation between the vehicles, and the automobile is described in either of the claims 5 to claim 12.
| 14. This system is provided with a camera for photographing the periphery of the own vehicle and/or a microphone for collecting sound around the own vehicle, and with a means for detecting an emergency vehicle from an image of the surroundings of the own vehicle photographed by the camera and/or sound around the own vehicle collected by the microphone; the behavior determination means determines the behavior of the own vehicle while giving priority to the emergency vehicle; and the automobile is described in any one of claims 5 to 13.
| 15. When the completion notice of the other vehicle influence behavior is received from the other vehicle, the automobile is described in either of the claim 5 to the claim 14 characterized by the disconnection of the communication path.
| 16. A confirmation notice reception means for receiving the confirmation notice of the information of the behavior of the own vehicle included in the reply information is provided through the communication path; and after the confirmation notice is received by the confirmation notice reception means, the behavior of the own vehicle determined by the behavior determination means is executed. The automobile is described in one of claims 5 to 15.
| 17. A navigation system is provided with a car navigation function part for forming route guide data including trigger information of inter-vehicle communication in response to a point on the route of a route guide which requires execution of other vehicle influence behavior which affects the behavior of the other vehicle around the vehicle. The automobile is described in one of claims 1 to 16.
| 18. During execution of a route guide using the route guide data by the car navigation function part, at a point on the route where the trigger information is detected, the automobile is described in a claim 17 characterized in that the communication request is made to the other vehicle in the periphery through the radio communication part.
| 19. The automobile is provided with a camera for photographing the periphery of one's own vehicle, detects the recognition of a prescribed situation from the photographed image of the camera as trigger information for communication between vehicles, and makes a communication request between vehicles to other vehicles around the vehicle.
| 20. The prescribed situation is the automobile described in a claim 19 characterized in that it includes the time of entering the intersection, the approach of the highway, or the rotary approach of the station.
| 21. The automobile is provided with a vehicle-to-vehicle communication request means for making a vehicle-to-vehicle communication request to other vehicles in the periphery through the radio communication part.
| 22. The automobile is provided with a vehicle-to-vehicle communication request button, and when the inter-vehicle communication request button is operated, the inter-vehicle communication request means performs the inter-vehicle communication request to the other vehicle in the periphery through the radio communication part.
| 23. The inter-vehicle communication requesting means detects the reception of the radio wave information from the beacon installed on the road as trigger information of the inter-vehicle communication, and makes a vehicle-to-vehicle communication request to the other vehicle in the periphery, and the automobile is described in the claim 21.
| 24. The inter-vehicle communication requesting means detects at least one of a backlight lighting; a brake light lighting; a sudden accelerator; and a sudden brake as trigger information for inter-vehicle communication, and makes a vehicle-to-vehicle communication request to other vehicles around the vehicle. The automobile described in the claim 21 to Claim 23 is characterized by the above.
| 25. The automobile is described in either of the claims 18 to claim 24, in which a communication request for the vehicle is made to the other vehicle around the vehicle, and when the response information is not received within a prescribed time from the other vehicle in the periphery, a message indicating that the other vehicle capable of communication between vehicles is not present in the periphery is notified to the driver.
| 26. A means is provided for receiving, through the radio communication part, response information sent from the other vehicle in response to a vehicle-to-vehicle communication request, the response information including information on whether the other vehicle is an automatic driving vehicle that autonomously performs the behavior of its own vehicle or is in an automatic operation mode for autonomously performing the behavior of its own vehicle; a discrimination means is provided for discriminating, from the received response information, whether the other vehicle is such an automatic driving vehicle or is in the state of the automatic operation mode; and the communication path generation means generates, based on the discrimination result of the discrimination means, a communication path with the other vehicle discriminated as being an automatic driving vehicle that autonomously performs the behavior of its own vehicle or with the other vehicle discriminated as being in the automatic operation mode for autonomously performing the behavior of its own vehicle. The automobile is described in any one of claims 1 to 24.
| 27. A program for an automobile which causes a computer of an automobile, the automobile having a manual operation mode, an automatic operation mode capable of autonomous traveling, a transmission/reception function and a radio communication part for radio communication between vehicles, to function as: a monitoring means for monitoring, through the radio communication part, reception of an inter-vehicle communication request from another vehicle; a switching means which, when the monitoring means receives a vehicle-to-vehicle communication request from the other vehicle, determines whether or not the own vehicle is in the manual operation mode and, when it is in the manual operation mode, switches it to the automatic operation mode; and a communication path generating means for generating, in the state of the automatic operation mode switched to by the switching means, a communication path between the vehicle and the other vehicle which has transmitted the inter-vehicle communication request. | The automobile has a manual operation mode and an automatic operation mode capable of autonomous traveling. A radio communication unit (102) is used for performing radio communication between vehicles. A monitoring unit is used for monitoring the reception of communication requests between vehicles from other vehicles through the radio communication unit. A switching unit is used for switching to the automatic operation mode when the vehicle is in the manual operation mode. The automatic operation mode is switched by the switching unit. A communication path generating unit is used for generating a communication path between the vehicle and the other vehicle which has transmitted the inter-vehicle communication request. An INDEPENDENT CLAIM is included for a program for an automobile. Automobile e.g. car. The communication path generation part generates the communication path between the vehicle and the other vehicle, which has transmitted the inter-vehicle communication request in the state of the automatic operation mode switched by the switching part, so that traffic accidents can be avoided and other vehicle influence behavior can be safely performed. The drawing shows a schematic view of the electronic control circuit portion. (Drawing includes a non-English language text) 102 Radio communication unit; 110 Position detecting unit; 112 Touch panel; 117 Vehicle-to-vehicle communication control processing unit; 118 Behavior control section
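Claim 1 of the row above describes the receiving vehicle's side of the exchange: on an incoming vehicle-to-vehicle request, check the driving mode, switch from manual to automatic operation if needed, and only then open the channel, with manual mode blocked while the session lasts. The following is a minimal sketch under assumptions; DrivingMode and ReceiverCar are illustrative names, and the driver-consent callback models the claim 3 variant (a forced switch, as in claim 2, would simply skip it).

```python
# Hypothetical sketch of the mode check / switch / channel-open sequence.
from enum import Enum, auto


class DrivingMode(Enum):
    MANUAL = auto()
    AUTOMATIC = auto()


class ReceiverCar:
    """Receiving side: check the mode on a request, switch if needed, then
    open the channel; manual mode stays blocked during the V2V session."""

    def __init__(self, mode: DrivingMode, ask_driver=None) -> None:
        self.mode = mode
        self.in_v2v_session = False
        # Optional driver-consent callback (the forced-switch variant would skip it).
        self.ask_driver = ask_driver or (lambda: True)

    def on_v2v_request(self, requester_id: str) -> str:
        if self.mode is DrivingMode.MANUAL:
            if not self.ask_driver():
                return "request declined; stayed in manual mode"
            self.mode = DrivingMode.AUTOMATIC
        self.in_v2v_session = True
        return f"channel opened with {requester_id} in automatic mode"

    def switch_to_manual(self) -> str:
        if self.in_v2v_session:
            return "blocked: manual mode unavailable during the V2V session"
        self.mode = DrivingMode.MANUAL
        return "switched to manual mode"


if __name__ == "__main__":
    car = ReceiverCar(DrivingMode.MANUAL)
    print(car.on_v2v_request("vehicle-A"))
    print(car.switch_to_manual())
```

Keeping the session flag separate from the mode makes the "no manual switch while communicating" rule easy to enforce without having to track why the car entered automatic mode.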
Please summarize the input | The program for a motor vehicle and motor vehicles PROBLEM TO BE SOLVED: To provide an automobile capable of avoiding a traffic accident when performing another-vehicle-affecting behavior, and of reliably performing that behavior.
SOLUTION: When the timing has come at which a predetermined another-vehicle-affecting behavior, i.e. a behavior that affects the behavior of another vehicle in the vicinity, should be performed, an inter-vehicle communication request is transmitted to the other vehicles in the vicinity by means of a wireless communication unit. On receipt of response information, sent in reply to the inter-vehicle communication request, from an autonomous vehicle that performs its own behavior autonomously or from another vehicle placed in an autonomous mode in which its own behavior is performed autonomously, a communication channel is produced with that other vehicle. Behavior schedule information for notifying the other vehicle of the another-vehicle-affecting behavior that is scheduled to be performed is transmitted to the other vehicle over the communication channel. Return information sent from the other vehicle in response is analyzed in order to check the behavior of the other vehicle determined according to the behavior schedule information. Processing is then performed in line with the checked behavior of the other vehicle.
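The solution paragraph above amounts to a filter-then-verify loop: only peers that report being autonomous or in automatic mode get a channel, and the planned maneuver is executed only after their replies have been analyzed and judged safe. The sketch below illustrates just that part of the flow; PeerResponse, PeerReply and the replies_are_safe() heuristic are assumptions for illustration, not the patent's method.

```python
# Hypothetical sketch of response filtering and reply analysis before a maneuver.
from dataclasses import dataclass
from typing import List


@dataclass
class PeerResponse:
    peer_id: str
    autonomous: bool        # autonomous vehicle, or currently in automatic mode
    speed_mps: float


@dataclass
class PeerReply:
    peer_id: str
    yields_way: bool        # the peer's declared behavior toward our maneuver


def eligible_peers(responses: List[PeerResponse]) -> List[PeerResponse]:
    """Channel-generation rule: keep only autonomous / automatic-mode peers."""
    return [r for r in responses if r.autonomous]


def replies_are_safe(replies: List[PeerReply]) -> bool:
    """Crude stand-in for the reply-analysis step: every involved peer must
    have confirmed a behavior compatible with the planned maneuver."""
    return bool(replies) and all(r.yields_way for r in replies)


def perform_maneuver(responses: List[PeerResponse], replies: List[PeerReply]) -> str:
    if not eligible_peers(responses):
        return "no eligible peer responded; notify the driver, do not rely on V2V"
    if replies_are_safe(replies):
        return "send the confirmation notice, then execute the maneuver autonomously"
    return "hold the maneuver and re-evaluate the peers' behavior"


if __name__ == "__main__":
    responses = [PeerResponse("vehicle-B", autonomous=True, speed_mps=16.7)]
    replies = [PeerReply("vehicle-B", yields_way=True)]
    print(perform_maneuver(responses, replies))
```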
SELECTED DRAWING: Figure 1|1. A motor vehicle comprising: a wireless communication part having a transmission/reception function for wireless communication between vehicles; a vehicle-to-vehicle communication request means for transmitting, through the wireless communication part, a vehicle-to-vehicle communication request to surrounding other vehicles when the timing has come at which a predetermined other vehicle influence behavior, namely a behavior that affects the behavior of a surrounding other vehicle, should be carried out; a means for receiving, through the wireless communication part, response information sent from the other vehicle in response to the vehicle-to-vehicle communication request, the response information containing information indicating that the other vehicle is an automatic driving vehicle that performs its own behavior autonomously or is in an automatic operation mode in which its own behavior is performed autonomously; a determination means for discriminating, from the received response information, whether the other vehicle is an automatic driving vehicle that performs its own behavior autonomously or is in the automatic operation mode; a communication channel generation means for generating a communication channel with the other vehicle discriminated, based on the determination result of the determination means, as being such an automatic driving vehicle or as being in the automatic operation mode; and a behavior plan information transmission means for transmitting to the other vehicle, by the wireless communication part through the communication channel generated by the communication channel generation means, behavior plan information for notifying the other vehicle of the other vehicle influence behavior that is due to be performed.
| 2. The motor vehicle according to claim 1, wherein the behavior plan information contains the present position information of the own vehicle and information regarding the travel path related to the other vehicle influence behavior.
| 3. The motor vehicle according to claim 1 or claim 2, further comprising: a means for receiving, through the wireless communication part, reply information sent from the other vehicle in response to the behavior plan information; a reply information analysis means for analyzing the received reply information and analyzing the behavior of the other vehicle with respect to the other vehicle influence behavior; a behavior determination means for determining the behavior of the own vehicle based on the analysis result of the reply information analysis means; and a behavior control means for executing the behavior of the own vehicle determined by the behavior determination means.
| 4. The motor vehicle according to claim 3, further comprising a confirmation means for notifying the other vehicle, through the communication channel, of a confirmation notice indicating that the behavior of the other vehicle determined in response to the behavior plan information has been confirmed, when it is judged that the other vehicle influence behavior can be performed safely given the behavior of the other vehicle obtained as the analysis result of the reply information analysis means.
| 5. The motor vehicle according to claim 3, wherein the behavior of the other vehicle that is involved in the other vehicle influence behavior of the own vehicle is specified from among the behaviors of other vehicles detected as the analysis result of the reply information analysis means, the motor vehicle further comprising a confirmation means for notifying the other vehicle, through the communication channel, of a confirmation notice indicating that the behavior of the other vehicle determined in response to the behavior plan information has been confirmed, when it is judged that the other vehicle influence behavior can be performed safely given the specified other vehicle behavior.
| 6. The motor vehicle according to any one of claims 3 to 5, wherein the behavior determination means determines, as the behavior of the own vehicle, a behavior of notifying a driver of the behavior of the other vehicle obtained as the analysis result of the reply information analysis means.
| 7. The motor vehicle according to any one of claims 3 to 5, which is an automatic driving vehicle capable of autonomous traveling, wherein the behavior determination means determines, in accordance with the behavior of the other vehicle obtained as the analysis result of the reply information analysis means, a behavior for automatic driving that enables the other vehicle influence behavior to be performed safely.
| 8. The motor vehicle according to any one of claims 3 to 5, which is provided with a manual operation mode and an automatic operation mode capable of autonomous traveling, further comprising an operation mode determination means for discriminating which of the manual operation mode and the automatic operation mode is selected, wherein the behavior determination means: when the operation mode determination means discriminates that the manual operation mode is selected, determines, as the behavior of the own vehicle, a behavior of notifying the driver of the behavior of the other vehicle obtained as the analysis result of the reply information analysis means; and when the operation mode determination means discriminates that the automatic operation mode is selected, determines, in accordance with the behavior of the other vehicle obtained as the analysis result of the reply information analysis means, a behavior for automatic driving that enables the other vehicle influence behavior to be performed safely.
| 9. During execution of the behavior determined by the said behavior determination means by the said behavior control means, the switching in manual operation mode from automatic driving|operation mode is prohibited,
The motor vehicle of Claim 8 characterized by the above-mentioned.
| 10. The said behavior determination means acquires the information of the travel speed of the said other vehicle in the vehicle-to-vehicle communications between the said other vehicles,
Behavior of the said own vehicle is determined based on the travel speed of the said other vehicle detected from the acquired said information, and the travel speed of the own vehicle,
The motor vehicle in any one of the Claims 3-9 characterized by the above-mentioned.
| 11. The motor vehicle according to any one of claims 3 to 10, comprising a camera which photographs the surroundings of the own vehicle, wherein the behavior determination means recognizes a traffic signal and/or a traffic sign from a captured image of the camera, determines the traffic regulations around the own vehicle based on the recognition result, and determines the behavior of the own vehicle in consideration of the determined traffic regulations.
| 12. The motor vehicle according to any one of claims 3 to 11, comprising a camera which photographs the surroundings of the own vehicle, wherein the behavior determination means determines the surrounding conditions of the own vehicle from a captured image of the camera and determines the behavior of the own vehicle in consideration of the determined surrounding conditions.
| 13. The motor vehicle according to any one of claims 1 to 12, wherein the behavior plan information contains information by which the other vehicle can identify the own vehicle, and the reply information from the other vehicle contains information by which the own vehicle can identify the other vehicle.
| 14. The motor vehicle according to claim 13, wherein the information by which the other vehicle can identify the own vehicle contains current position information of the own vehicle and feature information by which the own vehicle can be identified from a captured camera image, and the information by which the own vehicle can identify the other vehicle contains current position information of the other vehicle and feature information by which the other vehicle can be identified from a captured camera image.
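The mutual-identification claims above (claims 13 and 14) amount to a small message schema: each side shares its current position plus camera-recognizable feature information so that the peer can tell which physical vehicle it is talking to. The following Python sketch only illustrates what such payloads might look like; every field name (visual_features, route_points, and so on) and the JSON encoding are assumptions of this sketch, not details taken from the patent.

```python
# Illustrative payloads for the mutual-identification exchange of claims 13-14.
# All field names and the serialization format are assumptions.
from dataclasses import dataclass, field, asdict
import json


@dataclass
class VehicleIdentity:
    """Lets the peer pick this vehicle out of the surrounding traffic."""
    latitude: float            # current position (e.g. from GNSS)
    longitude: float
    visual_features: dict = field(default_factory=dict)  # e.g. {"body_color": "white", "type": "sedan"}


@dataclass
class BehaviorPlanInfo:
    """Sent by the requesting vehicle over the generated channel."""
    sender: VehicleIdentity                       # lets the other vehicle identify the sender
    behavior: str                                 # e.g. "merge_right", "reverse_into_parking"
    route_points: list = field(default_factory=list)  # travel route related to the behavior


@dataclass
class ReplyInfo:
    """Returned by the receiving (autonomous) vehicle."""
    sender: VehicleIdentity                       # lets the requester identify the replier
    planned_behavior: str                         # e.g. "slow_down_and_yield"


def encode(msg) -> bytes:
    """Serialize a message for the wireless link (one plausible encoding)."""
    return json.dumps(asdict(msg)).encode("utf-8")
```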
| 15. The motor vehicle according to any one of claims 1 to 14, wherein whether the timing at which the predetermined other-vehicle influence behavior should be performed has arrived is determined based on operation of a turn indicator, operation of a hazard lamp, a reverse driving operation, sudden acceleration, or sudden braking.
| 16. The motor vehicle according to any one of claims 1 to 15, comprising a vehicle-to-vehicle communication request button, wherein, when the vehicle-to-vehicle communication request button is operated, the vehicle-to-vehicle communication request means transmits a vehicle-to-vehicle communication request to surrounding other vehicles through the wireless communication part.
| 17. The motor vehicle according to any one of claims 1 to 16, comprising a car navigation system function part in which, among the other-vehicle influence behaviors included in route guidance data, the other-vehicle influence behavior for which the vehicle-to-vehicle communication request should be transmitted can be set, wherein the vehicle-to-vehicle communication request is transmitted when the other-vehicle influence behavior so set is detected.
| 18. The motor vehicle according to any one of claims 1 to 17, comprising a car navigation system function part having a function which, when a route to a destination is searched for, sets, as the position at which the behavior plan information for notifying the other vehicle of the other-vehicle influence behavior is transmitted to surrounding other vehicles through the wireless communication part, either the same position as the position at which the other-vehicle influence behavior should be performed or a predetermined position short of it.
| 19. The motor vehicle according to any one of claims 1 to 18, wherein, when a disconnection request for the communication channel is received from the other vehicle to which the behavior plan information was transmitted through the communication channel generated by the communication channel generation means, the communication channel with the other vehicle is disconnected.
| 20. The motor vehicle according to any one of claims 1 to 19, comprising a means for determining whether execution of the other-vehicle influence behavior has been completed, wherein, when it is determined that execution of the other-vehicle influence behavior has been completed, a notification of completion of execution is transmitted to the other vehicle through the communication channel, and the communication channel is then disconnected.
| 21. A motor vehicle which is an automatic driving vehicle that performs behaviors of the own vehicle autonomously, comprising:
a wireless communication part having a transmission/reception function for performing wireless communication between vehicles;
a monitoring means which monitors, through the wireless communication part, reception of a vehicle-to-vehicle communication request from another vehicle;
a means which, when the monitoring means receives the vehicle-to-vehicle communication request from the other vehicle, transmits through the wireless communication part response information containing information indicating that the own vehicle is an automatic driving vehicle that performs behaviors of the own vehicle autonomously;
a plan information reception means which receives, through a radio channel generated based on the response information, behavior plan information from the other vehicle that sent the vehicle-to-vehicle communication request, the behavior plan information notifying an other-vehicle influence behavior that the other vehicle is due to perform as a behavior affecting the behavior of surrounding other vehicles;
an influence judgment means which analyzes the received behavior plan information and judges whether the other-vehicle influence behavior has an influence on the behavior of the own vehicle; and
a behavior determination means which determines a behavior of the own vehicle based on the judgment result of the influence judgment means.
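Claim 21 describes the receiving side of the protocol as a chain of means: monitor for a request, answer that the vehicle is autonomous, accept a channel, receive the behavior plan, judge whether it affects the own vehicle, and decide a behavior (claims 24 and 25 add the disconnect/reply branches). The sketch below strings those steps together purely as a reading aid; the radio and channel objects, their method names, the message keys, and the influence test are all assumptions of this sketch, not the patent's implementation.

```python
# A simplified, single-threaded sketch of the receiving-side flow of claim 21.
class AutonomousReceiver:
    def __init__(self, radio, planner):
        self.radio = radio        # assumed object with receive()/send()/open_channel()
        self.planner = planner    # assumed own-vehicle behavior planner

    def run_once(self):
        request = self.radio.receive()                      # monitoring means
        if request is None or request.get("type") != "v2v_request":
            return
        # Respond that this vehicle drives itself autonomously.
        self.radio.send({"type": "response", "autonomous": True})
        channel = self.radio.open_channel(request["sender_id"])  # channel based on the response
        plan = channel.receive()                            # plan information reception means
        if self._affects_own_vehicle(plan):                 # influence judgment means
            behavior = self.planner.decide(plan)            # behavior determination means
            channel.send({"type": "reply", "behavior": behavior})
        else:
            channel.close()                                 # no influence: drop the channel (cf. claim 24)

    def _affects_own_vehicle(self, plan) -> bool:
        # Placeholder test; a real check would compare the announced route
        # with the own vehicle's planned trajectory.
        return plan.get("route_overlaps_own_path", False)
```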
| 22. A motor vehicle provided with an automatic driving mode in which behaviors of the own vehicle are performed autonomously and a manual driving mode, comprising:
a wireless communication part having a transmission/reception function for performing wireless communication between vehicles;
a monitoring means which monitors, through the wireless communication part, reception of a vehicle-to-vehicle communication request from another vehicle;
a means which, when the monitoring means receives the vehicle-to-vehicle communication request from the other vehicle and the own vehicle is in the automatic driving mode, transmits through the wireless communication part response information containing information indicating that the own vehicle is in the automatic driving mode;
a plan information reception means which receives, through a radio channel generated based on the response information, behavior plan information from the other vehicle that sent the vehicle-to-vehicle communication request, the behavior plan information notifying an other-vehicle influence behavior that the other vehicle is due to perform as a behavior affecting the behavior of surrounding other vehicles;
an influence judgment means which analyzes the behavior plan information and judges whether the other-vehicle influence behavior has an influence on the behavior of the own vehicle; and
a behavior determination means which determines a behavior of the own vehicle based on the judgment result of the influence judgment means.
| 23. The motor vehicle according to claim 22, wherein switching from the automatic driving mode to the manual driving mode is disabled while vehicle-to-vehicle communication with the other vehicle is being performed.
| 24. The motor vehicle according to any one of claims 21 to 23, wherein, when the influence judgment means judges that the other-vehicle influence behavior does not affect the behavior of the own vehicle, the behavior determination means disconnects the communication channel generated by the communication channel generation means.
| 25. The motor vehicle according to any one of claims 21 to 24, wherein, when the influence judgment means judges that the other-vehicle influence behavior has an influence on the behavior of the own vehicle, the behavior determination means determines a behavior of the own vehicle corresponding to the behavior plan information based on the analysis result of the behavior plan information analysis means, and reply information containing information on the determined behavior of the own vehicle is sent to the other vehicle through the communication channel.
| 26. The motor vehicle according to any one of claims 21 to 25, wherein the behavior plan information contains current position information of the other vehicle and information on the travel route related to the other-vehicle influence behavior, and the influence judgment means judges, from the current position information of the other vehicle and the information on the travel route related to the other-vehicle influence behavior contained in the behavior plan information, whether the travel of the own vehicle is affected.
| 27. The motor vehicle according to any one of claims 21 to 26, wherein the communication channel is disconnected when an execution completion notification of the other-vehicle influence behavior is received from the other vehicle.
| 28. The motor vehicle according to any one of claims 21 to 27, wherein the behavior determination means acquires information on the travel speed of the other vehicle in the vehicle-to-vehicle communication with the other vehicle, and determines the behavior of the own vehicle based on the travel speed of the other vehicle detected from the acquired information and the travel speed of the own vehicle.
| 29. The motor vehicle according to any one of claims 21 to 28, comprising a camera which photographs the surroundings of the own vehicle, wherein the behavior determination means recognizes a traffic signal and/or a traffic sign from a captured image of the camera, determines the traffic regulations around the own vehicle based on the recognition result, and determines the behavior of the own vehicle in consideration of the determined traffic regulations.
| 30. The motor vehicle according to any one of claims 21 to 29, comprising a camera which photographs the surroundings of the own vehicle, wherein the behavior determination means determines the surrounding conditions of the own vehicle from a captured image of the camera and determines the behavior of the own vehicle in consideration of the determined surrounding conditions.
| 31. The motor vehicle according to any one of claims 25 to 30, wherein the behavior plan information contains information for identifying the other vehicle, and the reply information contains information by which the own vehicle can be identified.
| 32. The motor vehicle according to claim 31, wherein the information for identifying the other vehicle contains current position information of the other vehicle and feature information by which the other vehicle can be identified from a captured camera image, and the information for identifying the own vehicle contains current position information of the own vehicle and feature information by which the own vehicle can be identified from a captured camera image.
| 33. A motor vehicle characterized by having each part and/or each means of the motor vehicle according to any one of claims 1 to 20 and each part and/or each means of the motor vehicle according to any one of claims 21 to 32.
| 34. The motor vehicle according to any one of claims 1 to 33, wherein the other-vehicle influence behavior differs depending on whether the vehicle is for left-hand traffic or right-hand traffic.
| 35. The motor vehicle according to claim 34, comprising a current position detection means, wherein the vehicle determines left-hand traffic or right-hand traffic according to the country or area identified based on the current position detected by the current position detection means, and the other-vehicle influence behavior is set according to the determination result.
| 36. The motor vehicle according to any one of claims 1 to 28, wherein the other-vehicle influence behavior is set based on the presence or absence of a traffic signal.
| 37. The motor vehicle according to claim 36, comprising a camera which photographs the surroundings of the own vehicle, wherein the presence or absence of the traffic signal is determined from a captured image of the camera.
| 38. The motor vehicle according to claim 36, comprising a communication means which receives radio waves from a traffic signal, wherein the presence or absence of the traffic signal is determined from the information in the radio waves received by the communication means.
| 39. The motor vehicle according to any one of claims 1 to 38, wherein the other-vehicle influence behavior is set based on the priority relationship between vehicles.
| 40. The motor vehicle according to claim 39, comprising a camera which photographs the surroundings of the own vehicle, wherein the priority relationship between the vehicles is determined from a captured image of the camera.
| 41. The motor vehicle according to any one of claims 3 to 40, comprising a camera which photographs the surroundings of the own vehicle and/or a microphone which collects sound around the own vehicle, and a means for detecting an emergency vehicle from the captured image of the surroundings of the own vehicle taken by the camera and/or the sound around the own vehicle collected by the microphone, wherein the behavior determination means determines the behavior of the own vehicle so that priority is given to the emergency vehicle.
| 42. A program for a motor vehicle which causes a computer provided in a motor vehicle having a wireless communication part with a transmission/reception function for performing wireless communication between vehicles to function as:
a vehicle-to-vehicle communication request means which, when the timing arrives at which a predetermined other-vehicle influence behavior should be performed as a behavior affecting the behavior of surrounding other vehicles, transmits a vehicle-to-vehicle communication request to the surrounding other vehicles through the wireless communication part;
a means which receives, through the wireless communication part, response information sent from the other vehicle in accordance with the vehicle-to-vehicle communication request, the response information containing information indicating that the other vehicle is an automatic driving vehicle that performs behaviors of the own vehicle autonomously or is in an automatic driving mode in which behaviors of the own vehicle are performed autonomously;
a determination means which determines, from the received response information, whether the other vehicle is an automatic driving vehicle that performs behaviors of the own vehicle autonomously or is in the automatic driving mode in which behaviors of the own vehicle are performed autonomously;
a communication channel generation means which generates a communication channel with the other vehicle determined, based on the determination result of the determination means, to be an automatic driving vehicle that performs behaviors of the own vehicle autonomously or to be in the automatic driving mode in which behaviors of the own vehicle are performed autonomously; and
a behavior plan information transmission means which transmits to the other vehicle, by the wireless communication part through the communication channel generated by the communication channel generation means, behavior plan information for notifying the other vehicle of the other-vehicle influence behavior that is due to be performed.
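Claim 42 is the requesting-side program: broadcast a vehicle-to-vehicle communication request, keep only responders that report being autonomous vehicles or being in the automatic driving mode, open a channel to each, and push the behavior plan information. The sketch below is one hedged reading of that sequence; the radio API, message keys, and timeout handling are assumptions introduced for illustration.

```python
# A rough sketch of the requesting-side sequence of claim 42.
def announce_behavior(radio, behavior_plan, timeout_s=1.0):
    radio.broadcast({"type": "v2v_request"})        # vehicle-to-vehicle communication request means
    responses = radio.collect_responses(timeout_s)  # reception of response information

    channels = []
    for resp in responses:
        # Determination means: only autonomous vehicles or vehicles in automatic mode qualify.
        if resp.get("autonomous") or resp.get("mode") == "automatic":
            ch = radio.open_channel(resp["sender_id"])                 # communication channel generation means
            ch.send({"type": "behavior_plan", "plan": behavior_plan})  # behavior plan information transmission means
            channels.append(ch)
    return channels
```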
| 43. A program for a motor vehicle which is an automatic driving vehicle that performs behaviors of the own vehicle autonomously, the program causing a computer provided in the motor vehicle, which has a wireless communication part with a transmission/reception function for performing wireless communication between vehicles, to function as:
a monitoring means which monitors, through the wireless communication part, reception of a vehicle-to-vehicle communication request from another vehicle;
a means which, when the monitoring means receives the vehicle-to-vehicle communication request from the other vehicle, transmits through the wireless communication part response information containing information indicating that the own vehicle is an automatic driving vehicle that performs behaviors of the own vehicle autonomously;
a plan information reception means to receive the behavior plan information for notifying the other-vehicle influence behavior which is due to be performed as a behavior which affects the behavior of the other surrounding veh... | The vehicle has a communication channel generation part for generating a communication channel between a vehicle determined based on a determination result of a determination part and another vehicle determined to be in a state of an automatic driving mode in which the behavior of the own vehicle is performed autonomously. A behavior plan information transmission part transmits behavior plan information from the wireless communication part to the latter vehicle through the communication channel generated by the communication channel generation part, for notifying the vehicle influence behavior. An INDEPENDENT CLAIM is also included for a program comprising a set of instructions for operating a motor vehicle. Motor vehicle. The behavior plan information transmission part transmits the behavior plan information to the vehicle through the communication channel for notifying the vehicle influence behavior so as to avoid traffic accidents reliably, thus improving the safety of the motor vehicle. The drawing shows a block diagram of an electronic control circuit unit of a motor vehicle. (Drawing includes non-English language text) 101 Control part, 102 Wireless communication part, 105 Manual/automatic operation mode switching control part, 107 Camera group, 108 Sensor group
Please summarize the input | The program for a motor vehicle, and motor vehicles. PROBLEM TO BE SOLVED: To provide an automobile capable of reliably avoiding a traffic accident and safely performing an another-vehicle-affecting behavior when such a behavior is to be performed.
SOLUTION: When the timing has come at which an another-vehicle-affecting behavior, predetermined as a behavior affecting the behavior of other vehicles in the vicinity, should be performed, an inter-vehicle communication request is transmitted to the other vehicles in the vicinity by means of a wireless communication unit. On receipt of response information, answering the inter-vehicle communication request, from an autonomous vehicle that autonomously performs its own behavior or from another vehicle placed in an autonomous mode in which the own vehicle behavior is autonomously performed, a communication channel is generated with respect to that other vehicle. Behavior schedule information needed to notify the other vehicle of the another-vehicle-affecting behavior that is scheduled to be performed is then transmitted over the communication channel. Return information sent from the other vehicle in response is analyzed in order to confirm the behavior of the other vehicle determined according to the behavior schedule information, and processing is performed in line with the confirmed behavior of the other vehicle.
SELECTED DRAWING: Figure 1 | 1. A motor vehicle comprising:
a wireless communication part having a transmission/reception function for performing wireless communication between vehicles;
a vehicle-to-vehicle communication request means which, when the timing arrives at which a predetermined other-vehicle influence behavior should be performed as a behavior affecting the behavior of surrounding other vehicles, transmits a vehicle-to-vehicle communication request to the surrounding other vehicles through the wireless communication part before performing the other-vehicle influence behavior;
a communication channel generation means which generates a communication channel with an automatic driving vehicle that performs behaviors of the own vehicle autonomously, or with another vehicle in a state of an automatic driving mode in which behaviors of the own vehicle are performed autonomously, that has sent response information in accordance with the vehicle-to-vehicle communication request;
a behavior plan information transmission means which transmits, by the wireless communication part through the generated communication channel, behavior plan information for notifying the surrounding other vehicle of the other-vehicle influence behavior that is due to be performed;
a reply information analysis means which analyzes reply information sent from the other vehicle in response to the behavior plan information; and
a confirmation means which confirms, based on the analysis result of the reply information analysis means, the behavior of the other vehicle determined in response to the behavior plan information,
wherein processing corresponding to the behavior of the other vehicle confirmed by the confirmation means is performed.
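Claim 1 of this record adds a confirm-before-acting step on the requesting side: the replies are analyzed, the behavior each other vehicle has decided on is confirmed, and processing then follows that confirmed behavior. The sketch below illustrates one possible shape of that loop; the behavior labels, the "safe" whitelist, and the callback parameters are invented for illustration only and are not taken from the patent.

```python
# A minimal sketch of the confirm-before-acting loop described in claim 1.
def execute_with_confirmation(channels, perform_behavior, notify_driver):
    confirmed = []
    for ch in channels:
        reply = ch.receive()                    # reply information analysis means
        confirmed.append(reply.get("behavior")) # confirmation means: record what each peer will do
    # Processing according to the confirmed behavior: act, or fall back to warning the driver.
    if confirmed and all(b in ("slow_down_and_yield", "stop", "keep_distance") for b in confirmed):
        perform_behavior()                      # every peer yields, so the planned manoeuvre proceeds
    else:
        notify_driver(confirmed)                # otherwise inform the driver of the peers' behavior
```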
| 2. The motor vehicle according to claim 1, which is an automatic driving vehicle capable of autonomous travel, comprising a behavior control means which performs control of the behavior of the own vehicle based on the behavior of the other vehicle confirmed by the confirmation means.
| 3. The motor vehicle according to claim 1, which is provided with a manual driving mode and an automatic driving mode capable of autonomous travel, comprising a behavior control means which, in the automatic driving mode, performs control of the behavior of the own vehicle based on the behavior of the other vehicle confirmed by the confirmation means.
| 4. The motor vehicle according to claim 3, wherein switching from the automatic driving mode to the manual driving mode is prohibited while a behavior is being executed by the behavior control means.
| 5. The motor vehicle according to any one of claims 2 to 4, comprising a behavior determination means which acquires information on the travel speed of the other vehicle in the vehicle-to-vehicle communication with the other vehicle and determines the behavior of the own vehicle based on the travel speed of the other vehicle detected from the acquired information and the travel speed of the own vehicle, wherein the behavior control means executes the behavior determined by the behavior determination means.
| 6. The motor vehicle according to any one of claims 2 to 5, comprising a camera which photographs the surroundings of the own vehicle, and a behavior determination means which recognizes a traffic signal and/or a traffic sign from a captured image of the camera, determines the traffic regulations around the own vehicle based on the recognition result, and determines the behavior of the own vehicle in consideration of the determined traffic regulations.
| 7. The motor vehicle according to any one of claims 2 to 6, comprising a camera which photographs the surroundings of the own vehicle, and a behavior determination means which determines the surrounding conditions of the own vehicle from a captured image of the camera and determines the behavior of the own vehicle in consideration of the determined surrounding conditions.
| 8. The motor vehicle according to any one of claims 1 to 7, comprising a notification means which notifies a driver of the behavior of the other vehicle confirmed by the confirmation means.
| 9. The motor vehicle according to any one of claims 1 to 8, wherein the behavior plan information contains information by which the other vehicle can identify the own vehicle, and the reply information from the other vehicle contains information by which the own vehicle can identify the other vehicle.
| 10. The motor vehicle according to any one of claims 1 to 9, wherein the information by which the other vehicle can identify the own vehicle contains current position information of the own vehicle and feature information by which the own vehicle can be identified from a captured camera image, and the information by which the own vehicle can identify the other vehicle contains current position information of the other vehicle and feature information by which the other vehicle can be identified from a captured camera image.
| 11. The motor vehicle according to any one of claims 1 to 10, wherein whether the timing at which the predetermined other-vehicle influence behavior should be performed has arrived is determined based on operation of a turn indicator, operation of a hazard lamp, a reverse driving operation, sudden acceleration, or sudden braking.
| 12. The motor vehicle according to any one of claims 1 to 11, comprising a car navigation system function part having a function which, when a route to a destination is searched for, sets, as the position at which the behavior plan information for notifying the other vehicle of the other-vehicle influence behavior is transmitted to surrounding other vehicles through the wireless communication part, either the same position as the position at which the other-vehicle influence behavior should be performed or a predetermined position short of it.
| 13. A motor vehicle which is an automatic driving vehicle that performs behaviors of the own vehicle autonomously, comprising:
a wireless communication part having a transmission/reception function for performing wireless communication between vehicles;
a monitoring means which monitors, through the wireless communication part, reception of a vehicle-to-vehicle communication request from another vehicle;
a communication channel generation means which, when the monitoring means receives the vehicle-to-vehicle communication request from the other vehicle, generates a communication channel with the other vehicle that transmitted the vehicle-to-vehicle communication request;
a behavior plan information analysis means which analyzes behavior plan information sent from the other vehicle through the generated communication channel, the behavior plan information notifying an other-vehicle influence behavior that affects the behavior of surrounding other vehicles;
a behavior determination means which determines a behavior of the own vehicle corresponding to the behavior plan information based on the analysis result of the behavior plan information analysis means; and
a reply information transmission means which sends reply information containing information on the behavior of the own vehicle determined by the behavior determination means to the other vehicle through the communication channel.
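Claim 13 has the autonomous receiver determine its own cooperative behavior from the announced plan and report it back as reply information. The sketch below uses a deliberately crude speed-and-overlap rule, loosely echoing the travel-speed comparison of claim 18, just to make the data flow concrete; the rule itself, the message keys, and the behavior labels are assumptions, not the patented logic.

```python
# A sketch of the determine-and-reply path of claim 13.
def handle_behavior_plan(channel, plan, own_speed_mps):
    other_speed = plan.get("speed_mps", 0.0)       # travel speed shared over V2V (cf. claim 18)
    # Behavior determination means: yield if the announcing vehicle is slower
    # and will cut across our path; otherwise just keep distance.
    if plan.get("route_overlaps_own_path") and other_speed < own_speed_mps:
        own_behavior = "slow_down_and_yield"
    else:
        own_behavior = "keep_distance"
    # Reply information transmission means.
    channel.send({"type": "reply", "behavior": own_behavior})
    return own_behavior
```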
| 14. A motor vehicle provided with an automatic driving mode in which behaviors of the own vehicle are performed autonomously and a manual driving mode, comprising:
a wireless communication part having a transmission/reception function for performing wireless communication between vehicles;
a monitoring means which monitors, through the wireless communication part, reception of a vehicle-to-vehicle communication request from another vehicle;
a communication channel generation means which, when the monitoring means receives the vehicle-to-vehicle communication request from the other vehicle, determines whether the own vehicle is in the automatic driving mode and, if it is, generates a communication channel with the other vehicle that transmitted the vehicle-to-vehicle communication request;
a behavior plan information analysis means which analyzes behavior plan information sent from the other vehicle through the generated communication channel, the behavior plan information notifying an other-vehicle influence behavior that affects the behavior of surrounding other vehicles;
a behavior determination means which determines a behavior of the own vehicle corresponding to the behavior plan information based on the analysis result of the behavior plan information analysis means; and
a reply information transmission means which sends reply information containing information on the behavior of the own vehicle determined by the behavior determination means to the other vehicle through the communication channel.
| 15. The motor vehicle according to claim 13, wherein switching from the automatic driving mode to the manual driving mode is disabled while vehicle-to-vehicle communication with the other vehicle is being performed.
| 16. The motor vehicle according to any one of claims 13 to 15, wherein the behavior plan information contains current position information of the other vehicle and information on the travel route related to the other-vehicle influence behavior, the behavior plan information analysis means judges, from the current position information of the other vehicle and the information on the travel route related to the other-vehicle influence behavior contained in the behavior plan information, whether the travel of the own vehicle is affected, and, when it is judged that there is no influence, the communication channel generated by the communication channel generation means is disconnected.
| 17. The motor vehicle according to any one of claims 13 to 16, wherein the communication channel is disconnected when a completion notification of the other-vehicle influence behavior is received from the other vehicle.
| 18. The motor vehicle according to any one of claims 13 to 17, wherein the behavior determination means acquires information on the travel speed of the other vehicle in the vehicle-to-vehicle communication with the other vehicle, and determines the behavior of the own vehicle based on the travel speed of the other vehicle detected from the acquired information and the travel speed of the own vehicle.
| 19. The motor vehicle according to any one of claims 13 to 18, comprising a camera which photographs the surroundings of the own vehicle, wherein the behavior determination means recognizes a traffic signal and/or a traffic sign from a captured image of the camera, determines the traffic regulations around the own vehicle based on the recognition result, and determines the behavior of the own vehicle in consideration of the determined traffic regulations.
| 20. The motor vehicle according to any one of claims 13 to 19, comprising a camera which photographs the surroundings of the own vehicle, wherein the behavior determination means determines the surrounding conditions of the own vehicle from a captured image of the camera and determines the behavior of the own vehicle in consideration of the determined surrounding conditions.
| 21. The motor vehicle according to any one of claims 13 to 20, wherein the behavior plan information contains information for identifying the other vehicle, and the reply information contains information by which the own vehicle can be identified.
| 22. The motor vehicle according to claim 21, wherein the information for identifying the other vehicle contains current position information of the other vehicle and feature information by which the other vehicle can be identified from a captured camera image, and the information for identifying the own vehicle contains current position information of the own vehicle and feature information by which the own vehicle can be identified from a captured camera image.
| 23. A motor vehicle characterized by having each part of the motor vehicle according to any one of claims 1 to 12 and each part of the motor vehicle according to any one of claims 13 to 22.
| 24. The motor vehicle according to any one of claims 1 to 23, wherein the predetermined other-vehicle influence behavior differs depending on whether the vehicle is for left-hand traffic or right-hand traffic.
| 25. The motor vehicle according to claim 24, comprising a current position detection means, wherein the vehicle determines left-hand traffic or right-hand traffic according to the country identified based on the current position detected by the current position detection means, and the other-vehicle influence behavior is set according to the determination result.
| 26. The motor vehicle according to any one of claims 1 to 25, wherein the predetermined other-vehicle influence behavior is set based on the presence or absence of a traffic signal.
| 27. The motor vehicle according to claim 26, comprising a camera which photographs the surroundings of the own vehicle, wherein the presence or absence of the traffic signal is determined from a captured image of the camera.
| 28. The motor vehicle according to claim 26, comprising a communication means which receives radio waves from a traffic signal, wherein the presence or absence of the traffic signal is determined from the information in the radio waves received by the communication means.
| 29. The motor vehicle according to any one of claims 1 to 28, wherein the predetermined other-vehicle influence behavior is set based on the priority relationship between vehicles.
| 30. The motor vehicle according to claim 29, comprising a camera which photographs the surroundings of the own vehicle, wherein the priority relationship between the vehicles is determined from a captured image of the camera.
| 31. The motor vehicle according to any one of claims 5 to 30, comprising a camera which photographs the surroundings of the own vehicle and/or a microphone which collects sound around the own vehicle, and a means for detecting an emergency vehicle from the captured image of the surroundings of the own vehicle taken by the camera and/or the sound around the own vehicle collected by the microphone, wherein the behavior determination means determines the behavior of the own vehicle so that priority is given to the emergency vehicle.
| 32. A program for a motor vehicle which causes a computer provided in a motor vehicle having a wireless communication part with a transmission/reception function for performing wireless communication between vehicles to function as:
a vehicle-to-vehicle communication request means which, when the timing arrives at which a predetermined other-vehicle influence behavior should be performed as a behavior affecting the behavior of surrounding other vehicles, transmits a vehicle-to-vehicle communication request to the surrounding other vehicles through the wireless communication part before performing the other-vehicle influence behavior;
a communication channel generation means which generates a communication channel with an automatic driving vehicle that performs behaviors of the own vehicle autonomously, or with another vehicle in a state of an automatic driving mode in which behaviors of the own vehicle are performed autonomously, that has sent response information in accordance with the vehicle-to-vehicle communication request;
a behavior plan information transmission means which transmits, by the wireless communication part through the generated communication channel, behavior plan information for notifying the surrounding other vehicle of the other-vehicle influence behavior that is due to be performed;
a reply information analysis means which analyzes reply information sent from the other vehicle in response to the behavior plan information;
a confirmation means which confirms, based on the analysis result of the reply information analysis means, the behavior of the other vehicle determined in response to the behavior plan information; and
a processing means which performs processing corresponding to the behavior of the other vehicle confirmed by the confirmation means.
| 33. The program for a motor vehicle according to claim 32, wherein the motor vehicle is an automatic driving vehicle that performs behaviors of the own vehicle autonomously, the program causing the computer to function, as the processing means, as a behavior control means which performs control of the behavior of the own vehicle based on the behavior of the other vehicle confirmed by the confirmation means.
| 34. The program for a motor vehicle according to claim 32, wherein the motor vehicle is provided with a manual driving mode and an automatic driving mode in which behaviors of the own vehicle are performed autonomously, the program causing the computer, while operating in the automatic driving mode, to function, as the processing means, as a behavior control means which performs control of the behavior of the own vehicle based on the behavior of the other vehicle confirmed by the confirmation means.
| 35. A program for a motor vehicle which is an automatic driving vehicle that performs behaviors of the own vehicle autonomously, the program causing a computer provided with a wireless communication part having a transmission/reception function for performing wireless communication between vehicles to function as:
a monitoring means which monitors, through the wireless communication part, reception of a vehicle-to-vehicle communication request from another vehicle;
a communication channel generation means which, when the monitoring means receives the vehicle-to-vehicle communication request from the other vehicle, generates a communication channel with the other vehicle that transmitted the vehicle-to-vehicle communication request;
a behavior plan information analysis means which analyzes behavior plan information sent from the other vehicle through the generated communication channel, the behavior plan information notifying an other-vehicle influence behavior that affects the behavior of surrounding other vehicles;
a behavior determination means which determines a behavior of the own vehicle corresponding to the behavior plan information based on the analysis result of the behavior plan information analysis means; and
a reply information transmission means which sends reply information containing information on the behavior of the own vehicle determined by the behavior determination means to the other vehicle through the communication channel.
| 36. A program for a motor vehicle which is provided with an automatic driving mode in which behaviors of the own vehicle are performed autonomously and a manual driving mode, the program causing a computer provided with a wireless communication part having a transmission/reception function for performing wireless communication between vehicles to function as:
a monitoring means which monitors, through the wireless communication part, reception of a vehicle-to-vehicle communication request from another vehicle;
a communication channel generation means which, when the monitoring means receives the vehicle-to-vehicle communication request from the other vehicle, determines whether the own vehicle is in the automatic driving mode and, if it is, generates a communication channel with the other vehicle that transmitted the vehicle-to-vehicle communication request;
a behavior plan information analysis means which analyzes behavior plan information sent from the other vehicle through the generated communication channel, the behavior plan information notifying an other-vehicle influence behavior that affects the behavior of surrounding other vehicles;
a behavior determination means which determines a behavior of the own vehicle corresponding to the behavior plan information based on the analysis result of the behavior plan information analysis means; and
a reply information transmission means which sends reply information containing information on the behavior of the own vehicle determined by the behavior determination means to the other vehicle through the communication channel. | The vehicle has a communication channel production unit that sends response information according to the vehicle-to-vehicle communication request and produces a communication channel with the automatic driving vehicle which performs the behavior of the own vehicle autonomously, or with another vehicle in a state of an automatic driving mode which performs the behavior of the own vehicle autonomously. A behavior plan information transmission unit transmits the behavior plan information for notifying the other-vehicle influence behavior that is due to be performed to a surrounding other vehicle by the wireless communication portion through the communication channel. A reply information analysis unit analyzes the reply information sent from the other vehicle corresponding to the behavior plan information. A confirmation unit confirms the behavior of the other vehicle determined corresponding to the behavior plan information based on the analysis result in the reply information analysis unit. An INDEPENDENT CLAIM is included for a program for a motor vehicle. Motor vehicle, such as a common four-wheel motor vehicle, a large-sized vehicle such as a bus, truck or tractor, a two-wheeled motor vehicle, a bicycle, a special vehicle such as a public-institution vehicle, or an electric wheelchair. Traffic accidents are avoided reliably. The drawing shows a schematic view of the motor vehicle. (Drawing includes non-English language text) 1 Automatic driving vehicle, 101 Control unit, 102 Wireless communication unit, 105 Manual/automatic operation mode switching control unit, 107 Camera group
Please summarize the input | Automobile and automotive program. PROBLEM TO BE SOLVED: To provide an automobile which can reliably avoid a traffic accident and safely perform an another-vehicle-affecting behavior when such a behavior is to be executed.
SOLUTION: An automobile comprises a manual driving mode and an automatic driving mode in which autonomous travel is possible. The automobile comprises: a radio communication unit having a transmission/reception function for performing radio communication between vehicles; a monitoring means for monitoring reception of an inter-vehicle communication request from another vehicle through the radio communication unit; and a communication path generation means for determining, when the inter-vehicle communication request from the other vehicle is received by the monitoring means, whether or not the own vehicle is in the automatic driving mode, and generating a communication path between the own vehicle and the other vehicle which transmitted the inter-vehicle communication request when the own vehicle is in the automatic driving mode.
SELECTED DRAWING: Figure 1 | 1. An automobile provided with a manual driving mode and an automatic driving mode capable of autonomous traveling, comprising: a radio communication part having a transmission/reception function for performing radio communication between vehicles; a monitoring means for monitoring reception of an inter-vehicle communication request from another vehicle through the radio communication part; and a communication path generation means which, when the monitoring means receives an inter-vehicle communication request from the other vehicle, determines whether or not the own vehicle is in the automatic driving mode and, when the vehicle is in the automatic driving mode, generates a communication path between the own vehicle and the other vehicle which has transmitted the inter-vehicle communication request.
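Claim 1 of this record gates channel creation on the driving mode: the communication path is generated only if the own vehicle is currently in the automatic driving mode (claim 2 makes the refusal in manual mode explicit, and claim 3 locks out a switch back to manual while communicating). A minimal sketch of that gate follows; the vehicle and radio objects and their method names are hypothetical stand-ins.

```python
# A sketch of the mode-gated communication path generation of claim 1.
def on_v2v_request(vehicle, radio, request):
    if vehicle.mode != "automatic":
        return None                                  # manual mode: no communication path (cf. claim 2)
    path = radio.open_channel(request["sender_id"])  # communication path generation means
    vehicle.lock_mode_switch()                       # cf. claim 3: no switch to manual during V2V
    return path
```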
| 2. The automobile according to claim 1, wherein the communication path is not generated when the own vehicle is in the manual driving mode.
| 3. The automobile according to claim 1 or claim 2, wherein switching from the automatic driving mode to the manual driving mode is disabled while inter-vehicle communication with the other vehicle is being performed.
| 4. The automobile further comprising: a behavior schedule information analysis means for analyzing behavior schedule information which is sent from the other vehicle through the communication path generated by the communication path generation means and which notifies an other-vehicle influence behavior affecting the behavior of surrounding other vehicles; a behavior determination means for determining, based on an analysis result of the behavior schedule information analysis means, a behavior of the own vehicle corresponding to the behavior schedule information; and a reply information transmitting means for transmitting reply information including information on the behavior of the own vehicle determined by the behavior determination means to the other vehicle through the communication path.
| 5. The automobile, wherein the behavior schedule information includes current position information of the other vehicle and information on a travel route related to the other-vehicle influence behavior, the behavior schedule information analysis means determines, from the current position information of the other vehicle and the information on the travel route related to the other-vehicle influence behavior included in the behavior schedule information, whether or not the travel of the own vehicle is affected, and, when it is determined that there is no influence, the communication path generated by the communication path generation means is cut off.
| 6. The automobile according to claim 4 or claim 5, wherein the behavior determination means acquires information on the traveling speed of the other vehicle in the inter-vehicle communication with the other vehicle, and determines the behavior of the own vehicle based on the traveling speed of the other vehicle detected from the acquired information and the traveling speed of the own vehicle.
| 7. The automobile according to any one of claims 4 to 6, comprising a camera for photographing the surroundings of the own vehicle, wherein the behavior determination means recognizes a traffic signal and/or a traffic sign from a photographed image of the camera, determines the traffic regulations around the own vehicle based on the recognition result, and determines the behavior of the own vehicle in consideration of the determined traffic regulations.
| 8. The automobile comprising a camera for photographing the surroundings of the own vehicle, wherein the behavior determination means determines the surrounding situation of the own vehicle from a photographed image of the camera and determines the behavior of the own vehicle in consideration of the determined surrounding situation.
| 9. The automobile according to any one of claims 4 to 8, wherein the behavior schedule information includes information for identifying the other vehicle, and the reply information includes information for identifying the own vehicle.
| 10. The automobile according to any one of claims 4 to 9, wherein the other-vehicle influence behavior differs depending on whether the vehicle is for left-hand traffic or right-hand traffic.
| 11. The automobile according to any one of claims 4 to 10, wherein the other-vehicle influence behavior is set based on the presence or absence of a traffic signal.
| 12. The automobile according to any one of claims 4 to 11, wherein the other-vehicle influence behavior is set based on the priority relationship between vehicles.
| 13. The automobile according to any one of claims 4 to 12, comprising a camera for photographing the surroundings of the own vehicle and/or a microphone for collecting sound around the own vehicle, and a means for detecting an emergency vehicle from a photographed image of the surroundings of the own vehicle taken by the camera and/or the sound around the own vehicle collected by the microphone, wherein the behavior determination means determines the behavior of the own vehicle giving priority to the emergency vehicle.
| 14. The automobile according to any one of claims 4 to 13, wherein the communication path is cut off when a completion notification of the other-vehicle influence behavior is received from the other vehicle.
| 15. When the own vehicle is in the manual operation mode state, the automobile is switched to the automatic operation mode, and the communication path is generated by the communication path generation means.
| 16. A navigation system is provided with a car navigation function part for forming route guide data including trigger information of inter-vehicle communication in response to a point on the route of a route guide which requires execution of other vehicle influence behavior which affects the behavior of the other vehicle around the vehicle. The automobile is described in one of claims 1 to 15.
| 17. During execution of a route guide using the route guide data by the car navigation function part, at a point on the route where the trigger information is detected, the automobile is described in a claim 16 characterized by making a vehicle-to-vehicle communication request to the other vehicle in the periphery through the radio communication part.
| 18. The automobile is provided with a camera for photographing the periphery of one's own vehicle, detects the recognition of a prescribed situation from the photographed image of the camera as trigger information for communication between vehicles, and makes a communication request between vehicles to other vehicles around the vehicle.
| 19. The prescribed situation is the automobile described in a claim 18 characterized by including the time of entering the intersection, the approach of the highway, or the rotary approach of the station.
| 20. The automobile is provided with a vehicle-to-vehicle communication request means for making a vehicle-to-vehicle communication request to other vehicles in the periphery through the radio communication part.
| 21. The automobile is provided with a vehicle-to-vehicle communication request button, and when the inter-vehicle communication request button is operated, the inter-vehicle communication request means performs inter-vehicle communication request to other vehicles around the vehicle through the radio communication part.
| 22. The inter-vehicle communication requesting means detects the reception of the radio wave information from the beacon installed on the road as trigger information of the inter-vehicle communication, and makes a vehicle-to-vehicle communication request to the other vehicle around the vehicle, and the automobile is described in the claim 20 or the claim 21.
| 23. The inter-vehicle communication requesting means detects at least one of a backlight lighting; a brake light lighting; a sudden accelerator; and a sudden brake as trigger information for inter-vehicle communication, and makes a vehicle-to-vehicle communication request to other vehicles around the vehicle. The automobile described in the claim 20 to claim 22 is characterized by the above.
| 24. The automobile is described in either of the claims 17 to 23, wherein a communication request for the vehicle is made to the other vehicle in the periphery, and when the response information is not received within a prescribed time from the other vehicle in the periphery, a message indicating that the other vehicle capable of communicating between vehicles is not present in the periphery is notified to the driver.
| 25. The automobile according to any one of claims 1 to 24, provided with: a means for receiving, through the radio communication part, response information to the vehicle-to-vehicle communication request from the other vehicle, the response information including information on whether the other vehicle is an automatic driving vehicle that performs its own behavior autonomously or is in an automatic operation mode in which it performs its own behavior autonomously; and a discrimination means for discriminating, from the received response information, whether the other vehicle is such an automatic driving vehicle or is in such an automatic operation mode; wherein the communication path generation means generates a communication path with the other vehicle that the discrimination means has determined to be an automatic driving vehicle or to be in the automatic operation mode.
| 26. A program for an automobile that has a manual operation mode and an automatic operation mode enabling autonomous traveling and that is provided with a radio communication part having a transmission/reception function for radio communication between vehicles, the program causing a computer of the automobile to function as: a monitoring means for monitoring, through the radio communication part, reception of a vehicle-to-vehicle communication request from another vehicle; and a communication path generation means which, when the monitoring means receives a vehicle-to-vehicle communication request from the other vehicle, determines whether or not the own vehicle is in the automatic operation mode and, if it is, generates a communication path with the other vehicle that transmitted the inter-vehicle communication request. | The vehicle has a manual driving mode and an automatic driving mode in which the automobile autonomously travels. A wireless communication unit (102) comprises a transmission/reception function and is configured to perform wireless communication between vehicles. A monitoring unit is configured to monitor reception of a vehicle-to-vehicle communication request from another vehicle through the wireless communication unit. A communication path generation unit is configured to determine whether the host vehicle is in the autonomous driving mode when the monitoring unit receives the inter-vehicle communication request from the other vehicle, and to generate a communication path with the other vehicle that has transmitted the inter-vehicle communication request when the host vehicle is in the autonomous driving mode. The communication path is not generated when the own vehicle is in the manual driving mode. An INDEPENDENT CLAIM is included for an automobile program. Usable for vehicles such as self-driving cars, electric vehicles, plug-in hybrid vehicles, gasoline vehicles, fuel cell vehicles, police cars, taxis and buses. The vehicle reliably avoids traffic accidents and ensures that the behavior is performed safely when attempting to behave in a manner that influences other vehicles. The drawing shows a block diagram of an electronic control circuit unit in a vehicle. (Drawing includes non-English language text) 10 Electronic control circuit unit; 100 System bus; 101 Control unit; 102 Wireless communication unit; 103 Motor drive control unit; 104 Steering drive control unit; 105 Manual/automatic driving mode switching control unit; 106 Radar group; 107 Camera group; 108 Sensor group; 109 Surrounding situation understanding unit; 110 Current position detection unit |
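The claims in the row above hinge on a simple gate: an inter-vehicle communication request is answered with a communication path only while the receiving car is in its automatic operation mode, and the path is torn down again once the other vehicle reports the requested behavior as complete. The sketch below only illustrates that gating logic under invented class and method names; it is not taken from the patent's implementation.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class DriveMode(Enum):
    MANUAL = auto()
    AUTOMATIC = auto()


@dataclass
class Vehicle:
    mode: DriveMode
    open_links: set = field(default_factory=set)  # ids of vehicles we talk to

    def on_v2v_request(self, sender_id: str) -> bool:
        """Accept an inter-vehicle communication request only in automatic mode."""
        if self.mode is not DriveMode.AUTOMATIC:
            return False               # manual mode: no communication path
        self.open_links.add(sender_id)  # "generate" a communication path
        return True

    def on_completion_notice(self, sender_id: str) -> None:
        """Tear the path down once the requested behavior is reported complete."""
        self.open_links.discard(sender_id)


if __name__ == "__main__":
    ego = Vehicle(mode=DriveMode.AUTOMATIC)
    assert ego.on_v2v_request("car-42")
    ego.on_completion_notice("car-42")
    assert not Vehicle(mode=DriveMode.MANUAL).on_v2v_request("car-42")
```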
Please summarize the input | Geographically disparate sensor fusion for enhanced target detection and identification in autonomous vehicles. Examples disclosed herein relate to an autonomous driving system in an ego vehicle. The autonomous driving system includes a radar system configured to detect and identify a target in a path and a surrounding environment of the ego vehicle. The autonomous driving system also includes a sensor fusion module configured to receive radar data on the identified target from the radar system and compare the identified target with one or more targets identified by a plurality of perception sensors that are geographically disparate from the radar system. Other examples disclosed herein include a method of operating the radar system in the autonomous driving system of the ego vehicle. What is claimed is:
| 1. An autonomous driving system in an ego vehicle, comprising:
a radar system configured to radiate one or more transmission radio frequency (RF) beams to a surrounding environment of the ego vehicle; and
a sensor fusion module configured to receive combined target identification information that includes at least radar data from the radar system and sensor data from a plurality of perception sensors that are geographically disparate from the ego vehicle, wherein the sensor fusion module includes one or more deep learning networks that are trained with the radar data and the sensor data for target identification.
| 2. The autonomous driving system of claim 1, wherein the radar system comprises a metamaterial antenna structure configured to radiate the one or more transmission RF beams and receive one or more return RF beams reflected from the surrounding environment, wherein the sensor fusion module is configured to send a control signal to the metamaterial antenna structure based on historical sensor data from the radar system, and wherein the control signal enables one or more metamaterial antenna cells in the metamaterial antenna structure to be directed.
| 3. The autonomous driving system of claim 2, wherein the control signal comprises an instruction to the metamaterial antenna structure to radiate additional transmission RF beams at a given phase shift and direction within at least a portion of a field-of-view corresponding to a location of a target identified by the radar system.
| 4. The autonomous driving system of claim 2, wherein the radar system comprises a perception module coupled to the metamaterial antenna structure, and wherein the perception module is configured to generate tracking information of an identified target with a multi-object tracker in the perception module.
| 5. The autonomous driving system of claim 4, wherein the multi-object tracker is configured to track the identified target over time using a Kalman filter.
| 6. The autonomous driving system of claim 4, wherein the perception module is further configured to generate target identification information based at least on the tracking information.
| 7. The autonomous driving system of claim 6, wherein the radar system is further configured to combine the target identification information with other target identification information from the plurality of perception sensors to form the combined target identification information.
| 8. The autonomous driving system of claim 7, wherein the radar system is further configured to send the combined target identification information to the sensor fusion module in the autonomous driving system.
| 9. The autonomous driving system of claim 7, wherein the sensor fusion module is further configured to receive the other target identification information over a vehicle-to-vehicle communication channel from the plurality of perception sensors.
| 10. The autonomous driving system of claim 7, wherein the sensor fusion module is further configured to generate enhanced target identification information from the combined target identification information, the enhanced target identification information including one or more adjustments to the identified target in terms of time and position relative to the ego vehicle.
| 11. The autonomous driving system of claim 10, wherein the sensor fusion module is further configured to determine a next control action for the metamaterial antenna structure based at least on the enhanced target identification information.
| 12. A radar system in an ego vehicle, comprising:
an antenna module comprising one or more metastructure antennas that are configured to radiate one or more transmission radio frequency (RF) beams to a surrounding environment of the ego vehicle and receive one or more return RF beams reflected from the surrounding environment of the ego vehicle, the antenna module having an antenna controller configured to dynamically control the one or more metastructure antennas; and
a perception module coupled to the antenna module and configured to detect and identify one or more targets from the one or more return RF beams, wherein the perception module has one or more deep learning networks that are trained with radar data in the radar system and lidar data from a lidar system in the ego vehicle and a plurality of lidar systems in other autonomous vehicles that are geographically disparate from the radar system.
| 13. The radar system of claim 12, wherein the perception module includes a target identification and decision module that is configured to:
receive a radar point cloud based at least on radar data from the antenna module;
process the radar point cloud to detect and identify the target; and
determine one or more control actions to be performed by the antenna module based on the detection and identification of the target.
| 14. The radar system of claim 13, wherein the perception module is further configured to generate tracking information of the identified target with a multi-object tracker in the perception module.
| 15. The radar system of claim 14, wherein the multi-object tracker is configured to compare one or more candidate targets identified by the target identification and decision module with targets that the multi-object tracker has detected in one or more prior segments of time.
| 16. A method of operating a radar system in an autonomous driving system of an ego vehicle, the method comprising:
directing a metamaterial antenna structure to generate one or more radio frequency (RF) beams with first antenna parameters and radiate the one or more RF beams to one or more targets in a surrounding environment of the ego vehicle;
providing radar data from one or more return RF beams that are reflected from the one or more targets to a sensor fusion module;
combining the radar data with other perception sensor information from a plurality of geographically disparate sensors to form fused sensor data in the sensor fusion module, wherein the sensor fusion module receives the other perception sensor information over a vehicle-to-vehicle communication channel from the plurality of geographically disparate sensors; and
generating enhanced target identification information from the fused sensor data with the sensor fusion module to determine a next control action for the metamaterial antenna structure.
| 17. The method of claim 16, further comprising:
identifying the one or more targets with a perception module in the ego vehicle;
generating tracking information of the identified one or more targets with a multi-object tracker in the perception module; and
generating target identification information based at least on the tracking information.
| 18. The method of claim 17, wherein the target identification information comprises one or more of a classification of the identified one or more targets, a location of the identified one or more targets, or a rate of movement of the identified one or more targets.
| 19. The method of claim 17, further comprising:
extracting a micro-doppler signal from the radar data with a micro-doppler module coupled to the metamaterial antenna structure;
providing the micro-doppler signal to the perception module; and
combining the tracking information provided by the multi-object tracker and the micro-doppler signal provided by the micro-doppler module to generate the target identification information.
| 20. The method of claim 17, wherein the enhanced target identification information provided by the sensor fusion module is used in training one or more deep learning networks of the perception module. | The system has a radar system that is configured to detect and identify a target in a path and a surrounding environment of an ego vehicle (800). A sensor fusion module (808) is configured to receive radar data on the identified target from the radar system. The identified target is compared with one or more targets identified by multiple perception sensors that are geographically disparate from the radar system. The radar system comprises a metamaterial antenna structure that is configured to radiate one or more transmission radio frequency (RF) beams to the target identified by the radar system and receive one or more return RF beams reflected from the target identified by the radar system and the surrounding environment. The sensor fusion module sends a control signal to the metamaterial antenna structure based on historical sensor data from the radar system. An INDEPENDENT CLAIM is included for a method of operating a radar system in an autonomous driving system of an ego vehicle. Autonomous driving system in ego vehicle, for providing partial or full automation of driving functions including steering, accelerating, braking, and monitoring the surrounding environment and driving conditions to respond to events, such as changing lanes or speed when needed to avoid traffic, crossing pedestrians, animals, etc. The geographically disparate sensor fusion enhances target detection and identification in the environment for an ego vehicle. By combining information from previous measurements, expected measurement uncertainties and some physical knowledge, the multi-object tracker generates robust, accurate estimates of target locations. By using the intelligent metamaterial (iMTM) radar system, the driver or driverless vehicle can maintain the maximum safe speed without regard to the weather conditions. The drawing shows a schematic diagram of the environment in which geographically disparate sensor fusion enhances target detection and identification for an ego vehicle. 800 Ego vehicle; 804 Ego lidar; 806 Ego radar; 808 Sensor fusion module; 812 Lead lidar |
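Claims 4, 5, 14 and 15 of the row above track each identified target over time with a multi-object tracker built around a Kalman filter. As a purely illustrative aid, the scalar filter below shows the predict/update cycle such a tracker would run per target along a single axis; it corrects position only, the noise values are arbitrary, and nothing here reflects the patent's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class Track1D:
    """Constant-velocity track along one axis (e.g. range to a target)."""
    x: float          # estimated position (m)
    v: float          # assumed velocity (m/s)
    p: float = 1.0    # position variance
    q: float = 0.1    # process noise added per prediction step
    r: float = 0.5    # measurement noise variance

    def predict(self, dt: float) -> None:
        # propagate the estimate forward in time; uncertainty grows
        self.x += self.v * dt
        self.p += self.q

    def update(self, z: float) -> None:
        # blend the prediction with a new range measurement z
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)


if __name__ == "__main__":
    track = Track1D(x=50.0, v=-2.0)   # target 50 m ahead, closing at 2 m/s
    for t, z in enumerate([48.1, 46.2, 43.9, 42.0], start=1):
        track.predict(dt=1.0)
        track.update(z)
        print(f"t={t}s  estimated range = {track.x:.2f} m")
```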
Please summarize the input | GEOGRAPHICALLY DISPARATE SENSOR FUSION FOR ENHANCED TARGET DETECTION AND IDENTIFICATION IN AUTONOMOUS VEHICLES. Examples disclosed herein relate to an autonomous driving system in an ego vehicle. The autonomous driving system includes a radar system configured to detect and identify a target in a path and a surrounding environment of the ego vehicle. The autonomous driving system also includes a sensor fusion module configured to receive radar data on the identified target from the radar system and compare the identified target with one or more targets identified by a plurality of perception sensors that are geographically disparate from the radar system. Other examples disclosed herein include a method of operating the radar system in the autonomous driving system of the ego vehicle. What is claimed is:
| 1. An autonomous control system, comprising:
a radar module;
a computer processing unit;
a memory storage device;
a sensor fusion module configured to receive sensor information from a plurality of sensors, the plurality of sensors including the radar module, the sensor fusion comprising:
target identification and decision module adapted to detect and identify targets and determine control actions for the autonomous control system;
target list and occupancy map adapted to track targets from sensor information; and
composite data repository storing information describing a field of view of the plurality of sensors;
a communication bus coupled to the sensor fusion module, the radar module and the plurality of sensors
wherein the sensor fusion module includes one or more deep learning networks that are trained with the radar data and the sensor data for target identification.
| 2. The autonomous control system of claim 1, wherein the radar module comprises an antenna structure configured to generate and steer one or more transmission RF beams and receive one or more return RF beams reflected from the surrounding environment.
| 3. The autonomous control system of claim 1, wherein the autonomous control system is adapted to control a vehicle.
| 4. The autonomous control system as in claim 3, wherein the sensor fusion operates to combine data from different perception sensors in the vehicle and data received from sensors in other geographically disparate vehicles to perceive an environment.
| 5. The autonomous control system as in claim 4, wherein the sensor fusion module is configured to send a control signal to at least one of the plurality of sensors based on historical sensor data from the radar module.
| 6. The autonomous control system as in claim 5, wherein the radar module comprises an antenna array and the control signal controls directionality of one or more antenna cells in the antenna array.
| 7. The autonomous control system of claim 6, wherein the control signal comprises an instruction to the antenna array to radiate transmission RF beams at a first phase shift and direction within at least a portion of the field of view corresponding to a location of a target identified by the radar system.
| 8. The autonomous control system of claim 1, wherein the sensor fusion module is adapted to receive information from sensor fusion modules in other autonomous control systems.
| 9. The autonomous control system of claim 8, wherein the received information from other autonomous control systems includes time stamps indicating a time of data collection and a location information.
| 10. The autonomous control system of claim 9, wherein the sensor fusion module is adapted to receive mapping information for application to sensor information.
| 11. The autonomous control system of claim 10, wherein the mapping information is used to track targets.
| 12. The autonomous control system of claim 11, wherein the radar module comprises a perception module configured to generate target identification information based on tracking information.
| 13. The autonomous control system as in claim 1, further comprising:
a communication module adapted to receive communications from other vehicles and infrastructure components.
| 14. A vehicle to vehicle (V2V) communication system in a vehicle, comprising:
a computer processing unit;
a sensor fusion module configured to receive information from vehicles, the information identifying targets in an environment of the vehicle; and
a communication bus for communication with modules within the system.
| 15. The V2V communication system as in claim 14, wherein the information is from a lead vehicle in a path of the vehicle.
| 16. The V2V communication system as in claim 15, further comprising a radar module adapted to provide range Doppler maps (RDM) to the sensor fusion module.
| 17. The V2V communication system as in claim 16, wherein the sensor fusion module receives point cloud data corresponding to the information and combines it with the RDM data from the radar module.
| 18. The V2V communication system as in claim 17, wherein the sensor fusion module comprises a perception module to identify targets.
| 19. The V2V communication system as in claim 18, wherein the information provided to the sensor fusion module is adapted to provide control decisions to avoid an accident.
| 20. The V2V communication system as in claim 14, wherein the information includes communications related to weather conditions in the environment. | The autonomous control system includes a sensor fusion module (108) receiving sensor information from a set of sensors, where the sensors include a radar module (106). The sensor fusion module comprises a target identification and decision module to detect and identify targets and determine control actions for the system. A target list and occupancy map tracks the targets from the sensor information. A composite data repository stores information describing a field of view of the sensors. A communication bus is coupled to the fusion module, the radar module and the sensors, and deep learning networks are included that are trained with radar data and sensor data for target identification. An INDEPENDENT CLAIM is included for a vehicle-to-vehicle (V2V) communication system in a vehicle. Autonomous control system for a vehicle. The method utilizes an autonomous driving system to automatically control driving functions such as steering, accelerating, braking and monitoring the surrounding environment and driving conditions to respond to events such as changing lanes or speed when needed to avoid traffic, crossing pedestrians and animals, and to detect and classify targets in a surrounding environment at the same or possibly even better level than humans. The drawing shows a schematic diagram of an environment in which geographically disparate sensor fusion in an ego vehicle enhances target detection and identification, employing the autonomous control system for a vehicle. 100 Ego Vehicle; 102 Camera; 104 Ego Vehicle Lidar; 106 Radar; 108 Sensor Fusion Module; 110 Lead Vehicle; 112 Lead Vehicle Lidar |
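Claims 14-17 of the row above describe a sensor fusion module that merges the ego vehicle's own radar output with point-cloud detections shared by a lead vehicle over V2V, with claims 9 and 10 adding timestamps and location information to what is exchanged. The toy merge step below assumes a shared clock, aligned vehicle headings and made-up field names; it is a sketch of the idea, not the claimed system.

```python
from dataclasses import dataclass

MAX_AGE_S = 0.5  # assumed threshold: drop remote detections older than this


@dataclass
class Detection:
    x: float        # metres, in the sender's local frame (sender at origin)
    y: float
    stamp: float    # seconds on an assumed shared clock


def fuse(ego_dets, remote_dets, remote_pos, now):
    """Combine ego detections with detections shared by another vehicle.

    Remote points are shifted by the sender's known position so everything
    ends up in the ego frame; stale messages are discarded.
    """
    fused = [(d.x, d.y) for d in ego_dets]
    dx, dy = remote_pos
    for d in remote_dets:
        if now - d.stamp <= MAX_AGE_S:
            fused.append((d.x + dx, d.y + dy))
    return fused


if __name__ == "__main__":
    ego = [Detection(12.0, 0.5, stamp=10.00)]
    lead = [Detection(8.0, -0.2, stamp=10.02), Detection(3.0, 1.0, stamp=8.0)]
    print(fuse(ego, lead, remote_pos=(25.0, 0.0), now=10.05))
```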
Please summarize the input | Method for operating a vehicle. The invention relates to a method for operating a vehicle (1) in an automatic driving mode. According to the invention, the space behind the vehicle (1) is continuously monitored; a warning prompt (W) is sent to the rear vehicle (2) in the event that the rear vehicle (2) is detected to be approaching the vehicle (1) in a risky manner; and, if the rear vehicle (2) continues to approach the vehicle (1) in a risky manner after a predetermined period of time has elapsed, a takeover request for the driving task is sent to a vehicle user of the vehicle (1).|1. A method for operating a vehicle (1) in an automatic driving mode, comprising the following steps: continuously monitoring the rear space of the vehicle (1); sending a warning prompt (W) to the rear vehicle (2) in the case where the rear vehicle (2) is detected to approach the vehicle (1) in a risky manner; and issuing a takeover request for a driving task to a vehicle user of the vehicle (1) in the case where the rear vehicle (2) continues to approach the vehicle (1) in a risky manner after a predetermined period of time.
| 2. The method according to claim 1, wherein, if the driving task is not taken over within a further preset period of time, the current driving speed of the vehicle (1) is reduced by a preset value until the rear vehicle (2) no longer approaches the vehicle (1) in a risky manner.
| 3. The method according to claim 2, wherein, after the further time period, the vehicle (1) is maneuvered as follows: in right-hand traffic, the vehicle moves into the right-hand edge area of the lane (F1); in left-hand traffic, the vehicle moves into the left-hand edge area of the lane (F2).
| 4. The method according to any one of said claims, wherein the warning prompt (W) is sent to the rear vehicle (2) in the form of text information on the display unit (6) arranged at the tail of the vehicle (1).
| 5. The method according to any one of said claims, wherein the warning prompt (W) is transmitted to the rear vehicle (2) by means of vehicle-to-vehicle communication.
| 6. The method according to any one of said claims, wherein the vehicle (1) waits for at least 30 seconds before sending the takeover request. | The method involves continuously monitoring a vehicle rear area of the vehicle (1). A warning (W) is output to the following vehicle (2) in the event that the following vehicle approaches the vehicle critically. A takeover request relating to a driving task is issued to a vehicle user of the vehicle in the event that the following vehicle continues to approach the vehicle critically after a predefinable period of time. Method for operating a vehicle, e.g. a motor vehicle, in automated driving operation. The risk of an accident between the vehicle and the following vehicle can be reduced. The following vehicle is prompted to keep an adequate distance to the vehicle, so that the distance can be increased and the driver of the following vehicle has the opportunity to react to an unexpected braking maneuver of the vehicle. The safety distance between the two vehicles is speed-dependent, since the braking distance of the following vehicle increases with increasing driving speed. The acceptance of the automated driving operation of vehicles is increased. By reducing the driving speed of the vehicle, the risk of ghost targets being detected can also be reduced; the reduced speed signals to the following vehicle an offer to overtake the vehicle, or points out again that the following vehicle has approached the vehicle critically and would therefore not be able to react to an abrupt braking maneuver of the vehicle. The drawing shows a schematic view of a roadway section with two opposing lanes, the automated vehicle, the following vehicle and the ghost object when using a method for reducing a rear-end collision. 1, 2 Vehicles; 4 Sensor system; 4.1 Camera; 5 False object; W Warning |
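The row above defines an escalation sequence for a rear vehicle that keeps approaching critically: warn it immediately, issue a takeover request after a predefinable time, and reduce the driving speed stepwise if the driver does not react. The function below sketches that ladder with invented threshold values and action labels; the 30-second figure and other numbers in the claims are not reproduced here.

```python
def rear_guard_action(critical: bool, seconds_critical: float,
                      takeover_after_s: float = 5.0,
                      slowdown_after_s: float = 10.0) -> str:
    """Escalation ladder for a critically approaching rear vehicle.

    Thresholds are placeholders, not values taken from the patent.
    """
    if not critical:
        return "none"
    if seconds_critical >= slowdown_after_s:
        return "reduce_speed_stepwise"      # nobody reacted: open up the gap
    if seconds_critical >= takeover_after_s:
        return "request_takeover"           # ask the vehicle user to take over
    return "warn_rear_vehicle"              # e.g. text on a rear display / V2V


if __name__ == "__main__":
    for t in (0.0, 2.0, 6.0, 12.0):
        print(t, rear_guard_action(critical=True, seconds_critical=t))
```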
Please summarize the input | Method and system for assisting driving and/or automatic driving of a vehicle. The present invention relates to a method for assisting driving and/or automatic driving of a vehicle, comprising: judging whether the vehicle is close to an intersection within a predetermined distance (S1); if the vehicle is close to the intersection within the predetermined distance, obtaining the driving intention of the vehicle and judging whether it is to go straight (S2); if it is to go straight, detecting whether there is another vehicle having a length greater than or equal to a predetermined length on the left-turning lane in front of the crosswalk at the intersection (S3); and, if such another vehicle is detected, sending warning information to the driver of the vehicle and/or to the outside of the vehicle (S4). The invention also relates to a system and a computer program product for executing the method. According to the invention, a collision with a pedestrian suddenly appearing from in front of the other vehicle can be avoided when the vehicle drives through the crosswalk, so that accidents caused by pedestrians attempting to cross without right of way are prevented and driving safety is improved.|1. A method for assisted driving and/or automatic driving of a vehicle, the method comprising: step S1: judging whether the vehicle (1) is close to an intersection within a predetermined distance; step S2: if the vehicle (1) is close to the intersection within the predetermined distance, obtaining the driving intention of the vehicle (1) at the intersection and judging whether the driving intention of the vehicle (1) is to go straight; step S3: if the driving intention of the vehicle (1) is to go straight, detecting whether there is another vehicle (2) having a length greater than or equal to a predetermined length on the left-turning lane in front of the crosswalk at the intersection; and step S4: if another vehicle (2) having a length greater than or equal to a predetermined length is detected on the left-turning lane in front of the crosswalk at the intersection, sending warning information to the driver of the vehicle (1) and/or to the outside of the vehicle (1).
| 2. The method according to claim 1, wherein the method further comprises: step S5: If another vehicle (2) having a length greater than or equal to a predetermined length is detected on the left turning lane in front of the crosswalk at the crossroad, the braking module (13) of the vehicle (1) is placed in a braking preparation state.
| 3. The method according to any one of the preceding claims, wherein the method further comprises: step S6: If another vehicle (2) having a length greater than or equal to a predetermined length is detected on the left-turning lane in front of the crosswalk at the crossroad, the running speed of the vehicle (1) is reduced to a predetermined speed.
| 4. The method according to any one of the preceding claims, wherein the method further comprises: step S7: if another vehicle (2) having a length greater than or equal to a predetermined length is detected on the left-turning lane in front of the crosswalk at the intersection, requesting, via vehicle-to-vehicle communication (V2V), front image data from the image sensor of the other vehicle (2); step S8: detecting, by image processing of the front image data, whether there is a vulnerable road user (3) on the crosswalk in front of the other vehicle (2); and step S9: controlling the vehicle (1) to brake if a vulnerable road user (3) is present on the crosswalk in front of the other vehicle (2).
| 5. The method according to any one of the preceding claims, wherein the warning module (14) sends the warning information optically and/or acoustically to the driver of the vehicle (1) and/or to the outside of the vehicle (1), in particular to a vulnerable road user (3) possibly present on the crosswalk at the intersection; and/or the warning module (14) comprises a vehicle horn device, vehicle headlights, a vehicle voice device, an instrument panel, a head-up display and/or a central control display; and/or the warning information is sent acoustically to the outside of the vehicle (1), in particular to a vulnerable road user (3) possibly present on the crosswalk at the intersection, through the vehicle horn device; and/or the warning information is sent optically to the outside of the vehicle (1), in particular to a vulnerable road user (3) possibly present on the crosswalk at the intersection, by the vehicle headlights, for example as flashing light and/or a projected red area; and/or the warning information is sent acoustically to the driver of the vehicle (1) through the vehicle voice device; and/or the warning information is sent optically to the driver of the vehicle (1) through the instrument panel, the head-up display and/or the central control display.
| 6. The method according to any one of the preceding claims, wherein the sensing and fusion module (11) is used to detect whether another vehicle (2) having a length greater than or equal to a predetermined length is present on the left-turning lane in front of the crosswalk at the intersection, in particular by performing image processing on the image data recorded by an image sensor; and/or the image processing is implemented by means of an object classifier and/or a convolutional neural network trained, for example, with a sufficient number of relevant images; and/or the sensing and fusion module (11) comprises an image sensor, a radar and/or a lidar; and/or the image sensor is, for example, a vehicle-mounted camera.
| 7. The method according to any one of the preceding claims, wherein status information of other vehicles in the surroundings of the vehicle (1) is obtained by vehicle-to-vehicle communication (V2V) in order to detect whether another vehicle (2) having a length greater than or equal to a predetermined length is present on the left-turning lane in front of the crosswalk at the intersection; and/or the other vehicle (2) is a large vehicle such as a truck, a bus and/or a box-type vehicle; and/or the vulnerable road user (3) is, for example, a pedestrian and/or a cyclist.
| 8. The method according to any one of the preceding claims, wherein the vehicle position information and the high-precision map information in the navigation module (12) are used to judge whether the vehicle (1) is close to the intersection within the predetermined distance; and/or whether the vehicle (1) is close to the intersection within the predetermined distance is judged by the sensing and fusion module (11) performing image processing on the image data recorded by an image sensor; and/or the driving intention of the vehicle (1) at the intersection is obtained from the planned route information in the navigation module (12).
| 9. A system (100) for assisted driving and/or automatic driving of a vehicle, the system (100) being used for executing the method according to any one of claims 1 to 8, wherein the system (100) comprises at least one of the following components: a sensing and fusion module (11) configured to identify information about the surroundings of the vehicle (1); a navigation module (12) configured to obtain the position information of the vehicle (1) and the planned route information of the driver, the navigation module (12) being associated with a high-precision map unit in which high-precision map information is stored, such as the position information of traffic lanes, the traffic lane types and the traffic rule information corresponding to the traffic lane types; a braking module (13) configured to perform braking and/or deceleration of the vehicle (1); a warning module (14) configured to send warning information optically and/or acoustically to the driver of the vehicle (1) and/or to the outside of the vehicle (1); a communication module (15) configured to obtain status information and/or image data of other vehicles in the surroundings of the vehicle (1) through vehicle-to-vehicle communication (V2V); and a control module (16) configured to process the information of each module and to generate corresponding control signals from the processed information.
| 10. A computer program product, such as a computer-readable program carrier, comprising computer program instructions, wherein the computer program instructions, when executed by a processor, implement the steps of the method according to any one of claims 1 to 8. | The method involves determining whether a vehicle is close to an intersection within a predetermined distance. A driving intention of the vehicle at the intersection is obtained, and it is determined whether the driving intention is to go straight. Another vehicle having a length greater than or equal to a predetermined length is detected on a left-turning lane in front of a crosswalk at the intersection. Warning information is sent to a driver of the vehicle and/or to the outside of the vehicle. A braking module of the vehicle is placed in a braking preparation state. The running speed of the vehicle is reduced to a predetermined speed. INDEPENDENT CLAIMS are also included for: (1) a system for assisted driving and automatic driving of the vehicle; and (2) a computer program product comprising a set of instructions of the system for assisted driving and automatic driving of the vehicle. Method for assisted driving and automatic driving of a vehicle where a large vehicle, such as a truck, bus or box-type vehicle, is waiting for a green light on the left-turning lane. A collision with a pedestrian suddenly appearing from in front of the other vehicle can be avoided when the vehicle drives through the crosswalk, so that accidents caused by pedestrians attempting to cross without right of way are prevented and driving safety is improved. The drawing shows a flow diagram illustrating the method for assisted driving and automatic driving of the vehicle. (The drawing includes non-English language text.) |
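The method of the row above boils down to a guarded decision at intersections: only when the ego vehicle is close to the crossing, intends to go straight, and a long vehicle is waiting on the left-turn lane are the warnings, brake pre-charge, speed reduction and V2V image request triggered. A minimal sketch of that guard, with assumed distance and length thresholds that do not come from the patent, is shown below.

```python
from typing import List, Optional


def crosswalk_caution(dist_to_intersection_m: float,
                      going_straight: bool,
                      left_lane_vehicle_len_m: Optional[float],
                      long_vehicle_threshold_m: float = 6.0,
                      trigger_dist_m: float = 80.0) -> List[str]:
    """Precautions suggested by the claims above when a long vehicle waits on
    the left-turn lane; thresholds and action names are invented for
    illustration only."""
    if dist_to_intersection_m > trigger_dist_m or not going_straight:
        return []                         # not approaching, or turning anyway
    if left_lane_vehicle_len_m is None or left_lane_vehicle_len_m < long_vehicle_threshold_m:
        return []                         # no vehicle long enough to hide a pedestrian
    return ["warn_driver_and_outside", "precharge_brakes",
            "reduce_speed", "request_front_image_via_v2v"]


if __name__ == "__main__":
    print(crosswalk_caution(45.0, True, left_lane_vehicle_len_m=12.0))
    print(crosswalk_caution(45.0, True, left_lane_vehicle_len_m=None))
```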
Please summarize the input | Method for improved implementation of a fully automated journey with a vehicle. The invention relates to a method for improving the implementation of fully automated driving with a vehicle, in which the fully automated driving vehicle (1) orients itself on a vehicle (13) driving in front. In a method in which problems in the surroundings sensor system of the vehicle are reliably handled during fully automated and autonomous driving operation, the fully automated and autonomously driving vehicle (1) couples, after detection of a condition that at least partially impairs the function of a surroundings sensor system (7, 9, 11), via a wireless coupling (15) to the vehicle (13) driving ahead and is guided by the vehicle (13) driving ahead until the at least partial functional impairment of the surroundings sensors (7, 9, 11) is remedied.|1. Method for improved implementation of a fully automated journey with a vehicle, in which the fully automated vehicle (1) orients itself on a vehicle (13) driving in front, characterized in that, after detection of a condition that at least partially impairs the function of a surroundings sensor system (7, 9, 11) arranged in the fully automated and autonomously driving vehicle (1), the fully automated and autonomously driving vehicle (1) connects via a wireless coupling (15) to the vehicle (13) driving ahead and is guided by the vehicle (13) driving ahead until the at least partial functional impairment of the surroundings sensors (7, 9, 11) has been remedied.
| 2. The method according to claim 1, characterized in that the wireless coupling takes place via vehicle-to-vehicle communication (15).
| 3. The method according to claim 1 or 2, characterized in that the at least partially function-impairing condition of the forward-facing surroundings sensors (7, 9, 11) in the fully automated and autonomously driving vehicle (1) consists in a lack of calibration of the surroundings sensors (7, 9, 11), and the autonomously driving vehicle (1) only starts its fully automated and autonomous journey when it recognizes the second fully automated vehicle (13), to which it couples until the surroundings sensors (7, 9, 11) are fully calibrated.
| 4. The method according to claim 1, 2 or 3, characterized in that the first vehicle (1) is released manually for fully automated and autonomous driving and, before starting, a vehicle system automatically checks whether the surroundings sensors (7, 9, 11) are sufficiently calibrated.
| 5. The method according to at least one of the preceding claims, characterized in that the first fully automated and autonomously driving vehicle (1) follows the preceding vehicle (13) at a predetermined distance.
| 6. The method according to at least one of the preceding claims, characterized in that the at least partially function-impaired state of the forward-facing surroundings sensors (7, 9, 11) in the first vehicle (1) consists in a blockage of a camera-guided system (7, 19), and the fully automated and autonomously driving vehicle (1), after detection of the blockage of the camera-guided system (7, 19), attaches itself to the vehicle (13) driving ahead until the at least partial functional impairment of the camera-guided system (7, 19) is completely lifted again.
| 7. The method according to claim 6, characterized in that, when the view of the camera-guided system (7, 19) is slightly restricted, the fully automated and autonomously driving vehicle (1) identifies a vehicle (13) driving ahead with the remaining vehicle-mounted, forward-facing surroundings sensors (9, 11) and follows that vehicle at a constant distance until the slightly restricted view of the camera-guided system (7, 19) is overcome.
| 8. The method according to claim 6, characterized in that, when a complete blockage of the camera-guided system (7, 19) is detected, the first fully automated and autonomously driving vehicle (1) is automatically brought to a standstill at a location that is safe with respect to traffic, and a vehicle backend is informed of the exact position of the vehicle. | The method involves connecting the fully automated and autonomously driving vehicle (1) to the vehicle (13) driving in front by a wireless coupling (15) after detection of a partially function-impairing condition of a surroundings sensor system (7, 9, 11) arranged in the fully automated and autonomously driving vehicle, the vehicle being guided by the vehicle driving in front until the partial functional impairment of the surroundings sensors is remedied. Method for improved implementation of a fully automated journey with a vehicle, such as a fully automated and autonomous commercial vehicle parked in the parking lot of a central goods transhipment point. Problems of the surroundings sensor system of the vehicle are reliably handled during fully automated and autonomous driving operation. The vehicle with the disturbed surroundings sensors is reliably kept in lane, so that situations that endanger traffic are largely prevented. Collisions between the two vehicles can be reliably avoided. The drawing shows a schematic view illustrating the process for improved implementation of a fully automated journey with a vehicle. 1 Autonomously driving vehicle; 3 Parking lot; 7, 9, 11 Environment sensor system; 13 Vehicle; 15 Wireless coupling |
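The row above couples a vehicle with degraded or uncalibrated forward sensors to a vehicle driving ahead and lets it follow at a fixed distance until the impairment is resolved, or stops it safely if no guide is available. The sketch below illustrates that idea with a hypothetical mode selector and a plain proportional gap controller; gains, distances and function names are assumptions, not the patent's control law.

```python
def guidance_mode(camera_ok: bool, calibrated: bool, lead_available: bool) -> str:
    """Pick how the vehicle should be guided given its own sensor health."""
    if camera_ok and calibrated:
        return "drive_on_own_sensors"
    if lead_available:
        return "couple_to_lead_vehicle"      # follow the vehicle in front
    return "stop_at_safe_location"           # and inform the vehicle backend


def follow_speed(ego_speed_mps: float, gap_m: float,
                 target_gap_m: float = 20.0, kp: float = 0.3,
                 max_speed_mps: float = 22.0) -> float:
    """Proportional gap controller: speed up when the gap to the guiding
    vehicle grows, slow down when it shrinks. Gains are placeholders."""
    cmd = ego_speed_mps + kp * (gap_m - target_gap_m)
    return max(0.0, min(cmd, max_speed_mps))


if __name__ == "__main__":
    print(guidance_mode(camera_ok=False, calibrated=True, lead_available=True))
    print(round(follow_speed(15.0, gap_m=26.0), 2))  # gap too large -> speed up a little
```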
Please summarize the input | How autonomous vehicles work. The present invention relates to a method for driving an autonomous vehicle (1) in the area of an intersection (K). According to the present invention, another vehicle (2) having right-of-way at the intersection (K) is detected and its intended route is calculated; based on the intended route, if it is calculated that the other vehicle (2) has to pass next to the vehicle (1) after passing the intersection (K), it is calculated whether the path of the other vehicle (2) is blocked by an obstacle; and, if it is calculated that the path of the other vehicle (2) is blocked by an obstacle, a communication connection with the other vehicle (2) is established, through which the vehicle (1) informs the other vehicle (2) that the route is blocked.|1. A method for operating an autonomous vehicle (1) in the area of an intersection (K), in which another vehicle (2) having right-of-way at the intersection (K) is detected and its intended route is calculated; based on the intended route, if it is calculated that the other vehicle (2) has to pass next to the vehicle (1) after passing the intersection (K), it is calculated whether the path of the other vehicle (2) is blocked by an obstacle; and, when it is calculated that the path of the other vehicle (2) is blocked by an obstacle, a communication connection with the other vehicle (2) is established, through which the vehicle (1) informs the other vehicle (2) that the path is blocked.
| 2. Method according to claim 1, characterized in that the order in which the two vehicles (1, 2) pass the intersection (K) is adjusted.
| 3. The method according to claim 1 or 2, characterized in that it is calculated whether the view of the other vehicle (2) onto the obstacle blocking the route is blocked by the vehicle (1).
| 4. The method according to claim 1 or 2, characterized in that the remaining lane width between the obstacle blocking the route and the vehicle (1), and the width of the other vehicle (2), are calculated in order to determine the occurrence of a blockage situation.
| 5. A method according to claim 4, characterized in that the remaining lane width is compared with the width of another vehicle (2) in order to calculate the occurrence of a blockage situation.
| 6. The method according to claim 1 or 2, characterized in that the intended route of the other vehicle (2) is calculated on the basis of an activated turn signal (8) and/or information transmitted via vehicle-to-vehicle communication.
| 7. The method according to claim 6, characterized in that the width of the other vehicle (2) and the signal of the activated turn signal (9) are transmitted from the other vehicle (2) to the vehicle (1) via vehicle-to-vehicle communication in order to check the plausibility of the width of the other vehicle (2).
| 8. The method according to claim 1 or 2, characterized in that the other vehicle (2) is operated driverlessly in autonomous driving mode. | The method involves detecting a further vehicle (2) with right of way at the intersection (K). On the basis of its intended route, it is determined whether the further vehicle has to drive past the vehicle (1) after passing the intersection, and whether the route of the further vehicle is blocked by an obstacle. If the route of the further vehicle is blocked by the obstacle, a communication connection to the further vehicle is established, through which the vehicle informs the further vehicle about the blocked route. Method for operating an automated vehicle in the area of an intersection. The traffic flow can be optimized, since the occurrence of a blockage situation can be counteracted. The plausibility of the route of the further vehicle and the remaining lane width between the vehicle and the obstacle can be checked. The drawing shows a schematic view of the intersection area without traffic signs regulating the right of way, light signal systems or police officers. 1 Vehicle; 2 Further vehicle; 7 First parked vehicle; 8 Activated direction indicator; K Intersection |
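The core check in the row above is geometric: the other vehicle cannot pass next to the ego vehicle if the lane width remaining beside the obstacle is smaller than that vehicle's width, in which case a V2V link is opened to report the blocked route. The helper below sketches that comparison with an assumed safety margin; names and values are illustrative only.

```python
def blockage_expected(remaining_lane_width_m: float,
                      other_vehicle_width_m: float,
                      margin_m: float = 0.5) -> bool:
    """The other vehicle cannot pass if the gap left beside the obstacle is
    narrower than its width plus a safety margin (margin is an assumption)."""
    return remaining_lane_width_m < other_vehicle_width_m + margin_m


def decide(remaining_lane_width_m: float, other_width_m: float) -> str:
    if blockage_expected(remaining_lane_width_m, other_width_m):
        return "open_v2v_link_and_report_blocked_route"
    return "proceed_normally"


if __name__ == "__main__":
    print(decide(2.1, other_width_m=2.5))   # too narrow -> warn via V2V
    print(decide(3.4, other_width_m=2.5))
```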
Please summarize the input | Method for determining an action strategy of a vehicle driving in automated driving operation. The invention relates to a method for determining an action strategy of a vehicle (1) driving in automated driving operation for the situation in which the vehicle approaches a vehicle in front (2) that is stationary in the lane (F) of the vehicle. According to the invention, the vehicle (1) stops behind the stationary vehicle in front (2) and waits to continue its driving operation until the vehicle in front (2) starts off; if the waiting time of the vehicle (1) exceeds a predefined waiting time, the vehicle (1) initiates a passing maneuver for passing the vehicle in front (2), requests assistance from a teleoperator and/or outputs a take-over request for the driving task to a vehicle user of the vehicle (1).|1. A method for determining an action strategy of a vehicle (1) driving in automated driving operation when approaching a vehicle in front (2) that is stationary in its lane (F), wherein the vehicle (1) stops behind the stationary vehicle in front (2) and waits to continue its driving operation until the vehicle in front (2) starts off, and wherein, when the waiting time of the vehicle (1) exceeds a predetermined waiting time, the vehicle (1) initiates an overtaking maneuver to pass the vehicle in front (2), requests assistance from a remote operator and/or issues a takeover request for the driving task to a vehicle user of the vehicle (1).
| 2. The method according to claim 1, wherein the waiting time is predetermined according to the environment condition of the vehicle (1).
| 3. The method according to claim 1 or 2, wherein the surroundings are checked for an indication that the vehicle in front (2) forms the end of a queue of other stopped vehicles (3).
| 4. The method according to claim 3, wherein, when such an indication is found, a correction value for the waiting time is determined as a function of the indication found, and the waiting time of the vehicle (1) is extended by the correction value.
| 5. The method according to any one of said claims, wherein the distance between the vehicle (1) and the intersection (K) is considered when the waiting time is determined.
| 6. The method according to claim 5, wherein when the distance to the intersection (K) is lower than a predetermined threshold, it is assumed that there is a queue of other vehicles (3) in front of the front vehicle (2).
| 7. The method according to any one of said claims, wherein the waiting time is determined according to the available traffic information of the vehicle (1).
| 8. The method according to any one of claims 1 to 6, wherein information about the queue length and/or the expected waiting time of the waiting other vehicles (3) is sent to the vehicle (1) by vehicle-to-vehicle communication and/or vehicle-to-infrastructure communication. | The method involves allowing a vehicle (1) to stop behind the stationary vehicle in front (2) and to wait to continue driving until the vehicle in front starts moving. The vehicle is allowed to initiate an overtaking maneuver for overtaking the vehicle in front, to request assistance from a teleoperator and/or to output a takeover request for the driving task to a vehicle user of the vehicle when the vehicle waits longer than a predetermined waiting time. The distance of the vehicle to an intersection (K) is taken into account when determining the waiting time. Method for determining an action strategy of a vehicle driving in automated driving mode when approaching a vehicle standing in its lane. The action strategy of the vehicle driving in automated driving mode can be determined. The accuracy with which such queues of vehicles are displayed can be improved in real time. The drawing shows a schematic view illustrating the traffic situation with the vehicle driving towards a queue at an intersection in automated driving mode. 1 Vehicle; 1.1 Sensor system; 2 Vehicle in front; H Stop line; K Intersection |
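The row above extends the waiting budget behind a stationary vehicle whenever there are indications of a queue - an explicit hint from the surroundings or from V2X data, or simply a short distance to the next intersection - before escalating to an overtake, teleoperator request or handover. The sketch below captures that rule with invented time and distance constants.

```python
def effective_wait_s(base_wait_s: float,
                     queue_hint: bool,
                     dist_to_intersection_m: float,
                     queue_bonus_s: float = 20.0,
                     near_intersection_m: float = 50.0) -> float:
    """Extend the waiting budget when the stopped vehicle ahead looks like the
    tail of a queue. All numeric values are placeholders, not patent values."""
    wait = base_wait_s
    if queue_hint or dist_to_intersection_m < near_intersection_m:
        wait += queue_bonus_s   # correction value for a presumed queue
    return wait


def action_after(elapsed_s: float, budget_s: float) -> str:
    if elapsed_s <= budget_s:
        return "keep_waiting"
    return "overtake_or_ask_teleoperator_or_handover"


if __name__ == "__main__":
    budget = effective_wait_s(30.0, queue_hint=False, dist_to_intersection_m=35.0)
    print(budget, action_after(45.0, budget))
```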
Please summarize the input | Positioning system and method for parking and waiting for an automatic driving vehicle on a road. The invention relates to the technical field of automatic driving, and specifically to a positioning system for parking and waiting on a road for an automatic driving vehicle, comprising: an obtaining unit for obtaining the front environment information of the automatic driving vehicle on the road and the side environment information of the adjacent lane; a judging unit for judging, according to the front environment information, whether the vehicle needs to stop ahead; a determining unit for determining the parking place of the automatic driving vehicle according to the front environment information; an optimizing unit for optimizing the parking place according to the side environment information; and a control unit for controlling the automatic driving vehicle to automatically drive to the optimized parking place and wait there. The invention further relates to a positioning method for an automatic driving vehicle to park and wait on a road, a computer program product and a corresponding automatic driving vehicle. Embodiments of the invention improve the parking safety and efficiency of the automatic driving vehicle and enable it to better adapt to complex and variable urban traffic environments.|1. A positioning system (1000) for an automatic driving vehicle (ADV) to stop and wait on a road (S), wherein the positioning system (1000) at least comprises: an obtaining unit (100) configured to obtain the front environment information of the automatic driving vehicle (ADV) on the road (S) and the side environment information of the adjacent lane (S'); a judging unit (200) configured to judge, on the basis of the front environment information, whether the vehicle needs to stop and wait ahead; a determining unit (300) configured to determine the parking place of the automatic driving vehicle (ADV) according to the front environment information when the judging result is that the vehicle needs to stop ahead; an optimizing unit (400) configured to optimize the parking place according to the side environment information; and a control unit (500) configured to control the automatic driving vehicle (ADV) to automatically travel to the optimized parking place and wait there.
| 2. The positioning system (1000) according to claim 1, wherein the optimizing unit (400) is further configured to optimize, according to the side environment information, the travel speed at which the automatic driving vehicle (ADV) should travel to the parking place, and the control unit (500) is further configured to control the automatic driving vehicle (ADV) to automatically travel to the optimized parking place at the optimized travel speed and wait there; and/or the front environment information comprises a traffic light (L) ahead on the road (S) and its signal state and/or a parking indication line (P); and/or the side environment information comprises a large vehicle (V) on the adjacent lane (S') together with the blind area (B) and/or the line-of-sight state of its driver (F); and/or the obtaining unit (100) comprises a forward camera (101) and/or a lateral camera (102) and/or a V2X communication module (103) arranged on the automatic driving vehicle (ADV).
| 3. A positioning method (2000) for parking and waiting on a road (S) by an automatic driving vehicle (ADV), wherein the positioning method (2000) at least comprises the following steps: S100: obtaining the front environment information of the automatic driving vehicle (ADV) on the road (S) and the side environment information of the adjacent lane (S '); S200: according to the front environment information, judging whether it is necessary to stop and wait in the front; S300: determining the parking place of the automatic driving vehicle (ADV) according to the front environment information when the judging result is that the vehicle needs to be parked in the front; S400: optimizing the parking place according to the side environment information; S500: controlling the automatic driving vehicle (ADV) to automatically travel to the optimized parking place for waiting, wherein the positioning method (2000) is carried out by means of the positioning system (1000) according to claim 1 or 2.
| 4. The positioning method (2000) according to claim 3, wherein the positioning method (2000) further comprises the following steps: S401: optimizing the driving speed of the automatic driving vehicle (ADV) to the parking place according to the side environment information; S501: The automatic driving vehicle (ADV) is controlled to automatically travel to an optimized parking place for waiting at an optimized travel speed.
| 5. The positioning method (2000) according to claim 3 or 4, wherein the positioning method (2000) further comprises the following steps: S210: judging, according to the front environment information, whether there is a traffic light (L) ahead on the road (S); S310: when the judging result is that there is a traffic light (L) ahead on the road (S), determining the parking place from the signal state of the traffic light (L) and the position of the parking indication line (P).
| 6. The positioning method (2000) according to claim 1, wherein the positioning method (2000) further comprises the following steps: S411: judging, according to the side environment information, whether there is a large vehicle (V) on the adjacent lane (S'); S412: obtaining the blind area (B) of the large vehicle (V) when the judging result is that there is a large vehicle (V) on the adjacent lane (S'); S413: optimizing the parking place of the automatic driving vehicle (ADV) according to the blind area (B) of the large vehicle (V).
| 7. The positioning method (2000) according to claim 1, wherein the positioning method (2000) further comprises the following steps: S421: obtaining a subsequent travel route of the large vehicle (V); S422: judging whether the parking place of the automatic driving vehicle (ADV) and the subsequent driving route of the large vehicle (V) have the possibility of collision; S423: obtaining the sight line state of the driver (F) of the large vehicle (V), especially the gaze direction of the driver (F) when the judging result is that the parking place of the automatic driving vehicle (ADV) has the possibility of collision with the subsequent driving route of the large vehicle (V); S424: judging whether the driver (F) of the large vehicle (V) pays attention to the automatic driving vehicle (ADV); S425: when the driver (F) of the large vehicle (V) does not notice the automatic driving vehicle (ADV), optimizing the parking place of the automatic driving vehicle (ADV) to the parking place which can be noticed by the driver (F) of the large vehicle (V).
| 8. The positioning method (2000) according to claim 7, wherein the positioning method (2000) further comprises the following steps: S431: obtaining the parking sequencing position of the automatic driving vehicle (ADV) on the road (S); S432: obtaining the parking sequencing position of the large vehicle (V) on the adjacent lane (S '); S433: judging whether the parking sequencing position of the automatic driving vehicle (ADV) on the road (S) and the parking sequencing position of the large vehicle (V) on the adjacent lane (S ') are the first position.
| 9. A computer program product comprising computer program instructions which, when executed by a processor, implement the positioning method (2000) according to any one of claims 1 to 8.
| 10. An automatic driving vehicle (ADV), comprising the positioning system (1000) according to claim 1 or 2 and/or the computer program product according to claim 9. | The positioning system (1000) has an obtaining unit (100) configured to obtain front environment information of an automatic driving vehicle (ADV) on a road (S) and side environment information of an adjacent lane. A judging unit (200) is configured to judge, on the basis of the front environment information, whether the vehicle needs to stop and wait ahead. A determining unit (300) is configured to determine a parking place of the automatic driving vehicle (ADV) according to the front environment information when the judging result is that the vehicle needs to stop ahead. An optimizing unit (400) is configured to optimize the parking place according to the side environment information. A control unit (500) is configured to control the automatic driving vehicle (ADV) to automatically travel to the optimized parking place and wait there. INDEPENDENT CLAIMS are also included for: (1) a positioning method for parking and waiting on the road (S) by the automatic driving vehicle (ADV); and (2) a computer-readable storage medium comprising a set of instructions for the positioning system. Positioning system for an automatic driving vehicle (ADV) to stop and wait on an urban road (claimed). The control unit controls the automatic driving vehicle to automatically drive to the optimized parking place and wait, which improves the parking safety and efficiency of the vehicle and enables it to better adapt to the complex and variable urban traffic environment. The drawing shows a schematic view of the positioning system for the automatic driving vehicle. 100 Obtaining unit; 200 Judging unit; 300 Determining unit; 400 Optimizing unit; 500 Control unit; 1000 Positioning system
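The row above first fixes a stop position from the traffic light and parking indication line ahead and then shifts it so that the ego vehicle does not end up inside the blind area of a large vehicle on the adjacent lane (or outside its driver's gaze). The sketch below reduces that optimization to one dimension along the ego lane, with the blind zone given as an assumed interval; a real system would derive it from the side camera and V2X data described in the claims.

```python
def choose_stop_position(stop_line_m: float,
                         blind_zone: tuple,
                         step_back_m: float = 1.0) -> float:
    """Pick the stop position closest to the stop line that lies outside the
    estimated blind zone of the large vehicle in the next lane. Positions are
    distances along the ego lane; the geometry is deliberately simplified."""
    lo, hi = blind_zone                 # blind zone as an interval along the lane
    pos = stop_line_m
    while lo <= pos <= hi:              # inside the blind zone -> fall back
        pos -= step_back_m
    return pos


if __name__ == "__main__":
    # Stop line at 30 m; the truck's mirror blind zone is assumed to cover 27-33 m.
    print(choose_stop_position(30.0, blind_zone=(27.0, 33.0)))   # -> 26.0
```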