Release 260111

Comma Device
2026-01-11 18:23:29 +08:00
commit 3721ecbf8a
2601 changed files with 855070 additions and 0 deletions

tools/replay/README.md Normal file

@@ -0,0 +1,148 @@
# Replay
`replay` allows you to simulate a driving session by replaying all messages logged during the use of openpilot. This provides a way to analyze and visualize system behavior as if it were live.
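Because replay publishes onto the standard openpilot services, anything built on `cereal.messaging` can consume the stream as if it were live. As a minimal sketch (assuming a replay, e.g. `tools/replay/replay --demo`, is already running in another terminal):
```python
# Minimal sketch: consume messages from a running replay as if they were live.
import cereal.messaging as messaging

sm = messaging.SubMaster(['carState'])
while True:
  sm.update()  # blocks until new data or timeout
  if sm.updated['carState']:
    print(f"vEgo: {sm['carState'].vEgo:.2f} m/s")
```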
## Setup
Before starting a replay, you need to authenticate with your comma account using `auth.py`. This will allow you to access your routes from the server.
```bash
# Authenticate to access routes from your comma account:
python3 tools/lib/auth.py
```
## Replay a Remote Route
You can replay a route from your comma account by specifying the route name.
```bash
# Start a replay with a specific route:
tools/replay/replay <route-name>
# Example:
tools/replay/replay 'a2a0ccea32023010|2023-07-27--13-01-19'
# Replay the default demo route:
tools/replay/replay --demo
```
## Replay a Local Route
To replay a route stored locally on your machine, specify the route name and provide the path to the directory where the route files are stored.
```bash
# Replay a local route
tools/replay/replay <route-name> --data_dir="/path_to/route"
# Example:
# If you have a local route stored at /path_to_routes with segments like:
# a2a0ccea32023010|2023-07-27--13-01-19--0
# a2a0ccea32023010|2023-07-27--13-01-19--1
# You can replay it like this:
tools/replay/replay "a2a0ccea32023010|2023-07-27--13-01-19" --data_dir="/path_to_routes"
```
## Send Messages via ZMQ
By default, replay sends messages via MSGQ. To switch to ZMQ, set the ZMQ environment variable.
```bash
# Start replay and send messages via ZMQ:
ZMQ=1 tools/replay/replay <route-name>
```
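On the receiving side, subscribers must opt into ZMQ as well, before any sockets are created. A minimal sketch following the same pattern used by `ui.py` and `rp_visualization.py` below (the IP address is a placeholder for the machine running replay):
```python
import os
import cereal.messaging as messaging

os.environ["ZMQ"] = "1"    # switch the transport from MSGQ to ZMQ
messaging.reset_context()  # must happen before any sockets are created

# placeholder address of the machine running `ZMQ=1 tools/replay/replay ...`
sm = messaging.SubMaster(['carState'], addr="192.168.1.10")
while True:
  sm.update()
  if sm.updated['carState']:
    print(sm['carState'].vEgo)
```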
## Usage
For more information on available options and arguments, use the help command:
```bash
$ tools/replay/replay -h
Usage: tools/replay/replay [options] route
Mock openpilot components by publishing logged messages.

Options:
  -h, --help             Displays this help.
  -a, --allow <allow>    whitelist of services to send (comma-separated)
  -b, --block <block>    blacklist of services to send (comma-separated)
  -c, --cache <n>        cache <n> segments in memory. default is 5
  -s, --start <seconds>  start from <seconds>
  -x <speed>             playback <speed>, between 0.2 and 3
  --demo                 use a demo route instead of providing your own
  --data_dir <data_dir>  local directory with routes
  --prefix <prefix>      set OPENPILOT_PREFIX
  --dcam                 load driver camera
  --ecam                 load wide road camera
  --no-loop              stop at the end of the route
  --no-cache             turn off local cache
  --qcam                 load qcamera
  --no-hw-decoder        disable HW video decoding
  --no-vipc              do not output video
  --all                  output all messages, including uiDebug and userFlag;
                         this may cause issues when used along with the UI

Arguments:
  route                  the drive to replay. find your drives at
                         connect.comma.ai
```
## Visualize the Replay in the openpilot UI
To visualize the replay within the openpilot UI, run the following commands in separate terminals:
```bash
tools/replay/replay <route-name>
cd selfdrive/ui && ./ui
```
## Try Radar Point Visualization with Rerun
To visualize radar points, run `rp_visualization.py` in a second terminal while `tools/replay/replay` is active.
```bash
tools/replay/replay <route-name>
python3 tools/replay/rp_visualization.py
```
## Work with PlotJuggler
If you want to use replay with PlotJuggler, you can stream messages by running the following in separate terminals:
```bash
tools/replay/replay <route-name>
tools/plotjuggler/juggle.py --stream
```
## watch3
Watch all three cameras simultaneously from your comma three routes with watch3.
Simply replay a route using the `--dcam` and `--ecam` flags:
```bash
# start a replay
cd tools/replay && ./replay --demo --dcam --ecam
# then start watch3
cd selfdrive/ui && ./watch3
```
![](https://i.imgur.com/IeaOdAb.png)
## Stream CAN messages to your device
Replay CAN messages as they were recorded using a [panda jungle](https://comma.ai/shop/products/panda-jungle). The jungle has 6x OBD-C ports for connecting all your comma devices. Check out the [jungle repo](https://github.com/commaai/panda_jungle) for more info.
In order to run your device as if it were in a car:
* connect a panda jungle to your PC
* connect a comma device or panda to the jungle via OBD-C
* run `can_replay.py`
```bash
batman:replay$ ./can_replay.py -h
usage: can_replay.py [-h] [route_or_segment_name]

Replay CAN messages from a route to all connected pandas and jungles in a loop.

positional arguments:
  route_or_segment_name
                        The route or segment name to replay. If not
                        specified, a default public route will be used.
                        (default: None)

optional arguments:
  -h, --help            show this help message and exit
```
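As shown in the source below, `can_replay.py` also honors `PWR_ON`/`PWR_OFF` and `ON`/`OFF` environment variables (durations in seconds; both values of a pair must be set) to cycle panda power or ignition while replaying. The duty cycle is computed from the loop frame counter; a sketch of that logic:
```python
# Sketch of the ignition duty cycle in can_replay.py: with ON=5 and OFF=2,
# ignition is held on for 5 s, then off for 2 s, repeating.
DT_CTRL = 0.01  # 100 Hz loop period, as in openpilot.common.realtime

def ignition_at(frame: int, ign_on: float = 5.0, ign_off: float = 2.0) -> bool:
  return (frame * DT_CTRL) % (ign_on + ign_off) < ign_on
```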

tools/replay/__init__.py Normal file (empty)

tools/replay/api.h Normal file

@@ -0,0 +1,15 @@
#pragma once
#include <curl/curl.h>
#include <string>
#include "common/util.h"
#include "third_party/json11/json11.hpp"
namespace CommaApi2 {
const std::string BASE_URL = util::getenv("API_HOST", "https://api.commadotai.com").c_str();
std::string create_token(bool use_jwt, const json11::Json& payloads = {}, int expiry = 3600);
std::string httpGet(const std::string &url, long *response_code = nullptr);
} // namespace CommaApi2

tools/replay/camera.h Normal file

@@ -0,0 +1,43 @@
#pragma once
#include <memory>
#include <set>
#include <tuple>
#include <utility>
#include "msgq/visionipc/visionipc_server.h"
#include "common/queue.h"
#include "tools/replay/framereader.h"
#include "tools/replay/logreader.h"
std::tuple<size_t, size_t, size_t> get_nv12_info(int width, int height);
class CameraServer {
public:
CameraServer(std::pair<int, int> camera_size[MAX_CAMERAS] = nullptr);
~CameraServer();
void pushFrame(CameraType type, FrameReader* fr, const Event *event);
void waitForSent();
protected:
struct Camera {
CameraType type;
VisionStreamType stream_type;
int width;
int height;
std::thread thread;
SafeQueue<std::pair<FrameReader*, const Event *>> queue;
std::set<VisionBuf *> cached_buf;
};
void startVipcServer();
void cameraThread(Camera &cam);
VisionBuf *getFrame(Camera &cam, FrameReader *fr, int32_t segment_id, uint32_t frame_id);
Camera cameras_[MAX_CAMERAS] = {
{.type = RoadCam, .stream_type = VISION_STREAM_ROAD},
{.type = DriverCam, .stream_type = VISION_STREAM_DRIVER},
{.type = WideRoadCam, .stream_type = VISION_STREAM_WIDE_ROAD},
};
std::atomic<int> publishing_ = 0;
std::unique_ptr<VisionIpcServer> vipc_server_;
};

tools/replay/can_replay.py Executable file

@@ -0,0 +1,109 @@
#!/usr/bin/env python3
import argparse
import os
import time
import usb1
import threading
os.environ['FILEREADER_CACHE'] = '1'
from openpilot.common.realtime import config_realtime_process, Ratekeeper, DT_CTRL
from openpilot.selfdrive.pandad import can_capnp_to_list
from openpilot.tools.lib.logreader import LogReader
from panda import PandaJungle
# set both to cycle power or ignition
PWR_ON = int(os.getenv("PWR_ON", "0"))
PWR_OFF = int(os.getenv("PWR_OFF", "0"))
IGN_ON = int(os.getenv("ON", "0"))
IGN_OFF = int(os.getenv("OFF", "0"))
ENABLE_IGN = IGN_ON > 0 and IGN_OFF > 0
ENABLE_PWR = PWR_ON > 0 and PWR_OFF > 0
def send_thread(j: PandaJungle, flock):
if "FLASH" in os.environ:
with flock:
j.flash()
j.reset()
for i in [0, 1, 2, 3, 0xFFFF]:
j.can_clear(i)
j.set_can_speed_kbps(i, 500)
j.set_ignition(True)
j.set_panda_power(True)
j.set_can_loopback(False)
rk = Ratekeeper(1 / DT_CTRL, print_delay_threshold=None)
while True:
# handle cycling
if ENABLE_PWR:
i = (rk.frame*DT_CTRL) % (PWR_ON + PWR_OFF) < PWR_ON
j.set_panda_power(i)
if ENABLE_IGN:
i = (rk.frame*DT_CTRL) % (IGN_ON + IGN_OFF) < IGN_ON
j.set_ignition(i)
send = CAN_MSGS[rk.frame % len(CAN_MSGS)]
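    # only forward messages on the physical CAN buses (0-2)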
send = list(filter(lambda x: x[-1] <= 2, send))
try:
j.can_send_many(send)
except usb1.USBErrorTimeout:
# timeout is fine, just means the CAN TX buffer is full
pass
# Drain panda message buffer
j.can_recv()
rk.keep_time()
def connect():
config_realtime_process(3, 55)
serials = {}
flashing_lock = threading.Lock()
while True:
# look for new devices
for s in PandaJungle.list():
if s not in serials:
print("starting send thread for", s)
serials[s] = threading.Thread(target=send_thread, args=(PandaJungle(s), flashing_lock))
serials[s].start()
# try to join all send threads
cur_serials = serials.copy()
for s, t in cur_serials.items():
if t is not None:
t.join(0.01)
if not t.is_alive():
del serials[s]
time.sleep(1)
def load_route(route_or_segment_name):
print("Loading log...")
lr = LogReader(route_or_segment_name)
CP = lr.first("carParams")
print(f"carFingerprint: '{CP.carFingerprint}'")
mbytes = [m.as_builder().to_bytes() for m in lr if m.which() == 'can']
return [m[1] for m in can_capnp_to_list(mbytes)]
if __name__ == "__main__":
parser = argparse.ArgumentParser(description="Replay CAN messages from a route to all connected pandas and jungles in a loop.",
formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument("route_or_segment_name", nargs='?', help="The route or segment name to replay. If not specified, a default public route will be used.")
args = parser.parse_args()
if args.route_or_segment_name is None:
args.route_or_segment_name = "77611a1fac303767/2020-03-24--09-50-38/2:4"
CAN_MSGS = load_route(args.route_or_segment_name)
if ENABLE_PWR:
print(f"Cycling power: on for {PWR_ON}s, off for {PWR_OFF}s")
if ENABLE_IGN:
print(f"Cycling ignition: on for {IGN_ON}s, off for {IGN_OFF}s")
connect()

tools/replay/consoleui.h Normal file

@@ -0,0 +1,43 @@
#pragma once
#include <array>
#include <mutex>
#include <vector>
#include "tools/replay/replay.h"
#include <ncurses.h>
class ConsoleUI {
public:
ConsoleUI(Replay *replay);
~ConsoleUI();
int exec();
inline static const std::array speed_array = {0.2f, 0.5f, 1.0f, 2.0f, 3.0f};
private:
void initWindows();
void handleKey(char c);
void displayHelp();
void displayTimelineDesc();
void updateTimeline();
void updateSummary();
void updateStatus();
void pauseReplay(bool pause);
void updateSize();
void updateProgressBar();
void logMessage(ReplyMsgType type, const std::string &msg);
enum Status { Playing, Paused };
enum Win { Title, Stats, Log, LogBorder, DownloadBar, Timeline, TimelineDesc, Help, CarState, Max};
std::array<WINDOW*, Win::Max> w{};
SubMaster sm;
Replay *replay;
int max_width, max_height;
Status status = Status::Playing;
std::mutex mutex;
std::vector<std::pair<ReplyMsgType, std::string>> logs;
uint64_t progress_cur = 0;
uint64_t progress_total = 0;
bool download_success = false;
};

tools/replay/filereader.h Normal file

@@ -0,0 +1,20 @@
#pragma once
#include <atomic>
#include <string>
class FileReader {
public:
FileReader(bool cache_to_local, size_t chunk_size = 0, int retries = 3)
: cache_to_local_(cache_to_local), chunk_size_(chunk_size), max_retries_(retries) {}
virtual ~FileReader() {}
std::string read(const std::string &file, std::atomic<bool> *abort = nullptr);
private:
std::string download(const std::string &url, std::atomic<bool> *abort);
size_t chunk_size_;
int max_retries_;
bool cache_to_local_;
};
std::string cacheFilePath(const std::string &url);

tools/replay/framereader.h Normal file

@@ -0,0 +1,57 @@
#pragma once
#include <string>
#include <vector>
#include "msgq/visionipc/visionbuf.h"
#include "tools/replay/filereader.h"
#include "tools/replay/util.h"
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
}
class VideoDecoder;
class FrameReader {
public:
FrameReader();
~FrameReader();
bool load(CameraType type, const std::string &url, bool no_hw_decoder = false, std::atomic<bool> *abort = nullptr, bool local_cache = false,
int chunk_size = -1, int retries = 0);
bool loadFromFile(CameraType type, const std::string &file, bool no_hw_decoder = false, std::atomic<bool> *abort = nullptr);
bool get(int idx, VisionBuf *buf);
size_t getFrameCount() const { return packets_info.size(); }
int width = 0, height = 0;
VideoDecoder *decoder_ = nullptr;
AVFormatContext *input_ctx = nullptr;
int prev_idx = -1;
struct PacketInfo {
int flags;
int64_t pos;
};
std::vector<PacketInfo> packets_info;
};
class VideoDecoder {
public:
VideoDecoder();
~VideoDecoder();
bool open(AVCodecParameters *codecpar, bool hw_decoder);
bool decode(FrameReader *reader, int idx, VisionBuf *buf);
int width = 0, height = 0;
private:
bool initHardwareDecoder(AVHWDeviceType hw_device_type);
AVFrame *decodeFrame(AVPacket *pkt);
bool copyBuffer(AVFrame *f, VisionBuf *buf);
AVFrame *av_frame_, *hw_frame_;
AVCodecContext *decoder_ctx = nullptr;
AVPixelFormat hw_pix_fmt = AV_PIX_FMT_NONE;
AVBufferRef *hw_device_ctx = nullptr;
};

tools/replay/lib/rp_helpers.py Normal file

@@ -0,0 +1,109 @@
import numpy as np
from openpilot.selfdrive.controls.radard import RADAR_TO_CAMERA
# Color palette used for rerun AnnotationContext
rerunColorPalette = [(96, "red", (255, 0, 0)),
(100, "pink", (255, 36, 0)),
(124, "yellow", (255, 255, 0)),
(230, "vibrantpink", (255, 36, 170)),
(240, "orange", (255, 146, 0)),
(255, "white", (255, 255, 255)),
(110, "carColor", (255,0,127)),
(0, "background", (0, 0, 0))]
class UIParams:
lidar_x, lidar_y, lidar_zoom = 384, 960, 6
lidar_car_x, lidar_car_y = lidar_x / 2., lidar_y / 1.1
car_hwidth = 1.7272 / 2 * lidar_zoom
car_front = 2.6924 * lidar_zoom
car_back = 1.8796 * lidar_zoom
car_color = rerunColorPalette[6][0]
UP = UIParams
def to_topdown_pt(y, x):
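  # map a car-frame point (meters) onto top-down overlay pixel coordinates;
  # returns (-1, -1) when the point falls outside the overlay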
px, py = x * UP.lidar_zoom + UP.lidar_car_x, -y * UP.lidar_zoom + UP.lidar_car_y
if px > 0 and py > 0 and px < UP.lidar_x and py < UP.lidar_y:
return int(px), int(py)
return -1, -1
def draw_path(path, lid_overlay, lid_color=None):
x, y = np.asarray(path.x), np.asarray(path.y)
# draw lidar path point on lidar
if lid_color is not None and lid_overlay is not None:
for i in range(len(x)):
px, py = to_topdown_pt(x[i], y[i])
if px != -1:
lid_overlay[px, py] = lid_color
def plot_model(m, lid_overlay):
if lid_overlay is None:
return
for lead in m.leadsV3:
if lead.prob < 0.5:
continue
x, y = lead.x[0], lead.y[0]
x_std = lead.xStd[0]
x -= RADAR_TO_CAMERA
_, py_top = to_topdown_pt(x + x_std, y)
px, py_bottom = to_topdown_pt(x - x_std, y)
lid_overlay[int(round(px - 4)):int(round(px + 4)), py_top:py_bottom] = rerunColorPalette[2][0]
for path in m.laneLines:
draw_path(path, lid_overlay, rerunColorPalette[2][0])
for edge in m.roadEdges:
draw_path(edge, lid_overlay, rerunColorPalette[0][0])
draw_path(m.position, lid_overlay, rerunColorPalette[0][0])
def plot_lead(rs, lid_overlay):
for lead in [rs.leadOne, rs.leadTwo]:
if not lead.status:
continue
x = lead.dRel
px_left, py = to_topdown_pt(x, -10)
px_right, _ = to_topdown_pt(x, 10)
lid_overlay[px_left:px_right, py] = rerunColorPalette[0][0]
def update_radar_points(lt, lid_overlay):
ar_pts = []
if lt is not None:
ar_pts = {}
for track in lt:
ar_pts[track.trackId] = [track.dRel, track.yRel, track.vRel, track.aRel, track.oncoming, track.stationary]
for ids, pt in ar_pts.items():
# negative here since radar is left positive
px, py = to_topdown_pt(pt[0], -pt[1])
if px != -1:
if pt[-1]:
color = rerunColorPalette[4][0]
elif pt[-2]:
color = rerunColorPalette[3][0]
else:
color = rerunColorPalette[5][0]
if int(ids) == 1:
lid_overlay[px - 2:px + 2, py - 10:py + 10] = rerunColorPalette[1][0]
else:
lid_overlay[px - 2:px + 2, py - 2:py + 2] = color
def get_blank_lid_overlay(UP):
lid_overlay = np.zeros((UP.lidar_x, UP.lidar_y), 'uint8')
# Draw the car.
lid_overlay[int(round(UP.lidar_car_x - UP.car_hwidth)):int(
round(UP.lidar_car_x + UP.car_hwidth)), int(round(UP.lidar_car_y -
UP.car_front))] = UP.car_color
lid_overlay[int(round(UP.lidar_car_x - UP.car_hwidth)):int(
round(UP.lidar_car_x + UP.car_hwidth)), int(round(UP.lidar_car_y +
UP.car_back))] = UP.car_color
lid_overlay[int(round(UP.lidar_car_x - UP.car_hwidth)), int(
round(UP.lidar_car_y - UP.car_front)):int(round(
UP.lidar_car_y + UP.car_back))] = UP.car_color
lid_overlay[int(round(UP.lidar_car_x + UP.car_hwidth)), int(
round(UP.lidar_car_y - UP.car_front)):int(round(
UP.lidar_car_y + UP.car_back))] = UP.car_color
return lid_overlay

tools/replay/lib/ui_helpers.py Normal file

@@ -0,0 +1,226 @@
import itertools
from typing import Any
import matplotlib.pyplot as plt
import numpy as np
import pygame
from matplotlib.backends.backend_agg import FigureCanvasAgg
from openpilot.common.transformations.camera import get_view_frame_from_calib_frame
from openpilot.selfdrive.controls.radard import RADAR_TO_CAMERA
RED = (255, 0, 0)
GREEN = (0, 255, 0)
BLUE = (0, 0, 255)
YELLOW = (255, 255, 0)
BLACK = (0, 0, 0)
WHITE = (255, 255, 255)
class UIParams:
lidar_x, lidar_y, lidar_zoom = 384, 960, 6
lidar_car_x, lidar_car_y = lidar_x / 2., lidar_y / 1.1
car_hwidth = 1.7272 / 2 * lidar_zoom
car_front = 2.6924 * lidar_zoom
car_back = 1.8796 * lidar_zoom
car_color = 110
UP = UIParams
METER_WIDTH = 20
class Calibration:
def __init__(self, num_px, rpy, intrinsic, calib_scale):
self.intrinsic = intrinsic
self.extrinsics_matrix = get_view_frame_from_calib_frame(rpy[0], rpy[1], rpy[2], 0.0)[:,:3]
self.zoom = calib_scale
def car_space_to_ff(self, x, y, z):
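    # project car-frame points through the extrinsic and intrinsic matrices
    # to full-frame pixel coordinates (perspective divide at the end)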
car_space_projective = np.column_stack((x, y, z)).T
ep = self.extrinsics_matrix.dot(car_space_projective)
kep = self.intrinsic.dot(ep)
return (kep[:-1, :] / kep[-1, :]).T
def car_space_to_bb(self, x, y, z):
pts = self.car_space_to_ff(x, y, z)
return pts / self.zoom
_COLOR_CACHE : dict[tuple[int, int, int], Any] = {}
def find_color(lidar_surface, color):
if color in _COLOR_CACHE:
return _COLOR_CACHE[color]
tcolor = 0
ret = 255
for x in lidar_surface.get_palette():
if x[0:3] == color:
ret = tcolor
break
tcolor += 1
_COLOR_CACHE[color] = ret
return ret
def to_topdown_pt(y, x):
px, py = x * UP.lidar_zoom + UP.lidar_car_x, -y * UP.lidar_zoom + UP.lidar_car_y
if px > 0 and py > 0 and px < UP.lidar_x and py < UP.lidar_y:
return int(px), int(py)
return -1, -1
def draw_path(path, color, img, calibration, top_down, lid_color=None, z_off=0):
x, y, z = np.asarray(path.x), np.asarray(path.y), np.asarray(path.z) + z_off
pts = calibration.car_space_to_bb(x, y, z)
pts = np.round(pts).astype(int)
# draw lidar path point on lidar
# find color in 8 bit
if lid_color is not None and top_down is not None:
tcolor = find_color(top_down[0], lid_color)
for i in range(len(x)):
px, py = to_topdown_pt(x[i], y[i])
if px != -1:
top_down[1][px, py] = tcolor
height, width = img.shape[:2]
for x, y in pts:
if 1 < x < width - 1 and 1 < y < height - 1:
for a, b in itertools.permutations([-1, 0, -1], 2):
img[y + a, x + b] = color
def init_plots(arr, name_to_arr_idx, plot_xlims, plot_ylims, plot_names, plot_colors, plot_styles):
color_palette = { "r": (1, 0, 0),
"g": (0, 1, 0),
"b": (0, 0, 1),
"k": (0, 0, 0),
"y": (1, 1, 0),
"p": (0, 1, 1),
"m": (1, 0, 1)}
dpi = 90
fig = plt.figure(figsize=(575 / dpi, 600 / dpi), dpi=dpi)
canvas = FigureCanvasAgg(fig)
fig.set_facecolor((0.2, 0.2, 0.2))
axs = []
for pn in range(len(plot_ylims)):
ax = fig.add_subplot(len(plot_ylims), 1, len(axs)+1)
ax.set_xlim(plot_xlims[pn][0], plot_xlims[pn][1])
ax.set_ylim(plot_ylims[pn][0], plot_ylims[pn][1])
ax.patch.set_facecolor((0.4, 0.4, 0.4))
axs.append(ax)
plots, idxs, plot_select = [], [], []
for i, pl_list in enumerate(plot_names):
for j, item in enumerate(pl_list):
plot, = axs[i].plot(arr[:, name_to_arr_idx[item]],
label=item,
color=color_palette[plot_colors[i][j]],
linestyle=plot_styles[i][j])
plots.append(plot)
idxs.append(name_to_arr_idx[item])
plot_select.append(i)
axs[i].set_title(", ".join(f"{nm} ({cl})"
for (nm, cl) in zip(pl_list, plot_colors[i], strict=False)), fontsize=10)
axs[i].tick_params(axis="x", colors="white")
axs[i].tick_params(axis="y", colors="white")
axs[i].title.set_color("white")
if i < len(plot_ylims) - 1:
axs[i].set_xticks([])
canvas.draw()
def draw_plots(arr):
for ax in axs:
ax.draw_artist(ax.patch)
for i in range(len(plots)):
plots[i].set_ydata(arr[:, idxs[i]])
axs[plot_select[i]].draw_artist(plots[i])
raw_data = canvas.buffer_rgba()
plot_surface = pygame.image.frombuffer(raw_data, canvas.get_width_height(), "RGBA").convert()
return plot_surface
return draw_plots
def pygame_modules_have_loaded():
return pygame.display.get_init() and pygame.font.get_init()
def plot_model(m, img, calibration, top_down):
if calibration is None or top_down is None:
return
for lead in m.leadsV3:
if lead.prob < 0.5:
continue
x, y = lead.x[0], lead.y[0]
x_std = lead.xStd[0]
x -= RADAR_TO_CAMERA
_, py_top = to_topdown_pt(x + x_std, y)
px, py_bottom = to_topdown_pt(x - x_std, y)
top_down[1][int(round(px - 4)):int(round(px + 4)), py_top:py_bottom] = find_color(top_down[0], YELLOW)
for path, prob, _ in zip(m.laneLines, m.laneLineProbs, m.laneLineStds, strict=True):
color = (0, int(255 * prob), 0)
draw_path(path, color, img, calibration, top_down, YELLOW)
for edge, std in zip(m.roadEdges, m.roadEdgeStds, strict=True):
prob = max(1 - std, 0)
color = (int(255 * prob), 0, 0)
draw_path(edge, color, img, calibration, top_down, RED)
color = (255, 0, 0)
draw_path(m.position, color, img, calibration, top_down, RED, 1.22)
def plot_lead(rs, top_down):
for lead in [rs.leadOne, rs.leadTwo]:
if not lead.status:
continue
x = lead.dRel
px_left, py = to_topdown_pt(x, -10)
px_right, _ = to_topdown_pt(x, 10)
top_down[1][px_left:px_right, py] = find_color(top_down[0], RED)
def maybe_update_radar_points(lt, lid_overlay):
ar_pts = []
if lt is not None:
ar_pts = {}
for track in lt:
ar_pts[track.trackId] = [track.dRel, track.yRel, track.vRel, track.aRel]
for ids, pt in ar_pts.items():
# negative here since radar is left positive
px, py = to_topdown_pt(pt[0], -pt[1])
if px != -1:
color = 255
if int(ids) == 1:
lid_overlay[px - 2:px + 2, py - 10:py + 10] = 100
else:
lid_overlay[px - 2:px + 2, py - 2:py + 2] = color
def get_blank_lid_overlay(UP):
lid_overlay = np.zeros((UP.lidar_x, UP.lidar_y), 'uint8')
# Draw the car.
lid_overlay[int(round(UP.lidar_car_x - UP.car_hwidth)):int(
round(UP.lidar_car_x + UP.car_hwidth)), int(round(UP.lidar_car_y -
UP.car_front))] = UP.car_color
lid_overlay[int(round(UP.lidar_car_x - UP.car_hwidth)):int(
round(UP.lidar_car_x + UP.car_hwidth)), int(round(UP.lidar_car_y +
UP.car_back))] = UP.car_color
lid_overlay[int(round(UP.lidar_car_x - UP.car_hwidth)), int(
round(UP.lidar_car_y - UP.car_front)):int(round(
UP.lidar_car_y + UP.car_back))] = UP.car_color
lid_overlay[int(round(UP.lidar_car_x + UP.car_hwidth)), int(
round(UP.lidar_car_y - UP.car_front)):int(round(
UP.lidar_car_y + UP.car_back))] = UP.car_color
return lid_overlay

tools/replay/logreader.h Normal file

@@ -0,0 +1,42 @@
#pragma once
#include <string>
#include <vector>
#include "cereal/gen/cpp/log.capnp.h"
#include "tools/replay/util.h"
const CameraType ALL_CAMERAS[] = {RoadCam, DriverCam, WideRoadCam};
const int MAX_CAMERAS = std::size(ALL_CAMERAS);
class Event {
public:
Event(cereal::Event::Which which, uint64_t mono_time, const kj::ArrayPtr<const capnp::word> &data, int eidx_segnum = -1)
: which(which), mono_time(mono_time), data(data), eidx_segnum(eidx_segnum) {}
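  // Events sort by mono_time, with the event type as a tie-breaker.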
bool operator<(const Event &other) const {
return mono_time < other.mono_time || (mono_time == other.mono_time && which < other.which);
}
uint64_t mono_time;
cereal::Event::Which which;
kj::ArrayPtr<const capnp::word> data;
int32_t eidx_segnum;
};
class LogReader {
public:
LogReader(const std::vector<bool> &filters = {}) { filters_ = filters; }
bool load(const std::string &url, std::atomic<bool> *abort = nullptr,
bool local_cache = false, int chunk_size = -1, int retries = 0);
bool load(const char *data, size_t size, std::atomic<bool> *abort = nullptr);
std::vector<Event> events;
private:
void migrateOldEvents();
std::string raw_;
bool requires_migration = true;
std::vector<bool> filters_;
MonotonicBuffer buffer_{1024 * 1024};
};

tools/replay/replay.h Normal file

@@ -0,0 +1,110 @@
#pragma once
#include <algorithm>
#include <functional>
#include <memory>
#include <mutex>
#include <optional>
#include <string>
#include <vector>
#include "tools/replay/camera.h"
#include "tools/replay/seg_mgr.h"
#include "tools/replay/timeline.h"
#define DEMO_ROUTE "a2a0ccea32023010|2023-07-27--13-01-19"
enum REPLAY_FLAGS {
REPLAY_FLAG_NONE = 0x0000,
REPLAY_FLAG_DCAM = 0x0002,
REPLAY_FLAG_ECAM = 0x0004,
REPLAY_FLAG_NO_LOOP = 0x0010,
REPLAY_FLAG_NO_FILE_CACHE = 0x0020,
REPLAY_FLAG_QCAMERA = 0x0040,
REPLAY_FLAG_NO_HW_DECODER = 0x0100,
REPLAY_FLAG_NO_VIPC = 0x0400,
REPLAY_FLAG_ALL_SERVICES = 0x0800,
};
class Replay {
public:
Replay(const std::string &route, std::vector<std::string> allow, std::vector<std::string> block, SubMaster *sm = nullptr,
uint32_t flags = REPLAY_FLAG_NONE, const std::string &data_dir = "");
~Replay();
bool load();
RouteLoadError lastRouteError() const { return route().lastError(); }
void start(int seconds = 0) { seekTo(min_seconds_ + seconds, false); }
void pause(bool pause);
void seekToFlag(FindFlag flag);
void seekTo(double seconds, bool relative);
inline bool isPaused() const { return user_paused_; }
inline int segmentCacheLimit() const { return seg_mgr_->segment_cache_limit_; }
inline void setSegmentCacheLimit(int n) { seg_mgr_->segment_cache_limit_ = std::max(MIN_SEGMENTS_CACHE, n); }
inline bool hasFlag(REPLAY_FLAGS flag) const { return flags_ & flag; }
void setLoop(bool loop) { loop ? flags_ &= ~REPLAY_FLAG_NO_LOOP : flags_ |= REPLAY_FLAG_NO_LOOP; }
bool loop() const { return !(flags_ & REPLAY_FLAG_NO_LOOP); }
const Route &route() const { return seg_mgr_->route_; }
inline double currentSeconds() const { return double(cur_mono_time_ - route_start_ts_) / 1e9; }
inline std::time_t routeDateTime() const { return route_date_time_; }
inline uint64_t routeStartNanos() const { return route_start_ts_; }
inline double toSeconds(uint64_t mono_time) const { return (mono_time - route_start_ts_) / 1e9; }
inline double minSeconds() const { return min_seconds_; }
inline double maxSeconds() const { return max_seconds_; }
inline void setSpeed(float speed) { speed_ = speed; }
inline float getSpeed() const { return speed_; }
inline const std::string &carFingerprint() const { return car_fingerprint_; }
inline const std::shared_ptr<std::vector<Timeline::Entry>> getTimeline() const { return timeline_.getEntries(); }
inline const std::optional<Timeline::Entry> findAlertAtTime(double sec) const { return timeline_.findAlertAtTime(sec); }
const std::shared_ptr<SegmentManager::EventData> getEventData() const { return seg_mgr_->getEventData(); }
void installEventFilter(std::function<bool(const Event *)> filter) { event_filter_ = filter; }
// Event callback functions
std::function<void()> onSegmentsMerged = nullptr;
std::function<void(double)> onSeeking = nullptr;
std::function<void(double)> onSeekedTo = nullptr;
std::function<void(std::shared_ptr<LogReader>)> onQLogLoaded = nullptr;
private:
void setupServices(const std::vector<std::string> &allow, const std::vector<std::string> &block);
void setupSegmentManager(bool has_filters);
void startStream(const std::shared_ptr<Segment> segment);
void streamThread();
void handleSegmentMerge();
void interruptStream(const std::function<bool()>& update_fn);
std::vector<Event>::const_iterator publishEvents(std::vector<Event>::const_iterator first,
std::vector<Event>::const_iterator last);
void publishMessage(const Event *e);
void publishFrame(const Event *e);
void checkSeekProgress();
std::unique_ptr<SegmentManager> seg_mgr_;
Timeline timeline_;
pthread_t stream_thread_id = 0;
std::thread stream_thread_;
std::mutex stream_lock_;
bool user_paused_ = false;
std::condition_variable stream_cv_;
std::atomic<int> current_segment_ = 0;
std::atomic<double> seeking_to_ = -1.0;
std::atomic<bool> exit_ = false;
std::atomic<bool> interrupt_requested_ = false;
bool events_ready_ = false;
std::time_t route_date_time_;
uint64_t route_start_ts_ = 0;
std::atomic<uint64_t> cur_mono_time_ = 0;
cereal::Event::Which cur_which_ = cereal::Event::Which::INIT_DATA;
double min_seconds_ = 0;
double max_seconds_ = 0;
SubMaster *sm_ = nullptr;
std::unique_ptr<PubMaster> pm_;
std::vector<const char*> sockets_;
std::unique_ptr<CameraServer> camera_server_;
std::atomic<uint32_t> flags_ = REPLAY_FLAG_NONE;
std::string car_fingerprint_;
std::atomic<float> speed_ = 1.0;
std::function<bool(const Event *)> event_filter_ = nullptr;
std::shared_ptr<SegmentManager::EventData> event_data_ = std::make_shared<SegmentManager::EventData>();
};

tools/replay/route.h Normal file

@@ -0,0 +1,90 @@
#pragma once
#include <ctime>
#include <map>
#include <memory>
#include <mutex>
#include <string>
#include <thread>
#include <vector>
#include "tools/replay/framereader.h"
#include "tools/replay/logreader.h"
#include "tools/replay/util.h"
enum class RouteLoadError {
None,
Unauthorized,
AccessDenied,
NetworkError,
FileNotFound,
UnknownError
};
struct RouteIdentifier {
std::string dongle_id;
std::string timestamp;
int begin_segment = 0;
int end_segment = -1;
std::string str;
};
struct SegmentFile {
std::string rlog;
std::string qlog;
std::string road_cam;
std::string driver_cam;
std::string wide_road_cam;
std::string qcamera;
};
class Route {
public:
Route(const std::string &route, const std::string &data_dir = {});
bool load();
RouteLoadError lastError() const { return err_; }
inline const std::string &name() const { return route_.str; }
inline const std::time_t datetime() const { return date_time_; }
inline const std::string &dir() const { return data_dir_; }
inline const RouteIdentifier &identifier() const { return route_; }
inline const std::map<int, SegmentFile> &segments() const { return segments_; }
inline const SegmentFile &at(int n) { return segments_.at(n); }
static RouteIdentifier parseRoute(const std::string &str);
protected:
bool loadFromLocal();
bool loadFromServer(int retries = 3);
bool loadFromJson(const std::string &json);
void addFileToSegment(int seg_num, const std::string &file);
RouteIdentifier route_ = {};
std::string data_dir_;
std::map<int, SegmentFile> segments_;
std::time_t date_time_;
RouteLoadError err_ = RouteLoadError::None;
};
class Segment {
public:
enum class LoadState {Loading, Loaded, Failed};
Segment(int n, const SegmentFile &files, uint32_t flags, const std::vector<bool> &filters,
std::function<void(int, bool)> callback);
~Segment();
LoadState getState();
const int seg_num = 0;
std::unique_ptr<LogReader> log;
std::unique_ptr<FrameReader> frames[MAX_CAMERAS] = {};
protected:
void loadFile(int id, const std::string file);
std::atomic<bool> abort_ = false;
std::atomic<int> loading_ = 0;
std::mutex mutex_;
std::vector<std::thread> threads_;
std::function<void(int, bool)> on_load_finished_ = nullptr;
uint32_t flags;
std::vector<bool> filters_;
LoadState load_state_ = LoadState::Loading;
};

tools/replay/rp_visualization.py Executable file

@@ -0,0 +1,60 @@
#!/usr/bin/env python3
import argparse
import os
import sys
import numpy as np
import rerun as rr
import cereal.messaging as messaging
from openpilot.common.basedir import BASEDIR
from openpilot.tools.replay.lib.rp_helpers import (UP, rerunColorPalette,
get_blank_lid_overlay,
update_radar_points, plot_lead,
plot_model)
from msgq.visionipc import VisionIpcClient, VisionStreamType
os.environ['BASEDIR'] = BASEDIR
UP.lidar_zoom = 6
def visualize(addr):
sm = messaging.SubMaster(['radarState', 'liveTracks', 'modelV2'], addr=addr)
vipc_client = VisionIpcClient("camerad", VisionStreamType.VISION_STREAM_ROAD, True)
while True:
if not vipc_client.is_connected():
vipc_client.connect(True)
new_data = vipc_client.recv()
if new_data is None or not new_data.data.any():
continue
sm.update(0)
lid_overlay = get_blank_lid_overlay(UP)
if sm.recv_frame['modelV2']:
plot_model(sm['modelV2'], lid_overlay)
if sm.recv_frame['radarState']:
plot_lead(sm['radarState'], lid_overlay)
liveTracksTime = sm.logMonoTime['liveTracks']
if sm.updated['liveTracks']:
update_radar_points(sm['liveTracks'], lid_overlay)
rr.set_time_nanos("TIMELINE", liveTracksTime)
rr.log("tracks", rr.SegmentationImage(np.flip(np.rot90(lid_overlay, k=-1), axis=1)))
def get_arg_parser():
parser = argparse.ArgumentParser(
description="Show replay data in a UI.",
formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument("ip_address", nargs="?", default="127.0.0.1",
help="The ip address on which to receive zmq messages.")
parser.add_argument("--frame-address", default=None,
help="The frame address (fully qualified ZMQ endpoint for frames) on which to receive zmq messages.")
return parser
if __name__ == "__main__":
args = get_arg_parser().parse_args(sys.argv[1:])
if args.ip_address != "127.0.0.1":
os.environ["ZMQ"] = "1"
messaging.reset_context()
rr.init("RadarPoints", spawn=True)
rr.log("tracks", rr.AnnotationContext(rerunColorPalette), static=True)
visualize(args.ip_address)

tools/replay/seg_mgr.h Normal file

@@ -0,0 +1,56 @@
#pragma once
#include <condition_variable>
#include <map>
#include <mutex>
#include <set>
#include <vector>
#include "tools/replay/route.h"
constexpr int MIN_SEGMENTS_CACHE = 5;
using SegmentMap = std::map<int, std::shared_ptr<Segment>>;
class SegmentManager {
public:
struct EventData {
std::vector<Event> events; // Events extracted from the segments
SegmentMap segments; // Associated segments that contributed to these events
bool isSegmentLoaded(int n) const { return segments.find(n) != segments.end(); }
};
SegmentManager(const std::string &route_name, uint32_t flags, const std::string &data_dir = "")
: flags_(flags), route_(route_name, data_dir), event_data_(std::make_shared<EventData>()) {}
~SegmentManager();
bool load();
void setCurrentSegment(int seg_num);
void setCallback(const std::function<void()> &callback) { onSegmentMergedCallback_ = callback; }
void setFilters(const std::vector<bool> &filters) { filters_ = filters; }
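  // Readers take an atomic snapshot of the current event data; segment merges
  // swap in a fresh EventData rather than mutating it in place.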
const std::shared_ptr<EventData> getEventData() const { return std::atomic_load(&event_data_); }
bool hasSegment(int n) const { return segments_.find(n) != segments_.end(); }
Route route_;
int segment_cache_limit_ = MIN_SEGMENTS_CACHE;
private:
void manageSegmentCache();
void loadSegmentsInRange(SegmentMap::iterator begin, SegmentMap::iterator cur, SegmentMap::iterator end);
bool mergeSegments(const SegmentMap::iterator &begin, const SegmentMap::iterator &end);
std::vector<bool> filters_;
uint32_t flags_;
std::mutex mutex_;
std::condition_variable cv_;
std::thread thread_;
int cur_seg_num_ = -1;
bool needs_update_ = false;
bool exit_ = false;
SegmentMap segments_;
std::shared_ptr<EventData> event_data_;
std::function<void()> onSegmentMergedCallback_ = nullptr;
std::set<int> merged_segments_;
};

tools/replay/timeline.h Normal file

@@ -0,0 +1,46 @@
#pragma once
#include <atomic>
#include <optional>
#include <thread>
#include <vector>
#include "tools/replay/route.h"
enum class TimelineType { None, Engaged, AlertInfo, AlertWarning, AlertCritical, UserFlag };
enum class FindFlag { nextEngagement, nextDisEngagement, nextUserFlag, nextInfo, nextWarning, nextCritical };
class Timeline {
public:
struct Entry {
double start_time;
double end_time;
TimelineType type;
std::string text1;
std::string text2;
};
Timeline() : timeline_entries_(std::make_shared<std::vector<Entry>>()) {}
~Timeline();
void initialize(const Route &route, uint64_t route_start_ts, bool local_cache,
std::function<void(std::shared_ptr<LogReader>)> callback);
std::optional<uint64_t> find(double cur_ts, FindFlag flag) const;
std::optional<Entry> findAlertAtTime(double target_time) const;
const std::shared_ptr<std::vector<Entry>> getEntries() const { return std::atomic_load(&timeline_entries_); }
private:
void buildTimeline(const Route &route, uint64_t route_start_ts, bool local_cache,
std::function<void(std::shared_ptr<LogReader>)> callback);
void updateEngagementStatus(const cereal::SelfdriveState::Reader &cs, std::optional<size_t> &idx, double seconds);
void updateAlertStatus(const cereal::SelfdriveState::Reader &cs, std::optional<size_t> &idx, double seconds);
std::thread thread_;
std::atomic<bool> should_exit_ = false;
// Temporarily holds entries before they are sorted and finalized
std::vector<Entry> staging_entries_;
// Final sorted timeline entries
std::shared_ptr<std::vector<Entry>> timeline_entries_;
};

tools/replay/ui.py Executable file

@@ -0,0 +1,237 @@
#!/usr/bin/env python3
import argparse
import os
import sys
import cv2
import numpy as np
import pygame
import cereal.messaging as messaging
from openpilot.common.basedir import BASEDIR
from openpilot.common.transformations.camera import DEVICE_CAMERAS
from openpilot.tools.replay.lib.ui_helpers import (UP,
BLACK, GREEN,
YELLOW, Calibration,
get_blank_lid_overlay, init_plots,
maybe_update_radar_points, plot_lead,
plot_model,
pygame_modules_have_loaded)
from msgq.visionipc import VisionIpcClient, VisionStreamType
os.environ['BASEDIR'] = BASEDIR
ANGLE_SCALE = 5.0
def ui_thread(addr):
cv2.setNumThreads(1)
pygame.init()
pygame.font.init()
assert pygame_modules_have_loaded()
disp_info = pygame.display.Info()
max_height = disp_info.current_h
hor_mode = os.getenv("HORIZONTAL") is not None
hor_mode = True if max_height < 960+300 else hor_mode
if hor_mode:
size = (640+384+640, 960)
write_x = 5
write_y = 680
else:
size = (640+384, 960+300)
write_x = 645
write_y = 970
pygame.display.set_caption("openpilot debug UI")
screen = pygame.display.set_mode(size, pygame.DOUBLEBUF)
alert1_font = pygame.font.SysFont("arial", 30)
alert2_font = pygame.font.SysFont("arial", 20)
info_font = pygame.font.SysFont("arial", 15)
camera_surface = pygame.surface.Surface((640, 480), 0, 24).convert()
top_down_surface = pygame.surface.Surface((UP.lidar_x, UP.lidar_y), 0, 8)
sm = messaging.SubMaster(['carState', 'longitudinalPlan', 'carControl', 'radarState', 'liveCalibration', 'controlsState',
'selfdriveState', 'liveTracks', 'modelV2', 'liveParameters', 'roadCameraState'], addr=addr)
img = np.zeros((480, 640, 3), dtype='uint8')
imgff = None
num_px = 0
calibration = None
lid_overlay_blank = get_blank_lid_overlay(UP)
# plots
name_to_arr_idx = { "gas": 0,
"computer_gas": 1,
"user_brake": 2,
"computer_brake": 3,
"v_ego": 4,
"v_pid": 5,
"angle_steers_des": 6,
"angle_steers": 7,
"angle_steers_k": 8,
"steer_torque": 9,
"v_override": 10,
"v_cruise": 11,
"a_ego": 12,
"a_target": 13}
plot_arr = np.zeros((100, len(name_to_arr_idx.values())))
plot_xlims = [(0, plot_arr.shape[0]), (0, plot_arr.shape[0]), (0, plot_arr.shape[0]), (0, plot_arr.shape[0])]
plot_ylims = [(-0.1, 1.1), (-ANGLE_SCALE, ANGLE_SCALE), (0., 75.), (-3.0, 2.0)]
plot_names = [["gas", "computer_gas", "user_brake", "computer_brake"],
["angle_steers", "angle_steers_des", "angle_steers_k", "steer_torque"],
["v_ego", "v_override", "v_pid", "v_cruise"],
["a_ego", "a_target"]]
plot_colors = [["b", "b", "g", "r", "y"],
["b", "g", "y", "r"],
["b", "g", "r", "y"],
["b", "r"]]
plot_styles = [["-", "-", "-", "-", "-"],
["-", "-", "-", "-"],
["-", "-", "-", "-"],
["-", "-"]]
draw_plots = init_plots(plot_arr, name_to_arr_idx, plot_xlims, plot_ylims, plot_names, plot_colors, plot_styles)
vipc_client = VisionIpcClient("camerad", VisionStreamType.VISION_STREAM_ROAD, True)
while True:
for event in pygame.event.get():
if event.type == pygame.QUIT:
pygame.quit()
sys.exit()
screen.fill((64, 64, 64))
lid_overlay = lid_overlay_blank.copy()
top_down = top_down_surface, lid_overlay
# ***** frame *****
if not vipc_client.is_connected():
vipc_client.connect(True)
yuv_img_raw = vipc_client.recv()
if yuv_img_raw is None or not yuv_img_raw.data.any():
continue
sm.update(0)
camera = DEVICE_CAMERAS[("tici", str(sm['roadCameraState'].sensor))]
imgff = np.frombuffer(yuv_img_raw.data, dtype=np.uint8).reshape((len(yuv_img_raw.data) // vipc_client.stride, vipc_client.stride))
num_px = vipc_client.width * vipc_client.height
rgb = cv2.cvtColor(imgff[:vipc_client.height * 3 // 2, :vipc_client.width], cv2.COLOR_YUV2RGB_NV12)
qcam = "QCAM" in os.environ
bb_scale = (528 if qcam else camera.fcam.width) / 640.
calib_scale = camera.fcam.width / 640.
zoom_matrix = np.asarray([
[bb_scale, 0., 0.],
[0., bb_scale, 0.],
[0., 0., 1.]])
cv2.warpAffine(rgb, zoom_matrix[:2], (img.shape[1], img.shape[0]), dst=img, flags=cv2.WARP_INVERSE_MAP)
intrinsic_matrix = camera.fcam.intrinsics
w = sm['controlsState'].lateralControlState.which()
if w == 'lqrStateDEPRECATED':
angle_steers_k = sm['controlsState'].lateralControlState.lqrStateDEPRECATED.steeringAngleDeg
elif w == 'indiState':
angle_steers_k = sm['controlsState'].lateralControlState.indiState.steeringAngleDeg
else:
angle_steers_k = np.inf
plot_arr[:-1] = plot_arr[1:]
plot_arr[-1, name_to_arr_idx['angle_steers']] = sm['carState'].steeringAngleDeg
plot_arr[-1, name_to_arr_idx['angle_steers_des']] = sm['carControl'].actuators.steeringAngleDeg
plot_arr[-1, name_to_arr_idx['angle_steers_k']] = angle_steers_k
plot_arr[-1, name_to_arr_idx['gas']] = sm['carState'].gas
# TODO gas is deprecated
plot_arr[-1, name_to_arr_idx['computer_gas']] = np.clip(sm['carControl'].actuators.accel/4.0, 0.0, 1.0)
plot_arr[-1, name_to_arr_idx['user_brake']] = sm['carState'].brake
plot_arr[-1, name_to_arr_idx['steer_torque']] = sm['carControl'].actuators.torque * ANGLE_SCALE
# TODO brake is deprecated
plot_arr[-1, name_to_arr_idx['computer_brake']] = np.clip(-sm['carControl'].actuators.accel/4.0, 0.0, 1.0)
plot_arr[-1, name_to_arr_idx['v_ego']] = sm['carState'].vEgo
plot_arr[-1, name_to_arr_idx['v_cruise']] = sm['carState'].cruiseState.speed
plot_arr[-1, name_to_arr_idx['a_ego']] = sm['carState'].aEgo
if len(sm['longitudinalPlan'].accels):
plot_arr[-1, name_to_arr_idx['a_target']] = sm['longitudinalPlan'].accels[0]
if sm.recv_frame['modelV2']:
plot_model(sm['modelV2'], img, calibration, top_down)
if sm.recv_frame['radarState']:
plot_lead(sm['radarState'], top_down)
# draw all radar points
maybe_update_radar_points(sm['liveTracks'].points, top_down[1])
if sm.updated['liveCalibration'] and num_px:
rpyCalib = np.asarray(sm['liveCalibration'].rpyCalib)
calibration = Calibration(num_px, rpyCalib, intrinsic_matrix, calib_scale)
# *** blits ***
pygame.surfarray.blit_array(camera_surface, img.swapaxes(0, 1))
screen.blit(camera_surface, (0, 0))
# display alerts
alert_line1 = alert1_font.render(sm['selfdriveState'].alertText1, True, (255, 0, 0))
alert_line2 = alert2_font.render(sm['selfdriveState'].alertText2, True, (255, 0, 0))
screen.blit(alert_line1, (180, 150))
screen.blit(alert_line2, (180, 190))
if hor_mode:
screen.blit(draw_plots(plot_arr), (640+384, 0))
else:
screen.blit(draw_plots(plot_arr), (0, 600))
pygame.surfarray.blit_array(*top_down)
screen.blit(top_down[0], (640, 0))
SPACING = 25
lines = [
info_font.render("ENABLED", True, GREEN if sm['selfdriveState'].enabled else BLACK),
info_font.render("SPEED: " + str(round(sm['carState'].vEgo, 1)) + " m/s", True, YELLOW),
info_font.render("LONG CONTROL STATE: " + str(sm['controlsState'].longControlState), True, YELLOW),
info_font.render("LONG MPC SOURCE: " + str(sm['longitudinalPlan'].longitudinalPlanSource), True, YELLOW),
None,
info_font.render("ANGLE OFFSET (AVG): " + str(round(sm['liveParameters'].angleOffsetAverageDeg, 2)) + " deg", True, YELLOW),
info_font.render("ANGLE OFFSET (INSTANT): " + str(round(sm['liveParameters'].angleOffsetDeg, 2)) + " deg", True, YELLOW),
info_font.render("STIFFNESS: " + str(round(sm['liveParameters'].stiffnessFactor * 100., 2)) + " %", True, YELLOW),
info_font.render("STEER RATIO: " + str(round(sm['liveParameters'].steerRatio, 2)), True, YELLOW)
]
for i, line in enumerate(lines):
if line is not None:
screen.blit(line, (write_x, write_y + i * SPACING))
# this takes time...vsync or something
pygame.display.flip()
def get_arg_parser():
parser = argparse.ArgumentParser(
description="Show replay data in a UI.",
formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument("ip_address", nargs="?", default="127.0.0.1",
help="The ip address on which to receive zmq messages.")
parser.add_argument("--frame-address", default=None,
help="The frame address (fully qualified ZMQ endpoint for frames) on which to receive zmq messages.")
return parser
if __name__ == "__main__":
args = get_arg_parser().parse_args(sys.argv[1:])
if args.ip_address != "127.0.0.1":
os.environ["ZMQ"] = "1"
messaging.reset_context()
ui_thread(args.ip_address)

tools/replay/unlog_ci_segment.py Executable file

@@ -0,0 +1,108 @@
#!/usr/bin/env python3
import argparse
import bisect
import select
import sys
import termios
import time
import tty
from collections import defaultdict
import cereal.messaging as messaging
from openpilot.tools.lib.framereader import FrameReader
from openpilot.tools.lib.logreader import LogReader
from openpilot.tools.lib.openpilotci import get_url
IGNORE = ['initData', 'sentinel']
def input_ready():
return select.select([sys.stdin], [], [], 0) == ([sys.stdin], [], [])
def replay(route, segment, loop):
route = route.replace('|', '/')
lr = LogReader(get_url(route, segment, "rlog.bz2"))
fr = FrameReader(get_url(route, segment, "fcamera.hevc"), readahead=True)
# Build mapping from frameId to segmentId from roadEncodeIdx, type == fullHEVC
msgs = [m for m in lr if m.which() not in IGNORE]
msgs = sorted(msgs, key=lambda m: m.logMonoTime)
times = [m.logMonoTime for m in msgs]
frame_idx = {m.roadEncodeIdx.frameId: m.roadEncodeIdx.segmentId for m in msgs if m.which() == 'roadEncodeIdx' and m.roadEncodeIdx.type == 'fullHEVC'}
socks = {}
lag = 0.0
i = 0
max_i = len(msgs) - 2
while True:
msg = msgs[i].as_builder()
next_msg = msgs[i + 1]
start_time = time.time()
w = msg.which()
if w == 'roadCameraState':
try:
img = fr.get(frame_idx[msg.roadCameraState.frameId], pix_fmt="rgb24")
img = img[0][:, :, ::-1] # Convert RGB to BGR, which is what the camera outputs
msg.roadCameraState.image = img.flatten().tobytes()
except (KeyError, ValueError):
pass
if w not in socks:
socks[w] = messaging.pub_sock(w)
try:
if socks[w]:
socks[w].send(msg.to_bytes())
except messaging.messaging_pyx.MultiplePublishersError:
socks[w] = None
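    # pace playback: accumulate the logged inter-message gap, subtract the
    # time spent processing, and sleep off the remainder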
lag += (next_msg.logMonoTime - msg.logMonoTime) / 1e9
lag -= time.time() - start_time
dt = max(lag, 0.0)
lag -= dt
time.sleep(dt)
if lag < -1.0 and i % 1000 == 0:
print(f"{-lag:.2f} s behind")
if input_ready():
key = sys.stdin.read(1)
# Handle pause
if key == " ":
while True:
if input_ready() and sys.stdin.read(1) == " ":
break
time.sleep(0.01)
# Handle seek
dt = defaultdict(int, s=10, S=-10)[key]
new_time = msgs[i].logMonoTime + dt * 1e9
i = bisect.bisect_left(times, new_time)
i = (i + 1) % max_i if loop else min(i + 1, max_i)
if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument("--loop", action='store_true')
parser.add_argument("route")
parser.add_argument("segment")
args = parser.parse_args()
orig_settings = termios.tcgetattr(sys.stdin)
tty.setcbreak(sys.stdin)
try:
replay(args.route, args.segment, args.loop)
termios.tcsetattr(sys.stdin, termios.TCSADRAIN, orig_settings)
except Exception:
termios.tcsetattr(sys.stdin, termios.TCSADRAIN, orig_settings)
raise

tools/replay/util.h Normal file

@@ -0,0 +1,74 @@
#pragma once
#include <atomic>
#include <deque>
#include <functional>
#include <sstream>
#include <string>
#include <string_view>
#include <vector>
#include "cereal/messaging/messaging.h"
enum CameraType {
RoadCam = 0,
DriverCam,
WideRoadCam
};
enum class ReplyMsgType {
Info,
Debug,
Warning,
Critical
};
typedef std::function<void(ReplyMsgType type, const std::string msg)> ReplayMessageHandler;
void installMessageHandler(ReplayMessageHandler);
void logMessage(ReplyMsgType type, const char* fmt, ...);
#define rInfo(fmt, ...) ::logMessage(ReplyMsgType::Info, fmt, ## __VA_ARGS__)
#define rDebug(fmt, ...) ::logMessage(ReplyMsgType::Debug, fmt, ## __VA_ARGS__)
#define rWarning(fmt, ...) ::logMessage(ReplyMsgType::Warning, fmt, ## __VA_ARGS__)
#define rError(fmt, ...) ::logMessage(ReplyMsgType::Critical, fmt, ## __VA_ARGS__)
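// Bump-pointer arena used for decoded log events: deallocate() is a no-op;
// all memory is released when the buffer is destroyed.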
class MonotonicBuffer {
public:
MonotonicBuffer(size_t initial_size) : next_buffer_size(initial_size) {}
~MonotonicBuffer();
void *allocate(size_t bytes, size_t alignment = 16ul);
void deallocate(void *p) {}
private:
void *current_buf = nullptr;
size_t next_buffer_size = 0;
size_t available = 0;
std::deque<void *> buffers;
static constexpr float growth_factor = 1.5;
};
std::string sha256(const std::string &str);
void precise_nano_sleep(int64_t nanoseconds, std::atomic<bool> &interrupt_requested);
std::string decompressBZ2(const std::string &in, std::atomic<bool> *abort = nullptr);
std::string decompressBZ2(const std::byte *in, size_t in_size, std::atomic<bool> *abort = nullptr);
std::string decompressZST(const std::string &in, std::atomic<bool> *abort = nullptr);
std::string decompressZST(const std::byte *in, size_t in_size, std::atomic<bool> *abort = nullptr);
std::string getUrlWithoutQuery(const std::string &url);
size_t getRemoteFileSize(const std::string &url, std::atomic<bool> *abort = nullptr);
std::string httpGet(const std::string &url, size_t chunk_size = 0, std::atomic<bool> *abort = nullptr);
typedef std::function<void(uint64_t cur, uint64_t total, bool success)> DownloadProgressHandler;
void installDownloadProgressHandler(DownloadProgressHandler);
bool httpDownload(const std::string &url, const std::string &file, size_t chunk_size = 0, std::atomic<bool> *abort = nullptr);
std::string formattedDataSize(size_t size);
std::string extractFileName(const std::string& file);
std::vector<std::string> split(std::string_view source, char delimiter);
template <typename Iterable>
std::string join(const Iterable& elements, const std::string& separator) {
std::ostringstream oss;
for (auto it = elements.begin(); it != elements.end(); ++it) {
if (it != elements.begin()) oss << separator;
oss << *it;
}
return oss.str();
}