#!/bin/sh
# default values for arguments
ssh_host="root@10.11.99.1" # remarkable connected through USB
landscape=true # rotate 90 degrees to the right
output_path=- # display output through ffplay
format=- # automatic output format
webcam=false # not to a webcam
measure_throughput=false # measure how fast data is being transferred
window_title=reStream # stream window title is reStream
# loop through arguments and process them
while [ $# -gt 0 ]; do
case "$1" in
-p | --portrait)
landscape=false
shift
;;
-s | --source)
ssh_host="$2"
shift
shift
;;
-o | --output)
output_path="$2"
shift
shift
;;
-f | --format)
format="$2"
shift
shift
;;
-m | --measure)
measure_throughput=true
shift
;;
-w | --webcam)
webcam=true
format="v4l2"
# check if there is a modprobed v4l2 loopback device
# use the first cam as default if there is no output_path already
cam_path=$(v4l2-ctl --list-devices \
| sed -n '/^[^\s]\+platform:v4l2loopback/{n;s/\s*//g;p;q}')
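# the sed one-liner above keys off v4l2-ctl's device listing; a hypothetical
# v4l2loopback entry looks like:
#   Dummy video device (0x0000) (platform:v4l2loopback-000):
#           /dev/video2
# i.e. we take the line after the platform:v4l2loopback header, strip its
# whitespace, print it, and quit after the first match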
# fail if there is no such device
if [ -e "$cam_path" ]; then
if [ "$output_path" = "-" ]; then
output_path="$cam_path"
fi
else
echo "Could not find a video loopback device, did you"
echo "sudo modprobe v4l2loopback"
exit 1
fi
shift
;;
-t | --title)
window_title="$2"
shift
shift
;;
-h | --help | *)
echo "Usage: $0 [-p] [-s <source>] [-o <output>] [-f <format>] [-t <title>] [-m] [-w]"
echo "Examples:"
echo " $0 # live view in landscape"
echo " $0 -p # live view in portrait"
echo " $0 -s 192.168.0.10 # connect to different IP"
echo " $0 -o remarkable.mp4 # record to a file"
echo " $0 -o udp://dest:1234 -f mpegts # record to a stream"
echo " $0 -w # write to a webcam (yuv420p + resize)"
exit 1
;;
esac
done
ssh_cmd() {
echo "[SSH]" "$@" >&2
ssh -o ConnectTimeout=1 "$ssh_host" "$@"
}
# check if we are able to reach the remarkable
if ! ssh_cmd true; then
echo "$ssh_host unreachable"
exit 1
fi
rm_version="$(ssh_cmd cat /sys/devices/soc0/machine)"
case "$rm_version" in
"reMarkable 1.0")
width=1408
height=1872
bytes_per_pixel=2
pixel_format="rgb565le"
# calculate how many bytes the window is
window_bytes="$((width * height * bytes_per_pixel))"
# read the first $window_bytes of the framebuffer
head_fb0="dd if=/dev/fb0 count=1 bs=$window_bytes 2>/dev/null"
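# worked example of the sizes above: 1408 * 1872 * 2 = 5271552 bytes,
# so a single dd read with bs=5271552 grabs exactly one rgb565le frame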
;;
"reMarkable 2.0")
pixel_format="gray8"
width=1872
height=1404
bytes_per_pixel=1
# calculate how many bytes the window is
window_bytes="$((width * height * bytes_per_pixel))"
# find xochitl's process
pid="$(ssh_cmd pidof xochitl)"
echo "xochitl's PID: $pid"
# find framebuffer location in memory
# it is actually the map allocated _after_ the fb0 mmap
read_address="grep -C1 '/dev/fb0' /proc/$pid/maps | tail -n1 | sed 's/-.*$//'"
skip_bytes_hex="$(ssh_cmd "$read_address")"
skip_bytes="$((0x$skip_bytes_hex + 8))"
echo "framebuffer is at 0x$skip_bytes_hex"
# carve the framebuffer out of the process memory
page_size=4096
window_start_blocks="$((skip_bytes / page_size))"
window_offset="$((skip_bytes % page_size))"
window_length_blocks="$((window_bytes / page_size + 1))"
# Using dd with bs=1 is too slow, so we first carve out the whole pages our
# desired bytes are located in, and then trim the result down to just those bytes.
head_fb0="dd if=/proc/$pid/mem bs=$page_size skip=$window_start_blocks count=$window_length_blocks 2>/dev/null | tail -c+$window_offset | head -c $window_bytes"
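# worked example of the carving arithmetic, assuming a hypothetical map
# address of 0x6fc00000 (the real address differs per boot):
#   skip_bytes           = 0x6fc00000 + 8     = 1874853896
#   window_start_blocks  = 1874853896 / 4096  = 457728  (whole pages skipped)
#   window_offset        = 1874853896 % 4096  = 8       (bytes into that page)
#   window_length_blocks = 2628288 / 4096 + 1 = 642     (pages read)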
;;
*)
echo "Unsupported reMarkable version: $rm_version."
echo "Please visit https://github.com/rien/reStream/ for updates."
exit 1
;;
esac
# technical parameters
loop_wait="true"
loglevel="info"
fallback_to_gzip() {
echo "Falling back to gzip, your experience may not be optimal."
echo "Go to https://github.com/rien/reStream/#sub-second-latency for a better experience."
compress="gzip"
decompress="gzip -d"
sleep 2
}
# check if lz4 is present on remarkable
if ssh_cmd "[ -f /opt/bin/lz4 ]"; then
compress="/opt/bin/lz4"
elif ssh_cmd "[ -f ~/lz4 ]"; then
compress="\$HOME/lz4"
fi
# gracefully degrade to gzip if lz4 is not present on remarkable or host
if [ -z "$compress" ]; then
echo "Your remarkable does not have lz4."
fallback_to_gzip
elif ! lz4 -V >/dev/null; then
echo "Your host does not have lz4."
fallback_to_gzip
else
decompress="lz4 -d"
fi
# use pv to measure throughput if desired, else we just pipe through cat
if $measure_throughput; then
if ! pv --version >/dev/null; then
echo "You need to install pv to measure data throughput."
exit 1
else
loglevel="error" # verbose ffmpeg output interferes with pv
host_passthrough="pv"
fi
else
host_passthrough="cat"
fi
# list of ffmpeg filters to apply
video_filters=""
# store extra ffmpeg arguments in $@
set --
# rotate 90 degrees if landscape=true
$landscape && video_filters="$video_filters,transpose=1"
# Scale and add padding if we are targeting a webcam because a lot of services
# expect a size of exactly 1280x720 (tested in Firefox, MS Teams, and Skype
# for Business). Send a PR if you can get a higher resolution working.
if $webcam; then
video_filters="$video_filters,format=pix_fmts=yuv420p"
video_filters="$video_filters,scale=-1:720"
video_filters="$video_filters,pad=1280:0:-1:0:#eeeeee"
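# note on the pad arguments: in ffmpeg's pad filter a 0 dimension keeps the
# input size and a negative offset centers the frame, so pad=1280:0:-1:0:#eeeeee
# centers the 720-high frame horizontally on a 1280-wide #eeeeee canvas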
fi
# set each frame presentation time to the time it is received
video_filters="$video_filters,setpts=(RTCTIME - RTCSTART) / (TB * 1000000)"
# loop that keeps on reading and compressing, to be executed remotely
read_loop="while $head_fb0; do $loop_wait; done | $compress"
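# for illustration: on a reMarkable 1 with /opt/bin/lz4 installed, read_loop
# expands to something like
#   while dd if=/dev/fb0 count=1 bs=5271552 2>/dev/null; do true; done | /opt/bin/lz4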
set -- "$@" -vf "${video_filters#,}"
if [ "$output_path" = - ]; then
output_cmd=ffplay
window_title_option="-window_title $window_title"
else
output_cmd=ffmpeg
if [ "$format" != - ]; then
set -- "$@" -f "$format"
fi
set -- "$@" "$output_path"
fi
set -e # stop if an error occurs
# shellcheck disable=SC2086
ssh_cmd "$read_loop" \
| $decompress \
| $host_passthrough \
| "$output_cmd" \
-vcodec rawvideo \
-loglevel "$loglevel" \
-f rawvideo \
-pixel_format "$pixel_format" \
-video_size "$width,$height" \
$window_title_option \
-i - \
"$@"