Compare commits


11 Commits

250feb54cd  Merge pull request 'integration-test' (#9) from integration-test into main
    Philipp Hofer, 2025-10-29 12:47:45 +01:00, Reviewed-on: #9
    CI/CD Pipeline: test (push) successful in 1m23s; deploy (push) successful in 1m20s

9403b19c71  Merge branch 'main' into integration-test
    Philipp Hofer, 2025-10-29 12:47:30 +01:00
    CI/CD Pipeline: test (push) cancelled; deploy (push) cancelled

1e9fea17e3  fix ci
    2025-10-29 12:46:07 +01:00
    CI/CD Pipeline: test (push) successful in 1m20s; deploy (push) cancelled

2e11261179  integration-test (#8)
    Philipp Hofer, 2025-10-29 12:18:09 +01:00, Reviewed-on: #8
    Co-authored-by: Philipp Hofer <philipp.hofer@mag.linz.at>
    CI/CD Pipeline: test (push) failing after 53s; deploy (push) skipped

704708e37f  push
    2025-10-29 12:17:08 +01:00
    CI/CD Pipeline: test (push) failing after 46s; deploy (push) skipped

b833b2a27a  Update tests/integration.rs
    Philipp Hofer, 2025-10-29 12:13:46 +01:00
    CI/CD Pipeline: test (push) successful in 1m17s; deploy (push) skipped

fb7674eac1  add integration test
    Philipp Hofer, 2025-10-29 12:11:58 +01:00
    CI/CD Pipeline: test (push) successful in 1m33s; deploy (push) skipped

ad759e1ca9  add tests for broadcasts
    Philipp Hofer, 2025-10-29 11:55:14 +01:00
    CI/CD Pipeline: test (push) successful in 3m46s; deploy (push) successful in 6m32s

1cab4d0bf5  restructure + first tests
    Philipp Hofer, 2025-10-29 11:30:16 +01:00
    CI/CD Pipeline: test (push) failing after 3m23s; deploy (push) skipped

e1e21f8837  Revert "add test structs"
    Philipp Hofer, 2025-10-16 11:53:11 +02:00
    This reverts commit 14ee4d2767.
    CI/CD Pipeline: test (push) successful in 2m0s; deploy (push) successful in 1m45s

5ee6a679c5  Revert "add tracing + custom error type"
    2025-10-16 11:53:06 +02:00
    This reverts commit 4702017914.
12 changed files with 312 additions and 252 deletions


@@ -21,8 +21,8 @@ jobs:
- name: Build
run: cargo build
- name: Backend tests
run: cargo test --verbose
- name: Tests
run: cargo test --verbose -- --ignored
deploy:
runs-on: ubuntu-latest

Cargo.lock (generated), 15 lines changed

@@ -723,10 +723,9 @@ dependencies = [
"chrono",
"quick-xml",
"reqwest",
"serde",
"serde_json",
"thiserror",
"tokio",
"tracing",
]
[[package]]
@@ -1292,21 +1291,9 @@ checksum = "784e0ac535deb450455cbfa28a6f0df145ea1bb7ae51b821cf5e7927fdcfbdd0"
dependencies = [
"log",
"pin-project-lite",
"tracing-attributes",
"tracing-core",
]
[[package]]
name = "tracing-attributes"
version = "0.1.30"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "81383ab64e72a7a8b8e13130c49e3dab29def6d0c7d76a03087b3cf71c5c6903"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "tracing-core"
version = "0.1.34"


@@ -10,5 +10,4 @@ reqwest = { version = "0.12", features = ["stream", "json", "rustls-tls"], defau
serde_json = "1"
chrono = "0.4"
quick-xml = "0.38"
thiserror = "2"
tracing = "0.1"
serde = { version = "1", features = ["derive"] }

README.md (new file), 19 lines

@@ -0,0 +1,19 @@
# Plöyer (Ö1 player)
Creates an RSS feed for some Ö1 journals.
## Motivation
This project started with a simple goal: I wanted to listen to a few Ö1 episodes regularly. I had two requirements:
1. Be able to live-stream (even if I want to start a few minutes after the episode has started).
2. Track the progress to be able to continue later.
ORF [already provides a similar feed](https://sound.orf.at/podcast/oe1/oe1-journale), but it creates the entry only ~15 minutes *after* the episode has ended. Thus, if it's 7:10 and I want to listen to e.g. the `Morgenjournal`, which starts at 7, I can't do that with the ORF feed.
Another option is [the live player](https://oe1.orf.at/player/live). When I start playing the 7 am episode at 7:10, it stops at 7:20 (at least in October '25 and on my smartphone), and I can only listen again after refreshing the page; but then I lose the current playtime and have to seek back to where I was. Furthermore, if I only manage to listen to, say, half an episode, chances are high that the position is not stored, because I refreshed the page or similar.
Over some months I tried a bunch of different options: web player plus the official feed; streaming the episodes to an mp3 file on my server, then serving that file; just redirecting to the current episode and letting the browser's default mp3 player handle everything; writing my own JavaScript player; and using an existing JavaScript player library to stream the episodes consistently. This project is what I eventually landed on; it has been quite stable for a few months (since September '25). It creates an RSS "podcast" feed, which I consume on my phone with AntennaPod.
## Data
At `https://audioapi.orf.at/oe1/api/json/current/broadcasts` you can find the broadcasts for the last 7 days, including the current day. Each broadcast has a `href` URL attached. Querying that URL returns details about the broadcast; e.g. under `streams` there is a `loopStreamId`. You can then stream the episode from `https://loopstream01.apa.at/?channel=oe1&shoutcast=0&id={id}`.
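The second step of the URL scheme above can be sketched in a few lines of Rust; the `loopStreamId` value used here is a made-up placeholder, not a real stream id:

```rust
// Builds the loopstream URL for a given loopStreamId, following the scheme
// described above. The id passed in main() is an illustrative placeholder.
fn stream_url(loop_stream_id: &str) -> String {
    format!("https://loopstream01.apa.at/?channel=oe1&shoutcast=0&id={loop_stream_id}")
}

fn main() {
    println!("{}", stream_url("example.mp3"));
}
```

In the real service, the id comes from the `streams` array of the per-broadcast JSON fetched via the broadcast's `href`.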

src/fetch/broadcast.rs (new file), 112 lines

@@ -0,0 +1,112 @@
use crate::Backend;
use chrono::DateTime;
use serde::{Deserialize, Serialize};
#[derive(Debug, Serialize, Deserialize)]
struct InferenceStream {
#[serde(rename = "loopStreamId")]
pub loop_stream_id: String,
#[serde(rename = "start")]
pub start_timestamp: i64,
}
#[derive(Debug, Serialize, Deserialize)]
struct InferenceData {
pub title: String,
pub streams: Vec<InferenceStream>,
}
impl Backend {
pub(crate) async fn get_broadcast(
&self,
url: String,
) -> Result<Option<Broadcast>, Box<dyn std::error::Error>> {
let data: InferenceData = match self {
Backend::Prod => reqwest::get(&url).await?.json::<InferenceData>().await?,
#[cfg(test)]
Backend::Test => InferenceData {
title: "test-title".into(),
streams: vec![InferenceStream {
loop_stream_id: "test.mp3".into(),
start_timestamp: 1761734636000,
}],
},
};
Broadcast::from_data(url, data).await
}
}
#[derive(Debug, Clone)]
pub(crate) struct Broadcast {
pub(crate) url: String,
pub(crate) media_url: String,
pub(crate) title: String,
pub(crate) timestamp: DateTime<chrono::Utc>,
}
impl Broadcast {
async fn from_data(
url: String,
data: InferenceData,
) -> Result<Option<Self>, Box<dyn std::error::Error>> {
if data.streams.is_empty() {
return Ok(None);
}
assert_eq!(data.streams.len(), 1);
let stream = &data.streams[0];
let media_url = format!(
"https://loopstream01.apa.at/?channel=oe1&shoutcast=0&id={}",
stream.loop_stream_id
);
let Some(timestamp) = DateTime::from_timestamp(stream.start_timestamp / 1000, 0) else {
return Err(format!(
"broadcastDay in {url} not in a valid format (unix timestamp): {}",
stream.start_timestamp
)
.into());
};
Ok(Some(Self {
url,
media_url,
title: data.title,
timestamp,
}))
}
}
impl PartialEq for Broadcast {
fn eq(&self, other: &Self) -> bool {
self.url == other.url
}
}
#[cfg(test)]
mod tests {
use crate::Backend;
use chrono::{TimeZone, Utc};
#[tokio::test]
async fn happy() {
let backend = Backend::Test;
let actual = backend
.get_broadcast("test-url".into())
.await
.unwrap()
.unwrap();
assert_eq!(&actual.url, "test-url");
assert_eq!(
&actual.media_url,
"https://loopstream01.apa.at/?channel=oe1&shoutcast=0&id=test.mp3"
);
assert_eq!(&actual.title, "test-title");
assert_eq!(
actual.timestamp,
Utc.with_ymd_and_hms(2025, 10, 29, 10, 43, 56).unwrap()
);
}
}

src/fetch/mod.rs (new file), 2 lines

@@ -0,0 +1,2 @@
pub(super) mod broadcast;
pub(super) mod overview;

src/fetch/overview.rs (new file), 98 lines

@@ -0,0 +1,98 @@
use crate::Backend;
use serde::{Deserialize, Serialize};
#[derive(Debug, Serialize, Deserialize)]
struct InferenceResponse {
pub title: String,
pub href: String,
}
#[derive(Debug, Serialize, Deserialize)]
struct InferenceDay {
pub broadcasts: Vec<InferenceResponse>,
}
type InferenceData = Vec<InferenceDay>;
impl Backend {
/// Gets a list of all broadcasts
pub(crate) async fn get_broadcasts(
&self,
filter_titles: &[String],
) -> Result<Vec<String>, Box<dyn std::error::Error>> {
let data: InferenceData = match self {
Backend::Prod => {
let url = "https://audioapi.orf.at/oe1/api/json/current/broadcasts";
reqwest::get(url).await?.json::<InferenceData>().await?
}
#[cfg(test)]
Backend::Test => {
vec![InferenceDay {
broadcasts: vec![InferenceResponse {
title: "test-title".into(),
href: "test-href".into(),
}],
}]
}
};
get_broadcasts(data, filter_titles).await
}
}
/// Returns a list of urls of all filtered broadcasts.
async fn get_broadcasts(
days: InferenceData,
filter_titles: &[String],
) -> Result<Vec<String>, Box<dyn std::error::Error>> {
let mut ret: Vec<String> = Vec::new();
for day in days {
for broadcast in day.broadcasts {
if filter_titles.is_empty() || filter_titles.contains(&broadcast.title) {
ret.push(broadcast.href);
}
}
}
Ok(ret)
}
#[cfg(test)]
mod tests {
use crate::Backend;
#[tokio::test]
async fn happy() {
let backend = Backend::Test;
let actual = backend.get_broadcasts(&[]).await.unwrap();
assert_eq!(actual, vec!["test-href"]);
}
#[ignore]
#[tokio::test]
async fn happy_prod() {
let backend = Backend::Prod;
let actual = backend.get_broadcasts(&[]).await.unwrap();
assert!(actual.len() > 200);
assert!(actual[0].starts_with("https://audioapi.orf.at/oe1/api/json/4.0/broadcast/"))
}
#[tokio::test]
async fn filter_no_result() {
let backend = Backend::Test;
let actual = backend.get_broadcasts(&["no-match".into()]).await.unwrap();
assert!(actual.is_empty());
}
#[tokio::test]
async fn filter_result() {
let backend = Backend::Test;
let actual = backend
.get_broadcasts(&["test-title".into()])
.await
.unwrap();
assert_eq!(actual, vec!["test-href"]);
}
}


@@ -1,12 +1,10 @@
mod fetch;
mod rss;
mod web;
use chrono::DateTime;
use serde_json::Value;
use fetch::broadcast::Broadcast;
use std::sync::Arc;
use thiserror::Error;
use tokio::{net::TcpListener, sync::RwLock};
use tracing::warn;
pub async fn start(
title: String,
@@ -14,9 +12,10 @@ pub async fn start(
desc: String,
filter_titles: Vec<String>,
listener: TcpListener,
backend: Backend,
) -> Result<(), Box<dyn std::error::Error>> {
let state = Arc::new(RwLock::new(
LiveFeed::new(title, link, desc, filter_titles)
Feed::new(title, link, desc, filter_titles, backend)
.await
.unwrap(),
));
@@ -26,76 +25,49 @@ pub async fn start(
Ok(())
}
#[derive(Error, Debug)]
enum FetchError {
#[error("error fetching url")]
Fetching(reqwest::Error),
#[error("error parsing json")]
JsonParsing(reqwest::Error),
pub enum Backend {
Prod,
#[cfg(test)]
Test,
}
trait Feed {
async fn fetch(&mut self) -> Result<(), FetchError>;
}
#[cfg(test)]
struct TestFeed {
episodes: Vec<Broadcast>,
pub(crate) amount_fetch_calls: usize,
}
#[cfg(test)]
impl Default for TestFeed {
fn default() -> Self {
Self {
episodes: vec![Broadcast::test()],
amount_fetch_calls: 0,
}
}
}
#[cfg(test)]
impl rss::ToRss for TestFeed {
fn title(&self) -> &str {
"Test RSS Title"
}
fn link(&self) -> &str {
"https://test.rss"
}
fn desc(&self) -> &str {
"Test RSS Desc"
}
fn episodes(&self) -> &Vec<crate::Broadcast> {
&self.episodes
}
}
#[cfg(test)]
impl Feed for TestFeed {
async fn fetch(&mut self) -> Result<(), FetchError> {
self.amount_fetch_calls += 1;
Ok(())
}
}
struct LiveFeed {
struct Feed {
episodes: Vec<Broadcast>,
title: String,
link: String,
desc: String,
filter_titles: Vec<String>,
backend: Backend,
}
impl Feed for LiveFeed {
async fn fetch(&mut self) -> Result<(), FetchError> {
let broadcasts = self.get_all_broadcasts().await?;
impl Feed {
async fn new(
title: String,
link: String,
desc: String,
filter_titles: Vec<String>,
backend: Backend,
) -> Result<Self, Box<dyn std::error::Error>> {
let mut ret = Self {
episodes: Vec::new(),
title,
link,
desc,
filter_titles,
backend,
};
ret.fetch().await?;
Ok(ret)
}
async fn fetch(&mut self) -> Result<(), Box<dyn std::error::Error>> {
let broadcasts = self.backend.get_broadcasts(&self.filter_titles).await?;
for broadcast in broadcasts {
if !self.has_broadcast_url(&broadcast) {
if let Some(broadcast) = Broadcast::from_url(broadcast).await.unwrap() {
if let Some(broadcast) = self.backend.get_broadcast(broadcast).await.unwrap() {
self.episodes.push(broadcast);
} else {
return Ok(());
@@ -107,27 +79,6 @@ impl Feed for LiveFeed {
Ok(())
}
}
impl LiveFeed {
async fn new(
title: String,
link: String,
desc: String,
filter_titles: Vec<String>,
) -> Result<Self, Box<dyn std::error::Error>> {
let mut ret = Self {
episodes: Vec::new(),
title,
link,
desc,
filter_titles,
};
ret.fetch().await?;
Ok(ret)
}
fn only_keep_last_episodes(&mut self) {
self.episodes = self.episodes.clone().into_iter().rev().take(10).collect();
@@ -137,111 +88,4 @@ impl LiveFeed {
fn has_broadcast_url(&self, url: &str) -> bool {
self.episodes.iter().any(|e| e.url == url)
}
async fn get_all_broadcasts(&self) -> Result<Vec<String>, FetchError> {
// List of broadcasts: https://audioapi.orf.at/oe1/api/json/current/broadcasts
//
// ^ contains link, e.g. https://audioapi.orf.at/oe1/api/json/4.0/broadcast/797577/20250611
let url = "https://audioapi.orf.at/oe1/api/json/current/broadcasts";
let data: Value = reqwest::get(url)
.await
.map_err(FetchError::Fetching)?
.json()
.await
.map_err(FetchError::JsonParsing)?;
let mut ret: Vec<String> = Vec::new();
if let Some(days) = data.as_array() {
for day in days {
if let Some(broadcasts) = day["broadcasts"].as_array() {
for broadcast in broadcasts {
let Some(title) = broadcast["title"].as_str() else {
warn!("Broadcast has no 'title' attribute, skipping broadcast");
continue;
};
let Some(href) = broadcast["href"].as_str() else {
warn!("Broadcast has no 'href' attribute, skipping broadcast");
continue;
};
if self.filter_titles.is_empty()
|| self.filter_titles.contains(&title.into())
{
{
ret.push(href.into());
}
}
}
}
}
}
Ok(ret)
}
}
#[derive(Clone)]
struct Broadcast {
url: String,
media_url: String,
title: String,
timestamp: DateTime<chrono::Utc>,
}
impl Broadcast {
#[cfg(test)]
fn test() -> Self {
use chrono::Local;
Self {
url: "test.url".into(),
media_url: "test.media.url".into(),
title: "Test title".into(),
timestamp: Local::now().into(),
}
}
async fn from_url(url: String) -> Result<Option<Self>, Box<dyn std::error::Error>> {
let data: Value = reqwest::get(&url).await?.json().await?;
let Some(streams) = data["streams"].as_array() else {
return Err(String::from("No 'streams' found").into());
};
if streams.is_empty() {
return Ok(None);
}
assert_eq!(streams.len(), 1);
let Some(id) = streams[0]["loopStreamId"].as_str() else {
return Err(String::from("No 'loopStreamId' found").into());
};
let media_url = format!("https://loopstream01.apa.at/?channel=oe1&shoutcast=0&id={id}");
let Some(title) = data["title"].as_str() else {
return Err(format!("{url} has no title").into());
};
let Some(timestamp) = data["start"].as_number() else {
return Err(format!("{url} has no start").into());
};
let Some(timestamp) = DateTime::from_timestamp(timestamp.as_i64().unwrap() / 1000, 0)
else {
return Err(format!(
"broadcastDay in {url} not in a valid format (unix timestamp): {timestamp}"
)
.into());
};
Ok(Some(Self {
url,
media_url,
title: title.into(),
timestamp,
}))
}
}
impl PartialEq for Broadcast {
fn eq(&self, other: &Self) -> bool {
self.url == other.url
}
}


@@ -1,3 +1,5 @@
use player::Backend;
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
let listener = tokio::net::TcpListener::bind("0.0.0.0:3029").await?;
@@ -12,6 +14,7 @@ async fn main() -> Result<(), Box<dyn std::error::Error>> {
"Ö1 Abendjournal".into(),
],
listener,
Backend::Prod,
)
.await?;


@@ -1,4 +1,4 @@
use crate::{Broadcast, LiveFeed};
use crate::{fetch::broadcast::Broadcast, Feed};
pub(super) trait ToRss {
fn title(&self) -> &str;
@@ -39,7 +39,7 @@ pub(super) trait ToRss {
}
}
impl ToRss for LiveFeed {
impl ToRss for Feed {
fn title(&self) -> &str {
&self.title
}


@@ -1,12 +1,10 @@
use crate::{rss::ToRss, Feed, LiveFeed};
use crate::{rss::ToRss, Feed};
use axum::{extract::State, http::HeaderMap, response::IntoResponse, routing::get, Router};
use reqwest::header;
use std::sync::Arc;
use tokio::{net::TcpListener, sync::RwLock};
async fn stream_handler<T: Feed + ToRss + Send>(
State(state): State<Arc<RwLock<T>>>,
) -> impl IntoResponse {
async fn stream_handler(State(state): State<Arc<RwLock<Feed>>>) -> impl IntoResponse {
state.write().await.fetch().await.unwrap();
let content = state.read().await.to_rss();
@@ -17,7 +15,7 @@ async fn stream_handler<T: Feed + ToRss + Send>(
}
pub(super) async fn serve(
state: Arc<RwLock<LiveFeed>>,
state: Arc<RwLock<Feed>>,
listener: TcpListener,
) -> Result<(), Box<dyn std::error::Error>> {
let app = Router::new()
@@ -28,39 +26,3 @@ pub(super) async fn serve(
Ok(())
}
//#[cfg(test)]
//mod tests {
// use crate::{rss::ToRss, TestFeed};
// use axum::http::StatusCode;
// use std::sync::Arc;
// use tokio::sync::RwLock;
//
// #[tokio::test]
// async fn serve_serves_rss() {
// let feed = Arc::new(RwLock::new(TestFeed::default()));
//
// let listener = tokio::net::TcpListener::bind("127.0.0.1:0").await.unwrap();
// let addr = listener.local_addr().unwrap();
//
// tokio::spawn(super::serve(feed.clone(), listener));
//
// let client = reqwest::Client::new();
// let resp = client.get(format!("http://{}", addr)).send().await.unwrap();
//
// assert_eq!(resp.status(), StatusCode::OK);
// assert_eq!(
// resp.headers()
// .get("content-type")
// .unwrap()
// .to_str()
// .unwrap(),
// "application/rss+xml"
// );
//
// let body = resp.text().await.unwrap();
// assert_eq!(body, feed.read().await.to_rss());
//
// assert_eq!(feed.read().await.amount_fetch_calls, 1);
// }
//}

tests/integration.rs (new file), 34 lines

@@ -0,0 +1,34 @@
use player::Backend;
#[ignore]
#[tokio::test]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
let listener = tokio::net::TcpListener::bind("0.0.0.0:0").await?;
let addr = listener.local_addr().unwrap();
// Start server in background task
tokio::spawn(async move {
if let Err(e) = player::start(
"Test Feed".into(),
"http://test.example".into(),
"Test description".into(),
vec!["Test Journal".into()],
listener,
Backend::Prod,
)
.await
{
eprintln!("Server failed to start: {e}");
}
});
// Allow server startup time
tokio::time::sleep(tokio::time::Duration::from_millis(3000)).await;
// Verify route responds with success status
let response = reqwest::get(format!("http://{addr}/")).await.unwrap();
assert_eq!(response.status(), 200);
Ok(())
}