This is usually referred to as "syndicated content", and the common protocol for it is RSS.
There is a good reason for duplicating content via RSS rather than automated reposting. Google and other search engines need to know the origin of any original content to attribute it correctly. They cannot rely on metadata or self-reported schema, so they fall back on the date of initial crawl. If you post original content to multiple sites at once, Google can potentially pick any one of them as the "origin" and attribute all the others as "duplicates". In all likelihood it would get pretty confused about which site is indeed the main site. Also, all satellite sites are likely to trigger penalties for a high proportion of duplicate content. RSS avoids that problem by loading content from the origin and attributing it in a verifiable way - users see the content on-site, but Google knows that it's loaded from elsewhere.
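To make the attribution mechanism concrete, here's a minimal sketch of how an RSS 2.0 item carries its origin: the `<link>` and `<guid>` elements both point back to the source article, so a satellite site that renders the feed keeps the canonical URL intact. The feed and URLs below are made-up examples, not any real site's feed.

```python
import xml.etree.ElementTree as ET

# A hypothetical RSS 2.0 feed as the origin site might publish it.
FEED = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Origin Site</title>
    <link>https://origin.example.com/</link>
    <item>
      <title>Example article</title>
      <link>https://origin.example.com/posts/example-article</link>
      <guid isPermaLink="true">https://origin.example.com/posts/example-article</guid>
    </item>
  </channel>
</rss>"""

def items(feed_xml):
    """Return (title, link) pairs for every item in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in items(FEED):
    # A satellite site would render the title locally but keep the link
    # pointing at the origin, so attribution stays with the original post.
    print(f"{title} -> {link}")
```

The key point is that the link always resolves to the origin domain, which is what lets crawlers tell "loaded from elsewhere" apart from "duplicated in place".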
Unfortunately RSS and content syndication are not very popular anymore, in part because of specification wars, but mostly because modern social networks introduced so much user-curated sharing that reading syndicated channels is no longer as engaging as reading, say, a Twitter feed. There was a good article on this in Vice last year - https://www.vice.com/en_us/article/a3mm4z/the-rise-and-demise-of-rss
So, all in all... I would probably look into engaging your satellite sites' communities to cross-share interesting content with added comments and direct links, rather than introducing some form of automated double-posting.