# Examples

Complete end-to-end examples using the Monid CLI.

## Search Twitter for recent AI posts

```bash
# 1. Discover Twitter endpoints
monid discover -q "twitter posts"

# 2. Inspect the tweet scraper to see its input schema
monid inspect -p apify -e /apidojo/tweet-scraper

# 3. Run with a small limit
monid run -p apify -e /apidojo/tweet-scraper \
  -i '{"searchTerms":["AI"],"maxItems":10}'
# → Run ID: 01HXYZ...

# 4. Get the results
monid runs get -r 01HXYZ... -o ai_tweets.json
```
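In a script, the run ID can be captured in a variable instead of copied by hand. A minimal sketch, assuming `monid run` prints its ID in the `Run ID: ...` format shown above — the sample line below stands in for real command output:

```shell
# Stand-in for the line `monid run` prints (assumption: real output
# follows the "Run ID: <id>" format shown above)
run_output='Run ID: 01HXYZEXAMPLE'

# Strip the "Run ID: " label with shell parameter expansion
run_id=${run_output#Run ID: }
echo "$run_id"
# → 01HXYZEXAMPLE
```

With the ID in `$run_id`, the follow-up fetch becomes `monid runs get -r "$run_id" -o out.json` with no manual copying.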
## Compare data across platforms (Twitter + LinkedIn)

Break the request into separate queries, one per platform.

```bash
# 1. Discover endpoints for each platform
monid discover -q "twitter posts"
monid discover -q "linkedin posts"

# 2. Inspect each to learn its input schema
monid inspect -p apify -e /apidojo/tweet-scraper
monid inspect -p apify -e /harvestapi/linkedin-post-search

# 3. Run both
monid run -p apify -e /apidojo/tweet-scraper \
  -i '{"searchTerms":["AI"],"maxItems":10}'
# → Run ID: 01H_TW...

monid run -p apify -e /harvestapi/linkedin-post-search \
  -i '{"keywords":"AI","maxResults":10}'
# → Run ID: 01H_LI...

# 4. Get results for each
monid runs get -r 01H_TW... -o twitter_ai.json
monid runs get -r 01H_LI... -o linkedin_ai.json
```
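Once both files are downloaded, the records can be tagged with their source platform and merged for side-by-side comparison. A sketch under stated assumptions: each file holds a JSON array of objects, and the `text` field in the stand-in data is illustrative only — the real schema depends on each scraper. In practice the two files come from the `monid runs get` commands above.

```shell
# Stand-in result files (in practice, the output of `monid runs get`;
# the structure and field names here are assumptions)
echo '[{"text":"example tweet"}]'        > twitter_ai.json
echo '[{"text":"example linkedin post"}]' > linkedin_ai.json

# Tag each record with its platform, then write one combined file
python3 - <<'EOF'
import json

merged = []
for platform, path in [("twitter", "twitter_ai.json"),
                       ("linkedin", "linkedin_ai.json")]:
    for item in json.load(open(path)):
        item["platform"] = platform  # added field for comparison
        merged.append(item)

json.dump(merged, open("combined_ai.json", "w"), indent=2)
print(len(merged), "records merged")
EOF
# → 2 records merged
```

The heredoc keeps the merge in one shell step, mirroring the `cat << 'EOF'` pattern used elsewhere in these examples.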
## Scrape Google Maps reviews

```bash
# 1. Discover Google Maps endpoints
monid discover -q "google maps reviews"

# 2. Inspect the scraper
monid inspect -p apify -e /damilo/google-maps-scraper

# 3. Create an input file for complex parameters
cat > params.json << 'EOF'
{
  "searchStringsArray": ["restaurants in San Francisco"],
  "maxReviews": 10,
  "language": "en"
}
EOF

# 4. Run with input from file
monid run -p apify -e /damilo/google-maps-scraper -i @params.json
# → Run ID: 01HGEO...

# 5. Get the results
monid runs get -r 01HGEO... -o reviews.json
```
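Hand-written input files are easy to get wrong, and a malformed one wastes a run. A quick stdlib-only check with Python's `json.tool` (the file contents repeat step 3 above; nothing here calls `monid` itself):

```shell
# Recreate the input file from step 3
cat > params.json << 'EOF'
{
  "searchStringsArray": ["restaurants in San Francisco"],
  "maxReviews": 10,
  "language": "en"
}
EOF

# json.tool exits non-zero on invalid JSON, so the run step can be gated on it
python3 -m json.tool params.json > /dev/null && echo "params.json is valid JSON"
# → params.json is valid JSON
```

Chaining the check before `monid run ... -i @params.json` with `&&` ensures a broken file never reaches the scraper.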
