web-infra-dev/midscene-skills · Software Engineering · Frontend and Design

ios-device-automation

Vision-driven iOS automation using natural language commands and screenshot analysis.

SkillJury keeps community verdicts, source metadata, and external repository signals in separate lanes so ranking data never pretends to be a review.

SkillJury verdict
Pending

No approved reviews yet

Would recommend
Pending

Waiting on enough review volume

Install signal
1

Weekly or total install activity from catalog data

Install command
npx skills add https://github.com/web-infra-dev/midscene-skills --skill ios-device-automation
SkillJury does not have enough approved reviews to publish a community verdict yet. Source metadata and repository proof are still available above.
SkillJury Signal Summary

As of Apr 30, 2026, ios-device-automation has 1 weekly install and 0 community reviews on SkillJury. Community votes currently stand at 0 upvotes and 0 downvotes. Source: web-infra-dev/midscene-skills. Canonical URL: https://skills.sh/web-infra-dev/midscene-skills/ios-device-automation.

Security audits
Gen Agent Trust Hub: PASS
Socket: PASS
Snyk: FAIL
About this skill
Vision-driven iOS automation using natural language commands and screenshot analysis.

CRITICAL RULES (violations will break the workflow):

Automate iOS devices using npx -y @midscene/ios@1. Each CLI command maps directly to an MCP tool: you (the AI agent) act as the brain, deciding which actions to take based on screenshots. Inside a single act call on iOS, Midscene can tap, double-tap, long-press, type, clear text, scroll, drag items, zoom with two fingers, press keys, and use system navigation such as Home or the app switcher, all while working from the current visible screen.

Midscene requires models with strong visual grounding capabilities. The model environment variables must be configured, either as system environment variables or in a .env file in the current working directory (Midscene loads .env automatically). Example configurations are provided for Gemini (Gemini-3-Flash), Qwen 3.5, and Doubao Seed 2.0 Lite. Commonly used models: Doubao Seed 2.0 Lite, Qwen 3.5, Zhipu GLM-4.6V, Gemini-3-Pro, Gemini-3-Flash. If the model is not configured, ask the user to set it up. See Model Configuration for supported providers.
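The description above says Midscene picks up model settings from a .env file in the working directory. As a minimal sketch only: the variable names below follow Midscene's OpenAI-compatible configuration convention, but the exact names and values for your provider should be checked against the Model Configuration documentation, and the endpoint and key here are placeholders.

```shell
# .env — a hypothetical sketch; Midscene loads this file automatically
# from the current working directory. Substitute your provider's real
# OpenAI-compatible endpoint, API key, and model name.
OPENAI_BASE_URL="https://your-provider.example.com/v1"
OPENAI_API_KEY="sk-your-key-here"
MIDSCENE_MODEL_NAME="gemini-3-flash"
```

With a file like this in place, running npx -y @midscene/ios@1 from the same directory should let Midscene find the model configuration without any system-level environment variables.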

Source description provided by the upstream listing. Community review signal and install context stay separate from this narrative layer.

Community reviews

Latest reviews

No community reviews yet. Be the first to review.

FAQ
What does ios-device-automation do?

Vision-driven iOS automation using natural language commands and screenshot analysis.

Is ios-device-automation good?

ios-device-automation does not have approved reviews yet, so SkillJury cannot publish a community verdict.

Which AI agents support ios-device-automation?

ios-device-automation currently lists compatibility with Gemini CLI and Skills CLI.

Is ios-device-automation safe to install?

ios-device-automation has been scanned by security audit providers tracked on SkillJury. Check the security audits section on this page for detailed results from Gen Agent Trust Hub, Socket.dev, and Snyk.

What are alternatives to ios-device-automation?

Skills in the same category include grimoire-morpho-blue, conversation-memory, second-brain-ingest, zai-tts.

How do I install ios-device-automation?

Run the following command to install ios-device-automation: npx skills add https://github.com/web-infra-dev/midscene-skills --skill ios-device-automation

Related skills

More from web-infra-dev/midscene-skills

Alternatives in Software Engineering