Functional Usage
- Q: How do I upload data into the system?
  A: Use the Discovery Module's source connectors to connect to and load data from 15+ supported sources, including databases, data warehouses, and data lakes (see the connector sketch after this list).
- Q: Can I schedule recurring tasks?
  A: Yes. Pipelines can be triggered on a schedule, by an event, or through an API call (see the API trigger sketch after this list).
- Q: How do I generate reports or visualizations?
  A: Use the built-in visualization features to create customizable charts without writing code.
- Q: Can I query data in plain language?
  A: Yes. The NEXEN AI Chatbot supports natural language database queries.
- Q: Can I use DANGLES for both experimental and production-grade workflows?
  A: Yes. DANGLES handles quick prototypes as well as large-scale, mission-critical data and ML workflows in production.
- Q: Can I run custom Python scripts, SQL queries, or containers in a pipeline?
  A: Yes. You can execute custom code blocks, parameterized SQL scripts, or Docker containers as part of your pipeline logic (see the custom step sketch after this list).
- Q: What happens if a pipeline fails midway?
  A: DANGLES shows detailed error logs and GenAI-powered failure explanations, and lets you retry failed steps or skip them according to your policy (see the retry sketch after this list).
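Connector sketch: source connectors are configured inside the Discovery Module itself, so the snippet below is only a minimal sketch of the underlying pattern for one supported source type, loading a database table into a pandas DataFrame with SQLAlchemy. The connection URL, table name, and library choice are placeholders and assumptions, not DANGLES APIs.

```python
# Minimal sketch: load one supported database source into a DataFrame.
# The connection URL and table name are placeholders; in the Discovery Module
# you would supply the same details through the connector configuration instead.
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string for a PostgreSQL source.
engine = create_engine("postgresql://user:password@db-host:5432/analytics")

# Pull the source table into a DataFrame for downstream pipeline steps.
orders = pd.read_sql_table("orders", con=engine)
print(orders.head())
```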
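API trigger sketch: a hypothetical example of starting a pipeline run over HTTP with the requests library. The endpoint path, pipeline name, token, and payload fields are assumptions rather than the documented DANGLES API; substitute the values from your own deployment.

```python
# Hypothetical example of triggering a pipeline run via an HTTP API call.
# The URL, token, and payload shape below are placeholders, not documented
# DANGLES endpoints; scheduled and event-based triggers are set up in the UI.
import requests

DANGLES_URL = "https://dangles.example.com/api/v1/pipelines/churn-model/runs"  # placeholder
API_TOKEN = "replace-with-your-token"  # placeholder

response = requests.post(
    DANGLES_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"parameters": {"run_date": "2024-01-01"}},  # example run parameters
    timeout=30,
)
response.raise_for_status()
print("Run started:", response.json())
```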
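Custom step sketch: the kind of logic a custom code block and a parameterized SQL script might contain. The function name, column names, and SQL parameters are illustrative; how a step is registered with DANGLES (UI, decorator, or config) depends on your deployment and is not shown here.

```python
# Sketch of logic a custom Python code block in a pipeline step might run.
# Only the function body is ordinary Python; step registration is handled
# by the platform and is not part of this sketch.
import pandas as pd


def deduplicate_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Example transformation: keep the latest record per order_id."""
    return (
        df.sort_values("updated_at")
          .drop_duplicates(subset="order_id", keep="last")
          .reset_index(drop=True)
    )


# A parameterized SQL script for a SQL step; :start_date and :end_date would
# be bound at run time from pipeline parameters.
DAILY_REVENUE_SQL = """
SELECT order_date, SUM(amount) AS revenue
FROM orders
WHERE order_date BETWEEN :start_date AND :end_date
GROUP BY order_date
"""
```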
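Retry sketch: a stand-alone illustration of the retry-or-skip behaviour described for failed steps. DANGLES applies this kind of policy per step for you; the helper below, including its name and parameters, is an assumption used only to show the idea, not the platform's policy format.

```python
# Illustrative retry-or-skip helper for a failing pipeline step.
# This is a generic sketch of the behaviour, not DANGLES configuration.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def run_step(step, *, retries: int = 3, delay_seconds: float = 5.0, skip_on_failure: bool = False):
    """Run a step callable, retrying on failure and optionally skipping it."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception:
            log.exception("Step failed (attempt %d/%d)", attempt, retries)
            if attempt < retries:
                time.sleep(delay_seconds)
    if skip_on_failure:
        log.warning("All retries exhausted; skipping step per policy.")
        return None
    raise RuntimeError("Step failed after all retries.")


# Example usage with a trivially succeeding step.
run_step(lambda: print("extract complete"), retries=2, skip_on_failure=True)
```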