AI literacy and prohibited practices
The first AI Act obligations, covering staff competency and the Article 5 prohibitions, are already in force.
A practical decision framework for organizations using AI. Answer a few short questions, receive a recommendation, and get a ready-to-use 0-30-90 day action plan.
Key dates under the AI Act (Regulation (EU) 2024/1689) that belong in your implementation timeline.
2 February 2025: Competency obligations and the Article 5 prohibitions are in force.
2 August 2025: Requirements for general-purpose AI models begin to apply.
2 August 2026: Most provisions, including transparency obligations, fully apply.
2 August 2027: An additional stage applies to selected high-risk systems.
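For planning purposes, the milestones above can be kept in a small lookup. A minimal sketch in Python, assuming the commonly cited application dates from Regulation (EU) 2024/1689 (verify against the Official Journal before relying on them):

```python
from datetime import date

# AI Act (Regulation (EU) 2024/1689) application milestones.
# Dates reflect the Regulation's transitional provisions; verify
# against the Official Journal text before relying on them.
AI_ACT_MILESTONES = {
    date(2025, 2, 2): "AI literacy duties and Article 5 prohibitions apply",
    date(2025, 8, 2): "Obligations for general-purpose AI models apply",
    date(2026, 8, 2): "Most provisions, including transparency obligations, apply",
    date(2027, 8, 2): "Extended deadline for certain embedded high-risk systems",
}

def upcoming_milestones(today: date) -> list[str]:
    """Return descriptions of milestones that have not yet passed."""
    return [desc for d, desc in sorted(AI_ACT_MILESTONES.items()) if d >= today]
```

A helper like this makes it easy to surface the next applicable deadline in internal dashboards or reports.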
A step-by-step checklist of the actions needed to achieve and maintain compliance. Items marked as cyclical must be repeated regularly.
Identify and document all AI systems used in your organization, from chatbots to recommendation engines. Update the register at least annually. (Art. 6 AI Act)
Verify that none of your AI systems perform prohibited practices: social scoring, subliminal manipulation, or mass biometric recognition. (Art. 5 AI Act)
Conduct a risk assessment for each AI system; for high-risk systems processing personal data, a DPIA is mandatory. (Art. 9 AI Act + Art. 35 GDPR)
Implement AI-generated content labeling, inform users when they interact with AI, and disclose deepfakes. Update whenever the system changes. (Art. 50 AI Act)
Provide documented training for staff working with AI: model limitations, risks, and human oversight principles. Repeat at least annually. (Art. 4 AI Act)
Ensure a lawful basis for AI data processing, fulfill information duties, handle data subject rights, and maintain retention policies. (Art. 6, 13, 14, 15–22 GDPR)
Implement human review mechanisms for AI decisions, decision logging, incident procedures, and the ability for human intervention. (Art. 14 AI Act)
Maintain AI interfaces compliant with WCAG 2.1 AA: contrast, keyboard navigation, ARIA, and accessible forms. Audit regularly. (EN 301 549 / EAA)
Maintain up-to-date technical documentation for AI systems: architecture, training data, metrics, and testing procedures. Update on every significant change. (Art. 11, 12 AI Act)
Conduct a comprehensive internal audit covering all AI Act, GDPR, and WCAG requirements; report findings and implement recommendations. (Art. 9, 61 AI Act)
Present the audit results and risk assessment to the board, and obtain formal approval of the AI strategy and compliance budget. (Art. 26 AI Act)
Answer 11 questions about your organization and receive a personalized report with a risk assessment, readiness level, and a 0-30-90 day action plan.
Start the readiness wizard
Complete the step-by-step framework and run GDPR and WCAG scans to reduce risk and scale AI deployments safely.
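As a starting point for the AI system register described above, a machine-readable entry might look like the sketch below; the field names are illustrative, not mandated by the AI Act:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AISystemRecord:
    """One entry in an internal AI system register (illustrative fields)."""
    name: str
    purpose: str
    risk_level: str                 # e.g. "minimal", "limited", "high"
    processes_personal_data: bool
    dpia_completed: bool
    last_reviewed: date

    def review_overdue(self, today: date, max_age_days: int = 365) -> bool:
        """Flag entries not reviewed within the annual update cycle."""
        return (today - self.last_reviewed).days > max_age_days

# Example register with a single entry (hypothetical system).
register = [
    AISystemRecord("support-chatbot", "customer service triage",
                   "limited", True, False, date(2025, 3, 1)),
]
```

Keeping the register as structured data makes the annual update cycle and the DPIA gap check easy to automate.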
Below you will find legal acts underpinning this form and answers to the most common practical questions.
Official EU and Polish legal acts worth including in your implementation documentation.
Most common operational and legal questions when implementing AI in a company.
No. The AI Act and GDPR apply in parallel. If your AI process involves personal data, you must also meet GDPR requirements (including legal basis, transparency notices, data subject rights, and security).
A DPIA is required when processing is likely to create a high risk to individuals' rights and freedoms, especially in cases of profiling, large-scale monitoring, or automated decision-making. In such cases, the DPIA should be completed before rollout.
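The triggers above can be expressed as a simple pre-rollout screening check. A minimal sketch, where the function name and flags are illustrative rather than taken from any official tooling:

```python
def dpia_required(profiling: bool,
                  large_scale_monitoring: bool,
                  automated_decisions_with_legal_effect: bool) -> bool:
    """Screening check: any single high-risk trigger means a DPIA
    should be completed before rollout (Art. 35 GDPR)."""
    return any([profiling,
                large_scale_monitoring,
                automated_decisions_with_legal_effect])
```

For example, a recommendation engine that profiles users would return `True` and should be routed to the DPIA process before launch.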
As a rule, yes. When a user interacts with AI or consumes generated/synthetic content, you should provide clear notice and appropriate labeling under Article 50 of the AI Act.
Not every use, but it is critical when a decision is solely automated and produces legal or similarly significant effects. In that case, additional safeguards, human intervention, and an appeal path are required.
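That two-part test, solely automated plus legal or similarly significant effect, can be captured as a quick screening helper (names are illustrative):

```python
def art22_safeguards_required(solely_automated: bool,
                              legal_or_similar_effect: bool) -> bool:
    """Art. 22 GDPR applies when a decision is made without meaningful
    human involvement AND produces legal or similarly significant
    effects; then human intervention and an appeal path are needed."""
    return solely_automated and legal_or_similar_effect
```

Note that adding meaningful human review to the decision flow flips the first flag and takes the process outside the strict Art. 22 regime, though general GDPR duties still apply.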
Yes. If content or interfaces are available to end users, their accessibility must meet WCAG 2.1 AA and relevant sector-specific rules (EAA/PAD/public sector).
Yes, although typically of a different type. Governance, AI literacy, risk analysis, and documentation still apply. Consumer-accessibility obligations may be smaller, but GDPR still applies where personal data is involved.
No. Both the AI Act and GDPR require a continuous approach: updating registers, reviewing risks, monitoring incidents, and refreshing policies as models and processes change.
In the public sector, it is a standard obligation. For EAA/PAD-covered services, you must ensure accessibility compliance and provide transparent information on compliance level and barrier-removal plans.
No. Responsibility remains with the organization using AI. Vendor agreements (DPA/SCC, role and liability split) are important, but they do not replace your own compliance program.
No. This is a screening tool for quick risk mapping and prioritization. Final compliance assessment should be validated by legal/compliance and technical experts based on your organization's actual data.