Criteria-led review
Entries are strongest when they align clearly with the brief and show control over the chosen medium.
This public guide explains the criteria-led review approach used across Crenova competitions. It is designed to help creators understand what judges look for, how entries are compared fairly, and which mistakes most often weaken an otherwise promising submission.
Creators should know what the platform values before submitting, including originality, clarity, and technical readiness.
Eligibility checks, category rules, and originality standards are part of fair review, not separate from it.
Not every competition uses exactly the same emphasis, but most entries are reviewed through a consistent set of lenses. These lenses help judges compare different works without reducing the process to popularity or surface polish alone.
Public judging guidance helps creators improve. It also reduces confusion around results because the review standard becomes visible before submission, not only after a decision is made.
These criteria are not a fixed point system for every challenge, but they reflect the main areas that creators should strengthen before submitting.
Judges look for work that feels owned by the creator. Originality can appear in perspective, concept, arrangement, language, tone, composition, or performance choices.
A technically polished piece can still be weak if it does not match the challenge. Strong entries interpret the prompt clearly and intentionally.
Execution is about how well the idea is carried out. In writing this may be structure and language. In visual work it may be composition or finish. In speech it may be pace and articulation.
Judges need clear files, readable formatting, stable recordings, and enough detail to assess the work fairly. Technical issues can block fair evaluation.
Judges do not review a speech in the same way they review a photograph. The examples below explain how the same core criteria can appear differently in each type of submission.
In visual and photographic entries, judges often look for idea clarity, composition, subject control, relevance to the theme, and whether the technical treatment supports the concept instead of distracting from it.
In written entries, strong signals include structure, originality of thought, language control, coherence, and whether the ending feels earned rather than abrupt or generic.
In spoken entries, judges notice articulation, confidence, pacing, listener engagement, and whether the participant can hold the idea together instead of depending only on memorized lines.
In performance entries, expression, timing, steadiness, interpretation, and a complete presentation matter. Review quality falls quickly when recordings are rushed or difficult to hear.
Many disappointing results are not caused by lack of talent. They happen because the final submission does not show the talent clearly enough. These are common reasons strong ideas lose strength at review time.
Weak alignment with the brief. Judges usually reward direct, thoughtful interpretation. If they have to work too hard to find the link between the entry and the brief, the entry becomes less competitive.
Presentation flaws. Blurry uploads, noisy audio, poor cropping, bad lighting, or spelling errors do not always erase a good idea, but they reduce confidence in the final presentation.
Safe but underdeveloped ideas. Some entries use a safe idea without adding a clear point of view, fresh angle, or strong structure. Originality often lives in development, not only in topic choice.
Unfinished final touches. Dead space at the start or end, abrupt cuts, draft formatting, or an unclear description can make a final entry feel incomplete even if the core work is strong.
Judging quality is not only about artistic taste. Fair review also means checking whether entries meet the competition rules, stay within the correct age group, and respect originality and community safety standards.
Entries are easier to compare when they sit in the correct age group and correct medium. This helps review stay fair to both younger and older participants.
Plagiarized, unsafe, or rule-breaking content should not compete on the same footing as original entries that followed the brief honestly.
If creators review the areas above honestly, they usually improve the final entry before the judging stage even begins.
Judging guidance is most useful when read alongside preparation and category guidance.
Open creator guide: improve titles, descriptions, files, originality checks, and your final pre-submit workflow.
Explore categories: see how different categories and age groups are organized so you can choose a better fit.
Read terms: review the rules around originality, prohibited content, account use, and the platform's broader policy standards.
Entries usually perform better when creators know how they will be reviewed and prepare their work around that reality.