
DSA POLICY
CONCERNING THE POSTING AND VERIFICATION OF USER CONTENT

Definitions

  1. Social Features – Social Features should be understood as the functionalities of the respective Portal which are used by the Service Provider to interact with Users or to allow Users to interact with each other. Social Features include, in particular, profiles, groups, forums, fanpages or other resources managed by the Service Provider on the respective Portal.
  2. Customer – means a party to whom, in accordance with the Policy and the law, electronic services may be provided or with whom a Sales Agreement may be concluded.
  3. Portal – the Portal should be understood as any ICT system of a social nature that belongs to the external owner of the Portal, e.g. Facebook, Instagram, TikTok, LinkedIn and other social networks and sites.
  4. Policy – means the terms and conditions of the online store operating at the domain: woseba.pl.
  5. Service Provider – means P.P.U.H. WOSEBA SPÓŁKA Z OGRANICZONĄ ODPOWIEDZIALNOŚCIĄ with its registered office in Odolanów (63-430), ul. Krotoszyńska 150, NIP (Tax Identification Number): 6220006580, REGON (National Business Registry Number): 003376160, BDO (Waste Database Number): 000063982, entered in the register of entrepreneurs kept by the District Court Poznań – Nowe Miasto and Wilda in Poznań, 9th Economic Department of the National Court Register under the KRS number 0000013901, with the share capital of PLN 150,480; email: sklep@woseba.pl, which is also the owner of the Online Platform.
  6. Parties – the Parties should be understood as the Service Provider and the User.
  7. System – the System should be understood as the Portal or the Website.
  8. Store Website – means the sites via which the Seller runs the Online Store, operating at the domain: woseba.pl.
  9. Good(s) – means a product presented by the Seller via the Store Website which may be the subject of a Sales Agreement.
  10. User Content – User Content should be understood as any data produced or provided by the User in the context of using the respective System, in particular for the purposes of using the Social Features.
  11. Sales Agreement – means a sales agreement concluded electronically under the rules specified in the Policy between the Customer and the Seller.
  12. Portal Owner – means the party to which the relevant Portal belongs.

§ 1 General provisions

User Content

  1. The User who intends to post any User Content via the available functions of the System, including the Social Features, is obliged to draft the User Content in accordance with the rules of the Polish language, in a clear and factual manner.
  2. By posting the User Content and sharing it, the User voluntarily distributes such Content. The posted Content does not reflect the views of the Seller and should not be treated as linked with the Seller’s business. The Seller is not the provider of the Content, but only the party that provides the necessary ICT resources for this purpose.
  3. A Customer declares that:
    1. the Customer is entitled to exercise author’s economic rights, industrial property rights and/or related rights in – as appropriate – creative works, objects of industrial property rights (e.g. trademarks) and/or objects of related rights, which constitute the Content;
    2. the posting and sharing of personal data, images and information regarding third parties within the posted Content is legal, voluntary and with the consent of the persons concerned;
    3. the Customer agrees that the posted Content may be viewed by other Customers and the Seller, and authorises the Seller to use the Content free of charge in accordance with this Policy;
    4. the Customer consents to the compilation of creative works within the meaning of the Act on Copyright and Related Rights.
  4. The User may not post any User Content that constitutes illegal content as defined in the Digital Services Act (DSA) or is otherwise inconsistent with the Policy or good practice, which means that it is not allowed to post any User Content that:
    1. contains links or other content that constitutes spam;
    2. includes personal data of a third party or disseminates an image of a third party without the required legal authorisation or consent of the third party;
    3. is incompatible with the theme or the available functionalities of the System – the content should relate to the subject matter of the relevant functionality, in particular the Social Feature (e.g. a community group);
    4. is of an advertising and/or promotional nature;
    5. reproduces the same User Content previously posted in the System;
    6. was posted in bad faith, e.g. with the intention of infringing personality rights of third parties;
    7. infringes any third party rights, including those related to the protection of copyright and related rights, the protection of industrial property rights, business secrets, or those related to confidentiality obligations;
    8. is abusive or threatening towards other persons, or uses obscene language (e.g. profanity or expressions which are generally regarded as offensive);
    9. constitutes advertising material for another company or product, is unrelated to the Seller’s business, or is false or misleading;
    10. otherwise violates this Policy, good practices, applicable laws, social norms or customs;
    11. concerns technical issues related to the functioning of the System; technical issues should be reported by Users electronically to the address of the Service Provider, and if using the Portal – directly to the Portal Owner;
    12. serves to carry out activities prohibited by law, e.g. attempted fraud or fraudulently obtaining funds from other Users;
    13. incites to or advocates violence against any living thing, including animals;
    14. promotes any fascist or other totalitarian state system;
    15. incites to or advocates hatred on grounds of gender, sexual orientation, nationality, ethnicity, race, religion or on grounds of irreligiousness;
    16. insults a group of people or specific individuals on grounds of their sexual, gender, national, ethnic, racial or religious affiliation or because they are irreligious;
    17. includes chauvinistic or misogynistic content, as well as sexually discriminatory content;
    18. defames or insults any third party;
    19. contains profanity or other content of an offensive nature;
    20. offends religious feelings;
    21. may cause discomfort to other Users, in particular by showing a lack of empathy or respect towards them.
  5. In the event of receipt of a notification under the Policy, the Seller reserves the right to modify or remove Content posted by Customers, in particular in relation to content which, based on reports from third parties or the relevant authorities, has been found to be in breach of this Policy or applicable law.
  6. The Customer agrees that the Seller may use, free of charge, the Content posted by the Customer on the Store Website.

§ 2 Verification of User Content

  1. The Service Provider may verify the User Content posted by Users at any time. The Service Provider carries out the verification in a non-arbitrary, objective manner and with due diligence. At the same time, the Service Provider stipulates that it is not obliged to check the User Content posted by Users in advance, in particular as part of a preventive check (e.g. by prior approval of the User Content added by Users) or in any other form of checking such User Content.
  2. The Service Provider may use automated infringement detection mechanisms as part of the functionality of the relevant System, which means that the User Content may be analysed for potential violations of the Policy by an appropriate algorithm. In the event that such an algorithm is activated, the Service Provider will provide information on the principles of its operation before the User Content is added by the User, subject to paragraphs 3 and 4 below.
  3. In the case of an account maintained in the YouTube System, before uploading a video, the Service Provider may decide whether comments can be added under that video and, if so, whether comments are to be moderated to a certain extent. The Service Provider may choose options ranging from the most liberal approach (no moderation of comments) to the most restrictive approach (moderation of all comments). By activating the option of even partial moderation of comments added by users, the Service Provider makes use of automated infringement detection mechanisms applied by the YouTube Portal Owner. In this case, the User Content may be flagged as potentially inappropriate as a result of being “picked up” by the YouTube algorithm.
  4. In the case of an account maintained in the Instagram System, the Service Provider may use the option to automatically hide certain comments and messages in the “Other” folder. The Service Provider may also use other automated infringement detection mechanisms that are applied by the Instagram Portal Owner, such as:
    1. automatic transfer of comments which may potentially violate the Policy (e.g. are offensive or constitute spam) to a separate section – after reviewing their content, the Service Provider may decide to display them;
    2. automatic filtering of comments based on specified words, phrases or emojis in messages or comments – if the algorithm detects the defined criteria, comments may be visible only to their author.
  5. If the User Content is found to be in breach of the Policy, the Service Provider may decide to:
    1. block access to such User Content in the System;
    2. hide such User Content in the System so that it becomes invisible to other Users;
    3. permanently delete such User Content from the System;
    4. report such User Content directly to the Portal Owner in order for the Portal Owner to take action.
  6. In the event of blocking, hiding, deleting or reporting the User Content, the Service Provider will immediately inform the User who posted the User Content subject to blocking, hiding, deleting or reporting, giving reasons for its decision.
  7. In the event that the User Content is blocked, hidden, deleted or reported as in breach of the Policy, the User who posted the User Content may file an appeal according to the rules described in § 5 of the Policy.
  8. The Service Provider warrants that appeals regarding the User Content will not be processed in an automated manner – it will be the responsibility of the Service Provider’s staff to verify the legitimacy of blocking, hiding, deleting or reporting the User Content.
  9. The actions of the Service Provider in relation to the User Content remain completely independent of the actions that the Portal Owner may take in relation to the same User Content. The procedures related to the verification and deletion of the User Content in the Portal will be determined by the Portal Owner at its discretion. The Service Provider has no influence on any actions taken by the Portal Owner, and therefore appeals concerning the Portal Owner’s actions addressed to the Service Provider will be left unresolved by the Service Provider.

§ 3 Reporting User Content

  1. In the event that the User Content posted by the User in the System may violate the Policy, another User or a third party may report such User Content for verification by the Service Provider. A violation may be reported:
    1. by sending a message to the Service Provider’s email address;
    2. via instant messaging on the respective Portal;
    3. by means of other functionalities available in the System.
  2. The report referred to in paragraph 1 must contain all the elements required under the Digital Services Act (DSA), including:
    1. a sufficiently substantiated explanation of the reasons why the individual or entity alleges the information in question to be in breach of the Policy;
    2. a clear indication of the exact electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content adapted to the type of content and to the specific type of hosting service;
    3. the name and email address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;
    4. a statement confirming the bona fide belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.
  3. After receipt of the notice referred to in paragraph 1, the Service Provider will immediately acknowledge receipt of the notice to the notifier – using the same means by which the notice was submitted. If the notice is incomplete or contains other errors, the Service Provider may ask the notifier to complete or correct the notice. If the notifier fails to complete or correct the notice at the latest within 14 (in words: fourteen) days of the Service Provider’s request, the notice will be left unprocessed.
  4. The User Content will be verified by the Service Provider no later than 14 (in words: fourteen) days after receipt of a complete and correct notice. The Service Provider carries out the verification in a non-arbitrary, objective manner and with due diligence. In order to verify the User Content, the Service Provider may request additional information or documents from the notifier, e.g. confirming the ownership of the rights that the verified User Content potentially infringes.
  5. During the verification process, the Service Provider may block the User Content in such a way that it becomes invisible to other Users.
  6. After verification, the Service Provider may permanently block, hide or remove the User Content as violating the Policy, or may consider that the User Content does not violate the Policy. If the User Content has previously been blocked and, after verification, it has become evident that the User Content does not violate the Policy, the Service Provider will immediately reinstate the User Content and inform the notifier, stating the reasons for its decision.
  7. In the event of blocking, hiding, deleting or reporting the User Content, the Service Provider will immediately inform the notifier and the User who posted the User Content subject to blocking, hiding, deleting or reporting, giving reasons for its decision.
  8. In the event of the application of any of the measures referred to in paragraph 6, the User against whom the measure has been applied may lodge an appeal according to the rules described in § 5.
  9. The Service Provider warrants that any appeals regarding the User Content will not be processed in an automated manner – it will be the responsibility of the Service Provider’s staff to verify the legitimacy of blocking, hiding, deleting or reporting the User Content.

§ 4 Sanctions for posting the User Content that is in breach of the Policy

  1. If the User uses the System in breach of the Policy by posting any User Content that violates the Policy, the Service Provider may:
    1. block the User’s access to their profile, group or channel;
    2. hide the User’s profile in the System so that it becomes invisible to other users;
    3. impose restrictions on the User’s account;
    4. block access to the User’s account;
    5. suspend certain functions for the User, including Social Features;
    6. permanently block certain functions for the User, including Social Features.
  2. The choice of the measure referred to in paragraph 1 will depend on the circumstances of the case and the seriousness of the violation committed by the User while using the System. These actions remain independent of any other action that the Service Provider may take in relation to the User Content, such as blocking access to the User Content or permanently deleting it.
  3. In choosing the measure referred to in paragraph 1, the Service Provider will act with due diligence, in an objective and proportionate manner, and with due regard to the rights and legitimate interests of all parties involved.
  4. The blocking of the User’s account or the suspension of certain functionalities of the System may be imposed for a definite or indefinite period of time, depending on the circumstances of the case. In the event of a blocking or suspension that has been imposed for a limited period of time, at the end of this period, the Service Provider will restore access to the functions that were blocked or suspended.
  5. In the event of the application of a measure referred to in paragraph 1, the User against whom the measure has been applied may lodge an appeal according to the rules described in § 5.
  6. The Service Provider warrants that any appeals regarding the measure referred to in paragraph 1 will not be processed in an automated manner – it will be the responsibility of the Service Provider’s staff to verify the legitimacy of application of a particular measure.

§ 5 Appeal procedure

  1. Where:
    1. the Service Provider has not blocked, hidden, deleted or reported the User Content despite a notice from another User or a third party;
    2. the User Content has been blocked, hidden, deleted or reported in breach of the Policy;
    3. the Service Provider has applied any sanctions against the User in relation to the User Content (suspension or removal of access to certain features of the System, including Social Features, etc.).
  2. In any of the cases referred to in paragraph 1, the User who has posted the relevant User Content, or the person who has reported the User Content for verification, may file an appeal.
  3. Any decision by the Service Provider related to the User Content must contain a statement of reasons that will enable filing an appeal – except where the Service Provider receives an order related to the User Content from the relevant public authority or service. The statement of reasons must meet the requirements of the Digital Services Act (DSA) and include information such as:
    1. information on whether the decision entails either the removal of, the disabling of access to, the demotion of or the restriction of the visibility of the User Content, or the suspension or termination of monetary payments related to that User Content, or imposes other measures referred to in the Policy with regard to the User Content, and, where relevant, the territorial scope of the decision and its duration;
    2. the facts and circumstances relied on in taking the decision, including, where relevant, information on whether the decision was taken pursuant to a notice submitted by another User or a third party, or based on voluntary own-initiative investigations of the Service Provider and, where strictly necessary, the identity of the notifier;
    3. where applicable, information on the use made of automated means in taking the decision, including information on whether the decision was taken in respect of the User Content detected or identified using automated means;
    4. where the decision concerns allegedly illegal User Content, a reference to the legal or contractual ground relied on and explanations as to why the User Content is considered to be prohibited on that ground;
    5. clear and User-friendly information on the possibilities for appeal available to the User or the notifier in respect of the decision.
  4. An appeal can be made by sending a completed form:
    1. to the Service Provider’s email address;
    2. via the functionalities available in the System;
    3. in writing, preferably by registered mail, to the Service Provider’s registered address.
  5. The appeal should include:
    1. first and last name (or business name) of the appellant;
    2. contact details;
    3. detailed reasons why, in the appellant’s view, the Service Provider’s decision was wrong and should be reversed.
  6. After receipt of the appeal, the Service Provider will immediately acknowledge its receipt – electronically to the email address provided. If the appeal is incomplete or contains other errors, the Service Provider may ask the appellant to complete or correct the appeal. If the appellant fails to complete or correct the appeal at the latest within 14 (in words: fourteen) days of the Service Provider’s request, the appeal will be left unprocessed.
  7. Appeals will be processed within 14 (in words: fourteen) days from the date of submitting the appeal.
  8. If the appeal is rejected by the Service Provider and the appellant submits a further appeal on the same issue and based on the same facts, such subsequent appeals will be left unprocessed.

§ 6 Personal data protection

  1. The principles for the protection of Personal Data are set out in the Privacy Policy.

§ 7 Final provisions

  1. If the Service Provider becomes aware of credible information about the possibility of a criminal offence or misdemeanour being committed by the User, the Service Provider is entitled and obliged to notify the relevant public authorities or services, as well as to provide them with data concerning the User. The same applies if the public authorities or services request the Service Provider to provide access to the User’s data, in particular for the purposes of civil or criminal proceedings.
  2. The Service Provider is not liable for the User Content posted in the System, on condition that the Service Provider:
    1. has no actual knowledge of illegal activities or illegal User Content and, with respect to claims for damages, is not aware of the facts or circumstances which clearly point to illegal activities or illegal User Content; or
    2. takes prompt and appropriate action to remove or block access to illegal User Content when the Service Provider becomes aware of it.
  3. The DSA Policy is subject to change. If the User is registered on the Website, they will receive an email informing them of any amendments to the Policy at least 7 days prior to the planned effective date of the Policy amendments.
  4. Events occurring before an amendment to the Policy are governed by the Policy in force at the time of the event.
  5. If the User does not accept the amended Policy, they should refrain from using the Systems to which the Policy applies.
  6. Amendments to the Policy do not affect the rights acquired by the User prior to the effective date of such amendments.
  7. Date of the last modification: 27.09.2024