Tag: USA

Montana, Tennessee join Indiana and Iowa as next States to pass comprehensive data protection laws

28. April 2023

Montana and Tennessee have both passed comprehensive bills in their state legislatures on April 21st, making them the latest additions to the states that have enacted privacy laws this year, alongside Indiana and Iowa.

Iowa Data Privacy Act (IDPA)

Iowa joined Connecticut, Utah, Virginia, Colorado, and California on March 29th as the sixth state to approve a comprehensive privacy law. The law will become effective on January 1st, 2025, which provides organizations with 21 months to meet the new requirements. Even though the law shares several similarities with other state privacy laws, organizations need to pay attention to a few distinctions as they broaden their compliance efforts across the United States.

The Iowa Data Privacy Act (IDPA) applies to businesses that operate in Iowa or target products or services to Iowa consumers and control or process personal data of 100,000 or more Iowa consumers or 25,000 or more Iowa consumers and derive over 50% of gross revenue from the sale of that data. The IDPA’s definition of a “consumer” includes natural persons who are Iowa residents acting in a personal (noncommercial and nonemployment) context, and excludes employees and B2B contacts. The IDPA imposes obligations on data controllers, such as limiting the purpose of processing personal data, implementing reasonable safeguards, refraining from discrimination, being transparent in their privacy notice, and ensuring contracts control relationships with their processors. It provides Iowa consumers with opt-out, deletion, access, appeal, and data portability rights. Sensitive personal information includes racial/ethnic origin, religious beliefs, and geolocation data, among others, and controllers must provide clear notice and an opportunity to opt-out of nonexempt processing.
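The IDPA's applicability test combines a headcount threshold with a revenue-based alternative. As a rough illustration only (the function name and inputs are hypothetical, not drawn from the statute), the threshold logic for a business that operates in Iowa or targets Iowa consumers might be sketched as:

```python
def idpa_applies(iowa_consumers: int, pct_revenue_from_data_sales: float) -> bool:
    """Illustrative sketch of the IDPA applicability thresholds.

    A covered business falls under the law if it controls or processes
    personal data of 100,000 or more Iowa consumers, OR of 25,000 or more
    Iowa consumers while deriving over 50% of gross revenue from the sale
    of that data.
    """
    large_scale = iowa_consumers >= 100_000
    data_broker_like = iowa_consumers >= 25_000 and pct_revenue_from_data_sales > 50.0
    return large_scale or data_broker_like
```

For instance, under this sketch a business processing data of 30,000 Iowa consumers that earns 60% of its gross revenue from selling that data would be covered, while one with the same consumer count but only 40% such revenue would not.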

The Iowa Attorney General has exclusive enforcement authority, and the IDPA does not allow for a private right of action.

Indiana Bill on Consumer data protection

Indiana is set to become the seventh state to enact a comprehensive privacy law when Senate Bill No. 5 is signed by Governor Eric Holcomb. The law goes into effect on January 1, 2026.

The Indiana privacy law applies to businesses that process the personal data of at least 100,000 Indiana residents or 25,000 Indiana residents and generate more than 50% of their gross revenue from the sale of personal data. Certain entities and data are exempt from the law. The law requires businesses to provide consumers with a clear and meaningful privacy notice and gives consumers the right to confirm, access, correct, delete, and port their personal data. Consumers can also opt-out of the processing of their personal data for targeted advertising, sale of personal data, or profiling that produces significant effects. There is no private right of action, and businesses have a 30-day cure period for any alleged violations. The Indiana privacy law is similar to other comprehensive state privacy laws, such as the Virginia Consumer Data Protection Act.

Montana Consumer Data Privacy Act (MCDPA)

After passing both houses of the Montana legislature, the Montana Consumer Data Privacy Act (MCDPA) now awaits Governor Greg Gianforte’s signature. The MCDPA is similar to the laws in Connecticut and Virginia, suggesting that these models are becoming the foundation for other state privacy laws concerning consumers.

The MCDPA applies to companies that do business in Montana and control or process personal data of 50,000 or more Montana consumers, or of 25,000 or more Montana consumers while deriving over 25% of gross revenue from the sale of that data. “Consumer” is defined as a natural person who is a resident of Montana acting in a personal context. Personal data includes information that is linked or reasonably linkable to an identified or identifiable individual. Sensitive data includes information about a person’s race/ethnic origin, religion, health diagnosis, sex life, sexual orientation, citizenship, immigration status, and genetic or biometric information. Companies must provide a standard set of consumer rights, including opt-out rights related to the sale of personal data, deletion rights, access rights, correction rights, appeal rights, opt-in rights for advertising and targeted marketing to individuals aged 13 to 16, and data portability rights. Sensitive data cannot be processed without obtaining the consumer’s consent or, in the case of a child, complying with COPPA. The MCDPA requires controllers to limit the purpose of processing personal data to what is reasonably necessary and proportional, implement reasonable safeguards for the personal data within their control, refrain from discriminating against consumers for exercising their rights, and be transparent in their privacy notice.

The Montana Attorney General has exclusive enforcement authority, and there is no private right of action. The MCDPA will go into effect on October 1, 2024.

Tennessee Information Privacy Act (TIPA)

If Governor Bill Lee approves, Tennessee will soon join the states with comprehensive privacy laws with the implementation of the Tennessee Information Privacy Act (TIPA). The TIPA largely follows the model of California’s CCPA, but with one notable exception.

TIPA applies to companies doing business in or targeting products or services to Tennessee residents that process the personal information of at least 100,000 consumers, or of at least 25,000 consumers while deriving more than 50% of their gross revenues from the sale of personal information. Compliance with CCPA obligations will likely result in compliance with TIPA, subject to TIPA’s additional obligations with respect to the NIST Privacy Framework, which is organized around functions for identifying, governing, controlling, communicating and protecting against privacy risks. Failure to comply with TIPA may result in penalties of up to $15,000 per violation, enforced by the Tennessee Attorney General.

Outlook

Several states are currently working on passing their own comprehensive consumer privacy bills this year, and there are also plans for more specialized privacy laws. For example, there are proposed laws focused on children, social media (such as Utah’s Social Media Regulation Act), and health information not covered by HIPAA (such as Washington’s My Health My Data Act). In addition, there is also the draft legislation for a comprehensive data protection law at the federal level.

FCC proposes updated data breach reporting requirements

10. January 2023

In the first week of January 2023, the Federal Communications Commission unanimously (4-0) adopted a Notice of Proposed Rulemaking aimed at strengthening the Commission’s rules for notifying customers and federal law enforcement agencies of breaches of customer proprietary network information (CPNI).

The proposal follows the wave of new legislation on the right to privacy and personal data protection at both the state and federal levels across the U.S.

One of the most significant proposals in the document is the elimination of the current mandatory seven-day waiting period for notifying customers of a data breach.

The FCC’s Chairwoman, Jessica Rosenworcel, stated that the rules in place until now need to be updated. The FCC will open a formal comment phase to gather more information on how to implement the proposed changes and will also take into account statements made by the FCC Commissioners.

Category: Data Protection · USA

Danish watchdogs ban Google Chromebooks and Google Workspace in municipality

26. August 2022

In July 2022, after an investigation related to a data breach by the Danish Data Protection Authority (Datatilsynet), Google Chromebooks and Google Workspace were banned in schools in the municipality of Helsingør. The DPA ruled that the risk assessment carried out by city officials showed that Google’s processing of personal data does not meet GDPR requirements. In particular, data transfers were targeted by the Authority: the Data Processing Agreement allows data transfers to third countries for analytical and statistical support, even though the data are primarily stored in Google’s European facilities.

This decision comes at a moment of tension between Europe and the United States over personal data: other notable cases (some still ongoing) include the Irish Data Protection Authority vs. Facebook (now part of Meta) and the German Federal Cartel Office vs. Facebook. European watchdogs have found that in many cases the American tech giants’ policies do not meet the requirements established by the GDPR. This can be traced back to the lack of a legal framework for privacy and personal data protection in the United States, where these companies are based.

This decision was taken in the aftermath of the Schrems II ruling by the European Court of Justice, which held that the pre-existing agreement on data transfers between Europe and the US (the so-called Privacy Shield) was not compatible with the GDPR. A new deal is on the table, but it has not yet been approved and is not yet effective.

Google has become the target of various investigations by European data watchdogs, above all because of its tool Google Analytics. In January the Austrian Data Protection Authority published a decision stating that companies using Google Analytics inadvertently transferred customers’ personal data, such as IP addresses, to the United States in breach of the GDPR. Italy’s Garante per la Protezione dei Dati Personali published a similar opinion a few weeks later, stating that “the current methods adopted by Google do not guarantee an adequate level of protection of personal data”.

Personal data risks in the aftermath of the overturning of Roe vs. Wade

23. August 2022

At the end of June 2022, the United States Supreme Court overturned its 1973 ruling in Roe vs. Wade, effectively ending federal abortion rights. The decision caused worldwide outrage, and a concerning situation now presents itself: the population’s massive use of social media and the Internet could result in serious privacy violations by the authorities. For example, tech giants such as Apple, Google and Meta could share users’ data if law enforcement authorities suspect a felony is being committed. This could especially be the case in those states that chose to make abortion illegal after the Supreme Court’s ruling. Under Rule 45 of the United States Federal Rules of Civil Procedure, this kind of personal data can be made the object of a subpoena, forcing the subject to produce it in court. In such a scenario, tech companies would have no choice but to provide the consumer’s data. It is clear that this poses a high risk to consumers’ privacy.

In particular, location data could show whether a person visited an abortion clinic. Many women use specific apps to track periods, fertility and pregnancy. All these data could be put under surveillance and seized by law enforcement in order to investigate and prosecute abortion-related cases.

This has already happened in some states. In 2018, a woman in Mississippi was charged with second-degree murder after seeking health care for a pregnancy loss that occurred at home. Prosecutors produced her Internet browser history as evidence. Two years later she was acquitted of the charges.

Another risk is posed by so-called data brokers: companies that harvest data, cleanse or analyze it, and sell it to the highest bidder. These companies could also be used by law enforcement agencies to arbitrarily investigate people who might be connected to abortion cases.

The lack of legislation on personal data protection is a serious issue in the United States. For example, there is no principle of data minimization as found in the GDPR. The Supreme Court’s ruling makes this historical moment unexplored territory from a legal point of view. Privacy advisors and activists recommend limiting the digital footprint users leave on the web. In addition, new laws and bills could be introduced to limit the access law enforcement agencies have to personal data.

Twitter fined $150m for handing users’ contact details to advertisers

30. May 2022

Twitter has been fined $150 million by U.S. authorities after the company collected users’ email addresses and phone numbers for security reasons and then used the data for targeted advertising. 

According to a settlement with the U.S. Department of Justice and the Federal Trade Commission, the social media platform had told users that the information would be used to keep their accounts secure. “While Twitter represented to users that it collected their telephone numbers and email addresses to secure their accounts, Twitter failed to disclose that it also used user contact information to aid advertisers in reaching their preferred audiences,” said a court complaint filed by the DoJ. 

As stated in the court documents, the breaches occurred between May 2013 and September 2019, and the information was ostensibly collected for purposes such as two-factor authentication. However, in addition to the above-mentioned purposes, Twitter used that data to allow advertisers to target specific groups of users by matching phone numbers and email addresses with advertisers’ own lists.

In addition to the financial penalty, the settlement requires Twitter to improve its compliance practices. According to the complaint, the false disclosures violated the FTC Act and a 2011 settlement with the agency.

Twitter’s chief privacy officer, Damien Kieran, said in a statement that the company has “cooperated with the FTC at every step of the way.” 

“In reaching this settlement, we have paid a $150m penalty, and we have aligned with the agency on operational updates and program enhancements to ensure that people’s personal data remains secure, and their privacy protected,” he added. 

Twitter generates 90 percent of its $5 billion (£3.8 billion) in annual revenue from advertising.  

The complaint also alleges that Twitter falsely claimed to comply with the EU-U.S. and Swiss-U.S. Privacy Shield frameworks, which prohibit companies from using data in ways that consumers have not approved of.

The settlement with Twitter follows years of controversy over tech companies’ privacy practices. Revelations in 2018 that Facebook, the world’s largest social network, used phone numbers provided for two-factor authentication for advertising purposes enraged privacy advocates. Facebook, now Meta, also settled the matter with the FTC as part of a $5 billion settlement in 2019. 

 

European Commission and United States agree in principle on Trans-Atlantic Data Privacy Framework

29. March 2022

On March 25th, 2022, the United States and the European Commission committed to a new Trans-Atlantic Data Privacy Framework that aims to take the place of the previous Privacy Shield framework.

The White House stated that the Trans-Atlantic Data Privacy Framework “will foster trans-Atlantic data flows and address the concerns raised by the Court of Justice of the European Union when it struck down in 2020 the Commission’s adequacy decision underlying the EU-US Privacy Shield framework”.

According to the joint statement of the US and the European Commission, “under the Trans-Atlantic Data Privacy Framework, the United States is to put in place new safeguards to ensure that signals surveillance activities are necessary and proportionate in the pursuit of defined national security objectives, establish a two-level independent redress mechanism with binding authority to direct remedial measures, and enhance rigorous and layered oversight of signals intelligence activities to ensure compliance with limitations on surveillance activities”.

This new Trans-Atlantic Data Privacy Framework has been a strenuous work in the making and reflects more than a year of detailed negotiations between the US and EU led by Secretary of Commerce Gina Raimondo and Commissioner for Justice Didier Reynders.

It is hoped that this new framework will provide a durable basis for data flows between the EU and the US and underscore the shared commitment to privacy, data protection, the rule of law, and collective security.

Like the Privacy Shield before it, the new framework will rely on self-certification with the US Department of Commerce. It will therefore be crucial for data exporters in the EU to ensure that their data importers are certified under the new framework.

A newly established “Data Protection Review Court” will be responsible for the second tier of the new two-tier redress system, which will allow EU citizens to raise complaints about access to their data by US intelligence authorities and is intended to investigate and resolve those complaints.

The US’ commitments will be implemented by an Executive Order, which will form the basis of the adequacy decision by the European Commission to put the new framework in place. While this represents a quicker route to the goal, it also means the commitments can be easily repealed by a future US administration. It therefore remains to be seen whether this new framework, so far only agreed upon in principle, will bring the much hoped-for closure on the topic of trans-Atlantic data flows that it is intended to bring.

CNIL judges use of Google Analytics illegal

14. February 2022

On 10th February 2022, the French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), pronounced the use of Google Analytics on European websites not to be in line with the requirements of the General Data Protection Regulation (GDPR) and ordered a website owner to comply with the requirements of the GDPR within a month’s time.

The CNIL issued this decision in response to several complaints made by the NOYB association concerning the transfer to the USA of personal data collected during visits to websites using Google Analytics. All in all, NOYB filed 101 complaints against data controllers allegedly transferring personal data to the USA, covering all 27 EU Member States and the three further states of the European Economic Area (EEA).

Only two weeks ago, the Austrian Data Protection Authority (ADPA) made a similar decision, stating that the use of Google Analytics was in violation of the GDPR.

Regarding the French decision, the CNIL concluded that transfers to the United States are currently not sufficiently regulated. In the absence of an adequacy decision concerning transfers to the USA, the transfer of data can only take place if appropriate guarantees are provided for this data flow. However, while Google has adopted additional measures to regulate data transfers in the context of the Google Analytics functionality, the CNIL deemed that those measures are not sufficient to exclude the accessibility of the personal data for US intelligence services. This would result in “a risk for French website users who use this service and whose data is exported”.

The CNIL therefore stated that “the data of Internet users is thus transferred to the United States in violation of Articles 44 et seq. of the GDPR”. It ordered the website manager to bring this processing into compliance with the GDPR, if necessary by ceasing to use the Google Analytics functionality (under the current conditions) or by using a tool that does not involve a transfer outside the EU. The website operator in question has one month to comply.

The CNIL has also given advice regarding website audience measurement and analysis services. For these purposes, the CNIL recommends that such tools be used only to produce anonymous statistical data. This allows for an exemption from the consent requirements, as aggregated data is not considered “personal” data and therefore does not fall under the scope of the GDPR, provided the data controller ensures that there are no illegal transfers.

(Update) Processing of COVID-19 immunization data of employees in non-EEA countries

21. January 2022

With COVID-19 vaccination campaigns well under way, employers are faced with the question of whether they are legally permitted to ask employees about their COVID-19 related information and, if so, how that information may be used.

COVID-19 related information, such as vaccination status, whether an employee has recovered from an infection or whether an employee is infected with COVID-19, is considered health data. This type of data is considered particularly sensitive in most data protection regimes and may only be processed under strict conditions. Art. 9 (1) General Data Protection Regulation (GDPR) (EU), Art. 9 (1) UK-GDPR (UK), Art. 5 (II) General Personal Data Protection Law (LGPD) (Brazil) and Sec. 1798.140 (b) California Consumer Privacy Act of 2018 (CCPA) (California) all consider health-related information sensitive personal data. However, the question of whether COVID-19-related data may be processed by an employer is evaluated differently, even in the context of the same data protection regime such as the GDPR.

Below, we discuss whether employers in different non-EEA countries are permitted to process COVID-19-related data about their employees.

Brazil: According to the Labor Code (CLT), employers in Brazil have the right to require their employees to be vaccinated. The employer is responsible for the health and safety of its employees in the workplace and therefore has the right to take reasonable measures to ensure health and safety in the workplace. Since employers can require their employees to be vaccinated, they can also require proof of vaccination. As LGPD considers this information to be sensitive personal data, special care must be taken in processing it.

Hong Kong: An employer may require its employees to disclose their immunization status. Under the Occupational Safety and Health Ordinance (OSHO), employers are required to take all reasonably practicable measures to ensure the safety and health of all their employees in the workplace. Vaccination may be considered in COVID-19 risk assessments as a possible additional measure to mitigate the risks associated with infection in the workplace. The requirement for vaccination must be lawful and reasonable. Employers may decide, following such a risk assessment, that a vaccinated workforce is necessary and appropriate to mitigate the risk. In this case, the employer must comply with the Personal Data (Privacy) Ordinance (PDPO). Among other things, the PDPO requires that the collection of data be necessary for the purpose for which it is collected and that the data not be kept longer than is necessary for that purpose. Under the PDPO, before collecting data, the employer must inform the employee whether the collection is mandatory or voluntary and, if mandatory, what the consequences are for the employee of not providing the data.

Russia: Employers must verify which employees have been vaccinated and record this information if such vaccinations are required by law. If a vaccination is not required by law, the employer may require this information, but employees have the right not to provide it. If the information on vaccinations is provided on a voluntary basis, the employer may keep it in the employee’s file, provided that the employee consents in writing to the processing of the personal data. An employer may impose mandatory vaccination if an employee performs an activity involving a high risk of infection (e.g. employees in educational institutions, organizations working with infected patients, laboratories working with live cultures of pathogens of infectious diseases or with human blood and body fluids, etc.) and a corresponding vaccination is listed in the national calendar of protective vaccinations for epidemic indications. All these cases are listed in the Decree of the Government of the Russian Federation dated July 15, 1999, No. 825.

UK: An employer may inquire about an employee’s vaccination status or conduct tests on employees if it is proportionate and necessary for the employer to comply with its legal obligation to ensure health and safety at work. The employer must be able to demonstrate that the processing of this information is necessary for compliance with its health and safety obligations under employment law, Art. 9 (2) (b) UK GDPR. The employer must also conduct a data protection impact assessment to evaluate the necessity of the data collection and balance that necessity against the employee’s right to privacy. A policy for the collection and retention of such data is also required. The information must be retained only as long as it is needed. There must also be no risk of unlawful discrimination, e.g. where the reason for refusing vaccination is a characteristic protected under the Equality Act 2010.

In England, mandatory vaccination is in place for staff in care homes, and from April 2022, this will also apply to staff with patient contact in the National Health Service (NHS). Other parts of the UK have not yet introduced such rules.

USA: The Equal Employment Opportunity Commission (EEOC) published guidance stating that an employer may implement a vaccination policy as a condition of physically returning to the workplace. Before implementing a vaccination requirement, an employer should consider whether there are any relevant state laws or regulations that might change the requirements for such a provision. If an employer asks an unvaccinated employee why he or she has not been vaccinated or does not want to be vaccinated, such questions may elicit information about a disability and would therefore fall under the standard for disability-related inquiries. Because immunization records are personally identifiable information about an employee, the information must be recorded, handled, and stored as confidential medical information. If an employer administers the vaccine to its employees itself or contracts with a third party to do so, it must demonstrate that the screening questions are “job-related and consistent with business necessity.”

On November 5th, 2021, the U.S. Occupational Safety and Health Administration (OSHA) released an emergency temporary standard (ETS) requiring affected employers to take action on COVID-19 safety, either by adopting a policy requiring full COVID-19 vaccination of employees or by giving employees the choice between being vaccinated against COVID-19 or undergoing regular COVID-19 testing and wearing a face covering. On November 12th, 2021, the court of appeals suspended enforcement of the ETS pending a decision on a permanent injunction. While this suspension is pending, OSHA cannot take any steps to implement or enforce the ETS.

In the US there are a number of different state and federal workplace safety, employment, and privacy laws that provide diverging requirements on processing COVID-19 related information.

US court unsuccessfully demanded extensive information about user of the messenger app Signal

16. November 2021

On October 27th, 2021 Signal published a search warrant for user data issued by a court in Santa Clara, California. The court ordered Signal to provide a variety of information, including a user’s name, address, correspondence, contacts, groups, and call records from the years 2019 and 2020. Signal was only able to provide two sets of data: the timestamp of when the account was created and the date of the last connection to the Signal server, as Signal does not store any other information about its users.

The warrant also included a confidentiality order that was extended four times. Signal stated:

Though the judge approved four consecutive non-disclosure orders, the court never acknowledged receipt of our motion to partially unseal, nor scheduled a hearing, and would not return counsel’s phone calls seeking to schedule a hearing.

A similar case was made public by Signal in 2016, when a court in Virginia requested the release of user data and ordered that the request not be made public. Signal fought the non-publication order in court and eventually won.

Signal is a messenger app that is highly regarded among privacy experts such as Edward Snowden. That is because Signal has used end-to-end encryption by default from the start, does not ask its users for personal information or store personal data on its servers, and is open source. The messenger is therefore considered particularly secure and trustworthy. Moreover, no security vulnerabilities in Signal have become known so far, which cannot be said of numerous competing products.

Since 2018, Signal has been operated by the non-profit Signal Technology Foundation and Signal Messenger LLC. At that time, WhatsApp co-founder Brian Acton, among others, joined the company and invested $50 million. Signal founder Moxie Marlinspike is also still on board.

The EU Commission is planning a legislative package to fight the spread of child abuse material on the Internet. The law would also include automated searches of the content of private and encrypted communications, for example via messenger apps. This would undermine the core functions of Signal in Europe. Critics call this form of preventive mass surveillance a threat to privacy, IT security, freedom of expression and democracy.

Colorado Privacy Act officially enacted into Law

14. July 2021

On July 8, 2021, the state of Colorado officially enacted the Colorado Privacy Act (CPA), which makes it the third state to have a comprehensive data privacy law, following California and Virginia. The Act will go into effect on July 1, 2023, with some specific provisions going into effect at later dates.

The CPA shares many similarities with the California Consumer Privacy Act (CCPA) and the Virginia Consumer Data Protection Act (CDPA), without introducing any brand-new concepts. However, there are also differences. For example, the CPA applies to controllers that conduct business in Colorado or target Colorado residents with their business and that control or process the data of 100,000 or more consumers in a calendar year, or that derive revenue from the sale of personal data while controlling or processing the data of 25,000 or more consumers. It is therefore broader than the CDPA and, unlike the CCPA, does not include revenue thresholds.

Similar to the CDPA, the CPA defines a consumer as “a Colorado resident acting only in an individual or household context” and explicitly omits individuals acting in “a commercial or employment context, as a job applicant, or as a beneficiary of someone acting in an employment context”. As a result, controllers do not need to consider the employee personal data they collect and process in the application of the CPA.

The CPA further defines “the sale of personal information” as “the exchange of personal data for monetary or other valuable consideration by a controller to a third party”. Importantly, the definition of “sale” explicitly excludes certain types of disclosures, as is the case in the CDPA, such as:

  • Disclosures to a processor that processes the personal data on behalf of a controller;
  • Disclosures of personal data to a third party for purposes of providing a product or service requested by the consumer;
  • Disclosures or transfers of personal data to an affiliate of the controller;
  • Disclosure or transfer to a third party of personal data as an asset that is part of a proposed or actual merger, acquisition, bankruptcy, or other transaction in which the third party assumes control of all or part of the controller’s assets;
  • Disclosures of personal data that a consumer directs the controller to disclose or intentionally discloses by using the controller to interact with a third party, or that the consumer intentionally made available to the general public via a channel of mass media.

The CPA provides five main consumer rights: the right of access, the right of correction, the right of deletion, the right to data portability and the right to opt out. For the latter, the procedure differs from the other laws: the CPA mandates that a controller provide consumers with the right to opt out as well as a universal opt-out option, so that a consumer can click one button to exercise all opt-out rights.
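Universal opt-out signals of this kind are typically transmitted by the browser; the Global Privacy Control (GPC) specification, for instance, uses the HTTP request header `Sec-GPC: 1`. As a rough illustration (the function and its wiring are assumptions for this sketch, not drawn from the CPA's text), a controller-side check might look like:

```python
def honors_universal_opt_out(headers: dict) -> bool:
    """Illustrative check for a browser-sent universal opt-out signal.

    The Global Privacy Control specification transmits the signal as the
    HTTP request header "Sec-GPC: 1". A controller honoring a universal
    opt-out mechanism would treat its presence as an opt-out of the sale
    of personal data and of targeted advertising.
    """
    # Header names are case-insensitive in HTTP; normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

In practice such a check would sit in the web framework's request middleware and gate any downstream sharing of the visitor's data with advertising partners.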

In addition, the CPA provides the consumer with a right to appeal a business’s refusal to take action within a reasonable time period.

The CPA differentiates between controller and processor in a similar way that the European General Data Protection Regulation (GDPR) does and follows, to an extent, similar basic principles such as duty of transparency, duty of purpose specification, duty of data minimization, duty of care and duty to avoid secondary use. In addition, it follows the principle of duty to avoid unlawful discrimination, which prohibits controllers from processing personal data in violation of state or federal laws that prohibit discrimination.
