Children's Safeguarding Digital Co-Design Report 2023

Dr Sarah Carlick • Mar 30, 2023

IAA Report March 2023

A digital safeguarding approach aims to use tech for good, co-designed with users. In the present project we seek to develop innovative ways in which children and young people can digitally and safely self-refer, share information with multi-agency professionals and communicate harms in real time, transforming traditional help and support to reduce the risk of harm to children and young people. This part of the project reports on the views of senior leaders in safeguarding within local authorities towards digital safeguarding. The public sector has statutory obligations and, along with private sector organisations, offers protective services, but there is general agreement that it lags behind in optimising technological advancements for information sharing and communicating directly with children and young people. There is a notable lack of innovation in the application of creative forms of technology, particularly in terms of safeguarding, as the key findings below illustrate.

Ways forward

  • Clear frameworks for safeguarding digital policy and procedures applied locally could drive significant change and a culture shift.
  • Targeted funding and investment, under a dedicated scheme such as the Safety Tech Challenge Fund, is needed in both the public and private sectors to encourage innovation and growth in the availability and application of digital safeguarding tools that enable digital access.
  • Innovation will come from the public, private and community sectors engaging in co-design. It must involve children and young people who can share lived experience and structure the architecture for new technology.
  • Ensure workforce upskilling by including technical agility and skills in pre- and post-qualifying social work training.

Key findings

  • Policy – Progress towards digital safeguarding requires a policy framework at national and local level, which is currently lacking. Policies primarily focus on staying safe online (children and staff) and the use of case management systems. There are no existing statutory frameworks for doing safeguarding digitally.
  • Children’s co-design – Several initiatives engage and consult with children and young people, particularly looked-after young people. There was little evidence of child-initiated, co-designed policy and practice in digital safeguarding.
  • Digital safeguarding initiatives – Senior leaders within local authorities are enthusiastic and excited about the potential for digital safeguarding. Post-pandemic there is some evidence of online case monitoring continuing, but no local authorities have extended safeguarding practices to integrate digital access for children and families.
  • Barriers to implementing digital safeguarding – Respondents emphasised that changes to culture and working practices will take time. They require: developing a digitally resilient workforce that works outside normal office hours; bridging digital divides to bring social and digital agendas together; case management systems that are agile, connecting to developing technology that allows easy self-service engagement through parent and/or child portals; and thereafter capturing such data into new and emerging data sets.
  • Leadership and strategy – A dominant theme was the need for leadership and strategy from a national perspective. Alongside this were operational restrictions stemming from a lack of knowledge and skills in the workforce, a lack of resources, and limits imposed by the architecture and governance of case management systems.
See the link below this post, where you can download the full report.

by Dr Sarah Carlick 23 Jul, 2024
A relatively new concept, the term “digital safeguarding” tends to make most people think of online safety, digital safety strategy, or the risks of the internet. However, this is only a small part of a wider, more meaningful view of what technology can bring to the safeguarding landscape. Digital safeguarding is about using co-design research methods with children, adults, volunteers and/or beneficiaries to create innovative technology for self-referrals to services, sharing of information, and managing risks. This approach gives service users who are children and young people an equal voice, recognising them as digital natives and appreciating the changing technological environments of our lived experiences today. It is also about ensuring organisations are able to offer digital access to their services, as well as having processes and policies as a framework to optimise digital security and safety. Communicating digitally in real time harnesses both digital and traditional early help and support to reduce the risk of harm.

What digital safeguarding embraces is user-friendly, user-led approaches that respect children, young people, vulnerable adults and their families or carers as equal partners in design ideas, product development and product usability. Such co-production and engagement should extend to the design of online forms, to how systems and platforms are used by organisations and service users, and to how and what information is shared. The aim is to enhance accessibility to services for those not already known to the charity, and to increase the sharing of information by offering digital services to service users who are known to the organisation. My research suggests that if service users such as children had been part of the original agreement for, and design of, digital platforms, this would increase their safety and the safety elements of those platforms.

Tech for good

Tech for good already exists within the voluntary sector; for example, as part of organisations’ work on sustainability, fundraising, environmental challenges, and mental health and wellbeing. In this article I am introducing a comparison with these in terms of safeguarding, early access to help, and protection. In this context we could include integration of artificial intelligence, chatbots, access to the latest devices and equipment, the use of applications for interventions or easy access to policies, and case management systems that offer service users safe digital self-referrals. Technology could enable service users to see, amend and contribute to records, know their service history, build trusting relationships and feed back on the services or information they receive (a simple sketch of this idea appears below). In response, practitioners, key workers and volunteers must be consistently flexible and creative when working with users, and systems need to be adaptable.

Digital safeguarding is a conceptual framework that sits alongside traditional safeguarding. Although there is a range of statutory legislation and guidance for safeguarding children and adults at risk, technology and digital practice are not yet prescribed within legislation or Charity Commission guidance. I believe it is only a matter of time before this changes, as we must catch up with the technological world of today.
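As a rough illustration of the earlier point about service users being able to see, amend and contribute to their own records, the sketch below shows one way such a shared record could be structured. It is an assumption for illustration only, not a description of any existing case management system; names and fields are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class RecordEntry:
    """One entry in a shared case record: who wrote it and what they said."""
    author: str          # e.g. "service user", "key worker", "volunteer"
    text: str
    created_at: datetime = field(default_factory=datetime.now)

@dataclass
class SharedCaseRecord:
    """Illustrative sketch of a case record the service user can read and add to.

    A real system would add authentication, auditing and data-protection controls.
    """
    service_user: str
    entries: List[RecordEntry] = field(default_factory=list)

    def contribute(self, author: str, text: str) -> None:
        """Append a new entry; nothing is edited or deleted in place."""
        self.entries.append(RecordEntry(author=author, text=text))

    def service_history(self) -> List[str]:
        """Return the record as readable lines, oldest first."""
        return [f"{e.created_at:%Y-%m-%d} ({e.author}): {e.text}"
                for e in sorted(self.entries, key=lambda e: e.created_at)]
```

The point of the sketch is simply that the service user appears as an author in their own record, rather than only as its subject.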
Working Together to Safeguard Children (2023), the government guidance on multi-agency working to help, protect and promote the welfare of children, does reference the need to co-design with children, and within the scope of supporting vulnerable adults there is a vast range of technology-assisted aids. But there are challenges and barriers to adoption and implementation at both a national and an organisational level.

Barriers to adoption

Recent research has reinforced the lag in workforce digital capacity, with claims that the workforce is not sufficiently digitally skilled and that current case management systems are not able to adapt to the ways that children and young people communicate. Strong leadership and strategic direction sit with trustees and chief executives, and with their wider organisation’s strategy and business planning teams. Digital safeguarding must have a clear governance process that guides the development of technical and digital products that meet the needs of a charity’s service users or beneficiaries.

Increased use of digital communications was inevitable during the pandemic, even if this presented challenges for organisations. Some were quick to adapt, but the tools used were generally limited to familiar platforms such as Zoom, Microsoft Teams and WhatsApp, as there is little availability of dedicated safeguarding tools.

Key considerations

There are several key questions and considerations for charities working with children and other potentially vulnerable service users:

  • Are your staff and volunteers confident in their digital life in the workplace? What is your technology offer for service users?
  • Does your organisation have a digital safeguarding vision and/or policy?
  • Does there need to be more awareness of digital safeguarding within the organisation?
  • Do you audit the digital footprint of the organisation and understand where digital safeguarding could improve services and access to services?
  • What is the digital capacity of your staff and volunteers, and how can they access digital skills training? How is this integrated into training plans?
  • Would your charity benefit from having its safeguarding policy and procedures in a more accessible format, such as a downloadable app?
  • Do you accept digital self-referrals to your service(s)?
  • Are service users or beneficiaries able to submit forms digitally, or to communicate with staff digitally? Traditionally this could be via Zoom, Teams, etc, but is there a more innovative way of using tech for good?
  • Is there a commitment to co-design methods for your internal digital systems?
  • Is there a range of digital formats or digital self-service portals that service users, volunteers or beneficiaries can use to digitally share their own information with the organisation?

There is a need for the voluntary sector to link funding opportunities with new digital innovations for safeguarding. A number of organisations fund digital transformation projects, including the National Lottery, Nominet, Catalyst, the Social Tech Trust and Bethnal Green Ventures.

Developing a strategy

Existing online communication tools – whether video meeting platforms or social media groups – have limited use in a safeguarding context. A digital and technology safeguarding strategy goes further than these traditional tools, and further than simply a commitment to digital innovation. A strategy gives you and your organisation clear visibility of all the technological and digital activity across the charity.
When this is achieved, staff, volunteers and service users have the digital tools they need and everyone understands the process of designing and implementing new safeguarding technological and digital solutions. The benefits of a digital and technology safeguarding strategy are as follows:

  • It brings together a charity’s safer practices and digital practices.
  • It enables the organisation to reframe its use of language based on the digital lived experiences of its volunteers and service users, which can help in managing risk.
  • Technology can be an enabler for delivery of services.
  • It allows service users to feel heard in a non-stigmatised way throughout their journey.
  • It increases accessibility to services, especially for less engaged groups.
  • It creates and empowers digital communities, increasing the likelihood of early support.
  • It promotes co-design as a fundamental thread through IT systems and digital services, such as case management systems that are designed for all users and allow service users to access their digital records and/or to contribute to them digitally.
  • It expands areas of digital support, eg in fields such as mental health, domestic abuse, foster care and disability.
  • The development of technological and digital solutions is embedded within governance processes and becomes a standard way of working.

Interactive safeguarding apps

Co-design and child-centred or user-centred co-operative enquiry is key to design and usability. This therefore becomes a period of transition and technical change for the whole organisation. In this context, “safeguarding” or “protection” places the service user or child as the lead protagonist in user-led design principles; for instance, in designing the architecture of a safeguarding application. As a result, they are also more likely to promote the app through their own social and technological networks, opening up new avenues for others to find and access your organisation’s services.

Interactive safeguarding apps allow for self-reporting, immediate responses and access to information. They can also, for example, be age-appropriate or tailored to meet the needs of children and young people with disabilities. Two charities that have developed their own apps under the umbrella of safeguarding illustrate this. The Grief Support for Young People app was created by a leading children’s bereavement charity that works with 11- to 25-year-olds who have been bereaved of someone important to them. The content includes information about feelings of grief and how to manage these effectively. The user is also able to read others’ stories, watch short films written and made by bereaved children, and personalise, diarise and share their own information. Christian charity the Churches’ Child Protection Advisory Service (CCPAS) developed the CCPAS Safeguarding app to offer essential basic information so the user can have all the details of their safeguarding coordinator and other key people in one place. The user can text, phone or email the safeguarding coordinator, as well as having a downloadable copy of their organisation’s safeguarding policy.

Once a digital and technology safeguarding policy is in place, I would like to see charities develop more web-based offers and apps for staff and volunteers for easy, accessible reporting.
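As a purely illustrative sketch (not the actual structure of either app mentioned above), the kind of information such a safeguarding app keeps in one place, namely key contacts who can be phoned, texted or emailed plus a downloadable policy, could be modelled along the following lines. All names, fields and URLs here are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class KeyContact:
    """A safeguarding coordinator or other key person the user can reach."""
    name: str
    role: str
    phone: str
    email: str

@dataclass
class SafeguardingAppProfile:
    """Hypothetical sketch of the data a safeguarding app holds for one organisation."""
    organisation: str
    contacts: List[KeyContact] = field(default_factory=list)
    policy_document_url: str = ""   # downloadable copy of the safeguarding policy

    def primary_coordinator(self) -> KeyContact:
        """Return the first listed contact, assumed to be the coordinator."""
        if not self.contacts:
            raise ValueError("No safeguarding contacts configured")
        return self.contacts[0]

if __name__ == "__main__":
    # Entirely made-up example values, for illustration only.
    profile = SafeguardingAppProfile(
        organisation="Example Charity",
        contacts=[KeyContact("A. N. Other", "Safeguarding Coordinator",
                             "01234 567890", "safeguarding@example.org")],
        policy_document_url="https://example.org/safeguarding-policy.pdf",
    )
    print(profile.primary_coordinator().email)
```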
It would also be good to see more children and vulnerable adults designing be-safe apps that allow for self-reporting of concerns – moving away from the traditional “safeguarding@” email address. In sum, if there is a will, we must work with our staff, volunteers and service users to find the way.

Civil Society – Governance and Leadership, July 2024
by Dr Sarah Carlick 11 Dec, 2023
Children and young people today may be digital natives, but digital safeguarding in social care is not a well-known concept. It has yet to be part of everyday social work practice. Most people think digital safeguarding is just about staying safe online. But a true digital safeguarding approach aims to use technology for good, co-designed with children and young people. This includes the integration of artificial intelligence, chatbots, access to the latest devices and equipment, the use of applications for interventions, easy access to policies, and, perhaps most radically, delivering case management systems that offer children and families safe digital self-referrals.

I am a social worker by background, currently involved in developing innovative ways for children and young people to digitally and safely self-refer and share their own information with social workers and other professionals. Digital communities and digital safeguarding solutions can empower families within the context of family help. They can also allow children to feel heard in a non-stigmatised way throughout the child protection journey.

The revolution in digital communication is already transforming our lives. Online communication tools such as Zoom, Microsoft Teams and WhatsApp are widely used, but they have limited use in a safeguarding context. Dedicated interactive safeguarding apps in social care allow for self-reporting, immediate responses and access to information. They can also be age-appropriate or tailored to meet the needs of people with disabilities. As it stands, the digital divide still presents considerable challenges for social workers, as well as for children, in realising the full potential of technology within safeguarding. Social care and public sector organisations lag behind in using technology for information sharing and communicating directly with children and young people digitally.

I work with local authorities implementing digital safeguarding strategies with the aim of introducing digital front doors for children. Describing a digital, online service as being at the heart of the community might sound contradictory. But in reality, it means there is a route to access services anywhere and at any time, opening up opportunities for frontline practice that empowers children and families. It gives communities digital tools at a local level, increasing the speed of communication and placing written conversations directly into recording systems. As well as speeding up child protection referrals and reducing the risk of harm more quickly, it can break down stereotyping and fear associated with social workers and children’s social care, build trust in sharing information in different formats, promote greater variety in ways of communicating, such as sharing videos and photos, and offer new information as sets of data for analysis.

Technology allows the creation of multi-agency digital front doors and an ability to digitally walk shoulder-to-shoulder with children in our local communities. An example of a digital front door is where a child can refer themselves, either for advice or for protection from harm, through their own device such as a mobile phone or iPad. The response they get could initially be a chatbot offering advice and guidance, or it could transfer directly to a team of digital advisors linked to a multi-agency safeguarding hub responding with questions and answers. It could involve completing an interactive form that can be forwarded to a social worker, or provide a big red button to connect to the police or a multi-agency support hub in an emergency, without the child having to report to a trusted adult first.
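To make this concrete, below is a minimal, hypothetical sketch of the routing logic behind such a digital front door. It is an illustration only: the route names, fields and decision rules are my own assumptions, not a description of any existing local authority system.

```python
from dataclasses import dataclass
from enum import Enum

class Route(Enum):
    """Hypothetical first responses to a digital self-referral."""
    CHATBOT_ADVICE = "chatbot advice and guidance"
    DIGITAL_ADVISOR = "digital advisor at the multi-agency safeguarding hub"
    EMERGENCY = "immediate connection to police / multi-agency support hub"

@dataclass
class SelfReferral:
    """A self-referral submitted by a child from their own device."""
    child_age: int
    wants_advice_only: bool      # the child only wants information or guidance
    immediate_danger: bool       # the "big red button" was pressed
    free_text: str               # the child's own description of what is happening

def route_referral(referral: SelfReferral) -> Route:
    """Decide where a digital self-referral goes first.

    The rules here are illustrative assumptions, not a prescribed workflow.
    """
    if referral.immediate_danger:
        return Route.EMERGENCY
    if referral.wants_advice_only:
        return Route.CHATBOT_ADVICE
    # Anything else goes to a human digital advisor, who can ask follow-up
    # questions and forward an interactive form to a social worker.
    return Route.DIGITAL_ADVISOR

if __name__ == "__main__":
    example = SelfReferral(
        child_age=14,
        wants_advice_only=False,
        immediate_danger=False,
        free_text="I don't feel safe at home at night.",
    )
    print(route_referral(example).value)
```

Whichever route is taken, the written conversation would be captured directly into the recording system, as described above.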
Social workers and social care professionals may worry that they lack the digital skills for this brave new world and that current case management systems are not able to adapt to the ways children digitally communicate. There are challenges, including developing a digitally resilient workforce that works outside normal office hours to handle 24/7 digital referrals. There is also a need for a new generation of case management systems fit for the digital age. This requires leadership and strategy from a national perspective. But overcoming such barriers ultimately benefits children, because current service practices do not integrate with the way children live their lives, which heavily interfaces with the digital world. If there is no digital offer to either access services or digitally contribute to their own records and care plans, then social care and practitioners are not speaking the child’s language.

A key aspect of this is children co-designing digital safeguarding systems. Doing so empowers them and ensures technology will be relevant to the child's lived experience. Co-design is about treating children as equal partners in user-centred design, especially for the implementation of digital self-referrals. If children are part of the original agreement and design, this increases the safety elements of the platforms.

Technology can offer revolutionary ways of working with children and families in a digital world, with the potential to complement – or even replace – the standard front door referral route of a phone call or email accompanied by an electronic form. Digital self-referrals and digital engagement can be developed across the safeguarding spectrum, from early help to child protection planning. At the core is improving the voice of the child throughout the system by delivering frictionless information sharing, self-referring and the improvement of case management systems.

British Association of Social Workers
by Dr Sarah Carlick 11 May, 2023
In February 2019, Sarah was approached to share her research on stage at TEDx Manchester. Her talk was titled: Be Safe Technologies: Children as Equal Partners in Social Care Tech. Children themselves rarely engage directly with protective services unless they are referred into the child protection system by a third-party adult. Technologies have enabled children to communicate in new ways as today's children become digital natives. Sarah shared insights from her research and her passion for safeguarding children and tech for social good. She is working with children as equal partners on further research and innovation, and is leading the first Children's Safeguarding Digital Cooperative Design Centre in the field of Social Care Tech. While all of us dream of a better world, some of us are busy making it a better and safer place to live in; Sarah Carlick is one of them.
by Dr Sarah Carlick 20 Oct, 2020
I currently see the UK Safety Tech network and industry as a fledgling sector, growing with rapid investment and new emerging technologies accelerated by the pandemic. I believe that alongside this we should continue to explore and focus upon tech for good, and specifically on multi-agency collaborative tech innovation. This must include the social care sector, where little has been developed and which lacks financial investment. We need safe social care tech that not only allows the younger generations to voice how this emerging tech should be shaped and designed, but also celebrates all the possibilities that tech for good could contribute to protecting all children.

My mission is to help champion companies and policy advisors to understand that children and young people use tech every day, all day, and that if we are hoping to shield and protect them from online harms they must be seen as equals. This equality lies in the process of developing the actual technologies that are in place to protect them. There is, quite rightly, a place for detection of online harms; however, if parental control and surveillance is the central driver, how do children and young people learn to manage their own risks? And if the main statutory agencies, such as social care and the social workers that serve them, do not join this same agenda, how does everything join together?

Conceptually, think of it in simple terms: child, online harm, abuse, parent, tech company, designed for profit, private sector, internet poverty, social worker, single-agency reporting (eg school tech). They are all singular; there is no cross-cutting network or set of connections bringing them together. A piece of software produced by a private company or voluntary sector organisation and bought by a parent can look very different through the eyes of the child. How does this work for parents who lack the money or knowledge to purchase these apps? How, for instance, does it support the child living in a home of domestic abuse, or who is hungry for food? There is a need to think about those who do not have easy access to the internet, let alone to the statutory services for protection.

Moreover, alongside the need to minimise, deter and reduce the risk of children being exposed to harmful content, contact or conduct, we must find technological ways to empower children and young people: to empower them to recognise, respond and report, or even to transmit information about what is happening to them in a technologically safe way. At the other end, there needs to be social care technology that supports this process and transforms child protection with hybrid and interdisciplinary innovations. I argue that the traditional method of speaking to children in the classroom or calling a helpline is not forward thinking, especially amidst a global pandemic. There are opportunities for children to act as equal partners so that their voices become standard operating practice for companies and organisations within the child safety tech industry.

I am advocating that if we took on board some small changes now, we in the sector could make a bigger social impact in the long term. If the policy landscape, which should provide leadership and guidance, incorporated these changes, it might allow more effective access and engagement for the sector. It just might help the birth of social care tech as co-operative creations and generate further tech for good.
The basis of this change is my research, which introduces new conceptual thinking: that children can also manage some of their own digital risks, and that reporting of abuse and online harms should be revolutionised in a child-friendly digital format.

  • Younger children, as well as young people, equally have a part to play in the co-operative creation of emerging safety tech.
  • Co-operative design should place children and young people at the centre of user design.
  • Where the customer is the parent, carer or adult and the end users are children or young people, include the end users in feedback so their voices and experiences can be embedded in the design.
  • Focus more on interdisciplinary research and networks that include statutory agencies such as social care, the police and health.
  • Consideration should be given to policy implications if the agenda is driven by a bottom-up approach.
  • Spend more time on understanding children's access to these technologies and their user journeys.
  • Believe that the child has agency and power to enable us (the adults) to shape the way forward, to ensure safety is not just a word, a piece of tech or an idea: safe tech becomes an enabler for them to engage in the online world in their language and behaviours.
  • Direct innovation and investment towards the social care sector from within the safety tech industry. Therefore, not always doing what we have always done!

Let me offer these concluding thoughts: how can there be a sector-wide safety tech strategy that includes the child's journey? How do we map children's journeys and experiences? How do we ensure children and young people's voices are at the centre of strategy and sector development? How do we push the boundaries to move beyond listening to enacting? Our focus is of course on identifying those at risk of harm, but how do we reach those experiencing hidden harms? How do we support preventative technology models so there is access for all? It always seems impossible until it's done.

Dr Sarah Carlick sarah@mesafe.co.uk