Search results for: learning characteristics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13673

53 Childhood Sensory Sensitivity: A Potential Precursor to Borderline Personality Disorder

Authors: Valerie Porr, Sydney A. DeCaro

Abstract:

TARA for borderline personality disorder (BPD), an education and advocacy organization, helps families to compassionately and effectively deal with troubling BPD behaviors. Our psychoeducational programs focus on understanding the underlying neurobiological features of BPD and evidence-based methodology integrating dialectical behavior therapy (DBT) and mentalization-based therapy (MBT), clarifying the inherent misunderstanding of BPD behaviors and improving family communication. TARA4BPD conducts online surveys, workshops, and topical webinars. For over 25 years, we have collected data from BPD helpline callers. These data drew our attention to particular childhood idiosyncrasies that seem to characterize many of the children who later met the criteria for BPD. The idiosyncrasies we observed, heightened sensory sensitivity and hypervigilance, were included in Adolf Stern’s 1938 definition of “Borderline.” This aspect of BPD has not been prioritized by personality disorder researchers, who are presently focused on emotion processing and social cognition in BPD. Parents described sleep reversal problems in infants who, early on, seem to exhibit dysregulation in circadian rhythm. Families describe children as supersensitive to sensory sensations, such as specific sounds, a heightened sense of smell and taste, textures of foods, and an inability to tolerate various fabric textures (e.g., seams in socks). They also exhibit high sensitivity to particular words and voice tones. Many have alexithymia and dyslexia. These children are either hypo- or hypersensitive to sensory sensations, including pain. Many suffer from fibromyalgia. BPD reactions to pain have been studied (C. Schmahl) and confirm the existence of hyper- and hypo-reactions to pain stimuli in people with BPD. To date, there is little or no data regarding what comprises a normative range of sensitivity in infants and children. 
Many parents reported that their children were tested or treated for sensory processing disorder (SPD), learning disorders, and ADHD. SPD is not included in the DSM and is treated by occupational therapists. The overwhelming anecdotal data from thousands of parents of children who later met criteria for BPD led TARA4BPD to develop a sensitivity survey to gather evidence of the possible role of early sensory perception problems as a precursor to BPD, hopefully initiating new directions in BPD research. At present, the research community seems unaware of the role supersensory sensitivity might play as an early indicator of BPD. Parents' observations of childhood sensitivity obtained through family interviews, together with the results of an extensive online survey on sensory responses across various ages of development, will be presented. People with BPD suffer from a sense of isolation and otherness that often results in later interpersonal difficulties. Early identification of supersensitive children while brain circuits are developing might decrease the development of social interaction deficits such as rejection sensitivity, self-referential processing, and negative bias, hallmarks of BPD, ultimately minimizing the maladaptive methods of coping with distress that characterize BPD. Family experiences are an untapped resource for BPD research. It is hoped that these data will give family observations the critical credibility to inform future treatment and research directions.

Keywords: alexithymia, dyslexia, hypersensitivity, sensory processing disorder

Procedia PDF Downloads 181
52 Applying Concept Mapping to Explore Temperature Abuse Factors in the Processes of Cold Chain Logistics Centers

Authors: Marco F. Benaglia, Mei H. Chen, Kune M. Tsai, Chia H. Hung

Abstract:

As societal and family structures, consumer dietary habits, and awareness about food safety and quality continue to evolve in most developed countries, the demand for refrigerated and frozen foods has been growing, and the issues related to their preservation have gained increasing attention. A well-established cold chain logistics system is essential to avoid any temperature abuse; therefore, assessing potential disruptions in the operational processes of cold chain logistics centers becomes pivotal. This study preliminarily employs HACCP (Hazard Analysis and Critical Control Points) to find disruption factors in cold chain logistics centers that may cause temperature abuse. Then, concept mapping is applied: selected experts engage in brainstorming sessions to identify any further factors. The panel consists of ten experts, including four from logistics and home delivery, two from retail distribution, one from the food industry, two from low-temperature logistics centers, and one from the freight industry. Disruptions include equipment-related aspects, human factors, management aspects, and process-related considerations. The areas of observation encompass freezer rooms, refrigerated storage areas, loading docks, sorting areas, and vehicle parking zones. The experts also categorize the disruption factors based on perceived similarities and build a similarity matrix. Each factor is evaluated for its impact, frequency, and investment importance. Next, multidimensional scaling, cluster analysis, and other methods are used to analyze these factors. Simultaneously, key disruption factors are identified based on their impact and frequency, and, subsequently, the factors that companies prioritize and are willing to invest in are determined by assessing investors’ risk-aversion behavior. Finally, Cumulative Prospect Theory (CPT) is applied to verify the risk patterns. 
Sixty-six disruption factors were found and categorized into six clusters: (1) "Inappropriate Use and Maintenance of Hardware and Software Facilities", (2) "Inadequate Management and Operational Negligence", (3) "Product Characteristics Affecting Quality and Inappropriate Packaging", (4) "Poor Control of Operation Timing and Missing Distribution Processing", (5) "Inadequate Planning for Peak Periods and Poor Process Planning", and (6) "Insufficient Cold Chain Awareness and Inadequate Training of Personnel". This study also identifies five critical factors in the operational processes of cold chain logistics centers: "Lack of Personnel’s Awareness Regarding Cold Chain Quality", "Personnel Not Following Standard Operating Procedures", "Personnel’s Operational Negligence", "Management’s Inadequacy", and "Lack of Personnel’s Knowledge About Cold Chain". The findings show that cold chain operators prioritize prevention and improvement efforts in the "Inappropriate Use and Maintenance of Hardware and Software Facilities" cluster, particularly focusing on the factors of "Temperature Setting Errors" and "Management’s Inadequacy". However, through the application of CPT, this study reveals that companies are not usually willing to invest in the improvement of factors related to the "Inappropriate Use and Maintenance of Hardware and Software Facilities" cluster due to its low occurrence likelihood, even though they acknowledge the severity of the consequences if it does occur. Hence, the main implication is that the key disruption factors in cold chain logistics centers’ processes are associated with personnel issues; therefore, comprehensive training, periodic audits, and the establishment of reasonable incentives and penalties for both new employees and managers may significantly reduce disruption issues.
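The CPT verification step described above can be illustrated with a minimal sketch of the Tversky–Kahneman value and probability-weighting functions. The parameter values are the standard 1992 estimates, not necessarily those fitted in this study, and the evaluation uses the simplified separable form rather than full cumulative weighting over ranked outcomes:

```python
# Sketch of prospect-theory evaluation of a rare, severe disruption event.
# Parameters are Tversky & Kahneman's (1992) estimates, used here purely
# for illustration; the study's fitted values may differ.
ALPHA = 0.88   # curvature for gains
BETA = 0.88    # curvature for losses
LAMBDA = 2.25  # loss-aversion coefficient
GAMMA = 0.61   # probability-weighting curvature

def value(x):
    """Value function: concave for gains, convex and steeper for losses."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

def weight(p):
    """Inverse-S probability weighting: overweights small probabilities."""
    return p ** GAMMA / ((p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA))

def prospect_value(outcomes):
    """Separable evaluation of (outcome, probability) pairs; full CPT
    would instead apply cumulative weights over ranked outcomes."""
    return sum(weight(p) * value(x) for x, p in outcomes)

# A temperature-abuse event with low likelihood but severe consequences,
# mirroring the "Inappropriate Use and Maintenance" cluster's risk pattern.
rare_severe = prospect_value([(-100, 0.01), (0, 0.99)])
```

Because small probabilities are overweighted and losses loom larger than gains, even a 1% chance of a severe loss yields a markedly negative prospect value, which is the mechanism behind the risk patterns the study verifies.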

Keywords: concept mapping, cold chain, HACCP, cumulative prospect theory

Procedia PDF Downloads 41
51 Biotech Processes to Recover Valuable Fraction from Buffalo Whey Usable in Probiotic Growth, Cosmeceutical, Nutraceutical and Food Industries

Authors: Alberto Alfano, Sergio D’ambrosio, Darshankumar Parecha, Donatella Cimini, Chiara Schiraldi

Abstract:

The main objective of this study regards the setup of an efficient small-scale platform for the conversion of local renewable waste materials, such as whey, into added-value products, thereby reducing environmental impact and the costs deriving from the disposal of processing waste products. The buffalo milk whey derived from the cheese-making process, called second cheese whey, is the main by-product of the dairy industry. Whey is also the most polluting by-product obtained from cheese manufacturing, consisting of lactose, lactic acid, proteins, and salts, components that can make whey an added-value product. In Italy, and in particular in the Campania region, soft cheese production generates a large volume of liquid waste, especially during late spring and summer. This project is part of a circular economy perspective focused on the conversion of potentially polluting and difficult-to-purify waste into a resource to be exploited, and it embodies the concept of the three “R”s: reduce, recycle, and reuse. Special focus was paid to the production of health-promoting biomolecules and biopolymers, which may be exploited in different segments of the food and pharmaceutical industries. These biomolecules may be recovered through appropriate processes and reused in an attempt to obtain added-value products. So, ultrafiltration and nanofiltration processes were performed to fractionate bioactive components starting from buffalo milk whey. In this direction, the present study focused on the implementation of a downstream process that converts waste generated from food and food processing industries into added-value products with potential applications. Owing to innovative downstream and biotechnological processes, whey may be considered a resource rather than a waste product, yielding high added-value products such as food supplements (probiotics), cosmeceuticals, biopolymers, and recyclable purified water. 
Besides targeting gastrointestinal disorders, probiotics such as Lactobacilli have been reported to improve immunomodulation and protection of the host against infections caused by viral and bacterial pathogens. Interestingly, inactivated microbial (probiotic) cells and their metabolic products, known as parabiotics and postbiotics, respectively, also have a crucial role and act as mediators in the modulation of the host’s immune function. To boost the production of biomass (both viable and/or heat-inactivated cells) and/or the synthesis of growth-related postbiotics, such as EPS, efficient and sustainable fermentation processes are necessary. Based on a “zero-waste” approach, wastes generated from local industries can be recovered and recycled to develop sustainable biotechnological processes to obtain probiotics as well as postbiotics and parabiotics, to be tested as bioactive compounds against gastrointestinal disorders. The results have shown that it was possible to recover an ultrafiltration retentate with suitable characteristics to be used for skin hydration, to form films (e.g., packaging for food industries), or as a wound repair agent, and a nanofiltration retentate to recover lactic acid and carbon sources (e.g., lactose, glucose) used for microbial cultivation. Finally, a further goal is to obtain purified water that can be reused throughout the process. In fact, water reclamation and reuse provide a unique and viable opportunity to augment traditional water supplies, a key issue nowadays.

Keywords: biotech process, downstream process, probiotic growth, from waste to product, buffalo whey

Procedia PDF Downloads 49
50 Technological Transference Tools to Diffuse Low-Cost Earthquake Resistant Construction with Adobe in Rural Areas of the Peruvian Andes

Authors: Marcial Blondet, Malena Serrano, Álvaro Rubiños, Elin Mattsson

Abstract:

In Peru, there are more than two million houses made of adobe (sun-dried mud bricks) or rammed earth (35% of the total houses), in which almost 9 million people live, mainly because they cannot afford to purchase industrialized construction materials. Although adobe houses are cheap to build and thermally comfortable, their seismic performance is very poor, and they usually suffer significant damage or collapse with tragic loss of life. Therefore, over the years, researchers at the Pontifical Catholic University of Peru and other institutions have developed many reinforcement techniques in an effort to improve the structural safety of earthen houses located in seismic areas. However, most rural communities live under unacceptable seismic risk conditions because these techniques have not been adopted massively, mainly due to high cost and lack of diffusion. The nylon rope mesh reinforcement technique is simple and low-cost, and two technological transference tools have been developed to diffuse it among rural communities: 1) Scale seismic simulations using a portable shaking table have been designed to prove its effectiveness in protecting adobe houses; 2) A step-by-step illustrated construction manual has been developed to guide the complete building process of a nylon rope mesh reinforced adobe house. As a case study, the district of Pullo was selected: a small rural community in the Peruvian Andes where more than 80% of its inhabitants live in adobe houses and more than 60% are considered to live in poverty or extreme poverty conditions. The research team carried out a one-day workshop in May 2015 and a two-day workshop in September 2015. Results were positive: First, the nylon rope mesh reinforcement procedure was proven simple enough to be replicated by both young adults and seniors, and participants handled ropes and knots easily as they use them for daily livestock activity. 
In addition, nylon ropes were proven highly available in the study area, as they were found at two local stores in a variety of colors and sizes. Second, the portable shaking table demonstration successfully showed the effectiveness of the nylon rope mesh reinforcement and generated interest in learning about it. At the first workshop, more than 70% of the participants were willing to formally subscribe and sign up for practical training lessons. At the second workshop, more than 80% of the participants returned on the second day to receive introductory practical training. Third, community members found the illustrations in the construction manual simple and friendly, but the roof system illustrations led to misinterpretation, so they were improved. The technological transfer tools developed in this project can be used to train rural dwellers in earthquake-resistant self-construction with adobe, which is still very common in the Peruvian Andes. This approach would allow community members to develop skills and capacities to improve the safety of their households on their own, thus mitigating their high seismic risk and preventing tragic losses. Furthermore, proper training in earthquake-resistant self-construction with adobe would prevent rural dwellers from depending on external aid after an earthquake and allow them to become agents of their own development.

Keywords: adobe, Peruvian Andes, safe housing, technological transference

Procedia PDF Downloads 275
49 Consumer Preferences for Low-Carbon Futures: A Structural Equation Model Based on the Domestic Hydrogen Acceptance Framework

Authors: Joel A. Gordon, Nazmiye Balta-Ozkan, Seyed Ali Nabavi

Abstract:

Hydrogen-fueled technologies are rapidly advancing as a critical component of the low-carbon energy transition. In countries historically reliant on natural gas for home heating, such as the UK, hydrogen may prove fundamental for decarbonizing the residential sector, alongside other technologies such as heat pumps and district heat networks. While the UK government is set to take a long-term policy decision on the role of domestic hydrogen by 2026, there are considerable uncertainties regarding consumer preferences for ‘hydrogen homes’ (i.e., hydrogen-fueled appliances for space heating, hot water, and cooking). In comparison to other hydrogen energy technologies, such as road transport applications, to date, few studies have engaged with the social acceptance aspects of the domestic hydrogen transition, resulting in a stark knowledge deficit and pronounced risk to policymaking efforts. In response, this study aims to safeguard against undesirable policy measures by revealing the underlying relationships between the factors of domestic hydrogen acceptance and their respective dimensions: attitudinal, socio-political, community, market, and behavioral acceptance. The study employs an online survey (n=~2100) to gauge how different UK householders perceive the proposition of switching from natural gas to hydrogen-fueled appliances. In addition to accounting for housing characteristics (i.e., housing tenure, property type, and number of occupants per dwelling) and several other socio-structural variables (e.g., age, gender, and location), the study explores the impacts of consumer heterogeneity on hydrogen acceptance by recruiting respondents from across five distinct groups: (1) fuel poor householders, (2) technology engaged householders, (3) environmentally engaged householders, (4) technology and environmentally engaged householders, and (5) a baseline group (n=~700) which filters out each of the smaller targeted groups (n=~350). 
This research design reflects the notion that supporting a socially fair and efficient transition to hydrogen will require parallel engagement with potential early adopters and demographic groups impacted by fuel poverty while also accounting strongly for public attitudes towards net zero. Employing a second-order multigroup confirmatory factor analysis (CFA) in Mplus, the proposed hydrogen acceptance model is tested for fit to the data through a partial least squares (PLS) approach. In addition to testing differences between and within groups, the findings provide policymakers with critical insights regarding the significance of knowledge and awareness, safety perceptions, perceived community impacts, cost factors, and trust in key actors and stakeholders as potential explanatory factors of hydrogen acceptance. Preliminary results suggest that knowledge and awareness of hydrogen are positively associated with support for domestic hydrogen at the household, community, and national levels. However, with the exception of technology and/or environmentally engaged citizens, much of the population remains unfamiliar with hydrogen and somewhat skeptical of its application in homes. Knowledge and awareness appear critical to facilitating positive safety perceptions, alongside higher levels of trust and more favorable expectations for community benefits, appliance performance, and potential cost savings. Based on these preliminary findings, policymakers should act urgently to diffuse hydrogen into the public consciousness in alignment with energy security, fuel poverty, and net-zero agendas.
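On the measurement side of such a model, each acceptance dimension is built from several Likert items, whose internal consistency is typically checked before fitting the full CFA. The sketch below computes Cronbach's alpha from scratch; the item names and responses are invented for illustration and are not the study's instrument or data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists (one list per item,
    respondents aligned by index): k/(k-1) * (1 - sum(item vars)/var(totals))."""
    k = len(items)
    item_vars = sum(pvariance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sums
    return (k / (k - 1)) * (1 - item_vars / pvariance(totals))

# Hypothetical 5-point Likert responses from eight respondents for three
# items of an "attitudinal acceptance" dimension.
attitudinal_items = [
    [4, 5, 3, 4, 2, 5, 4, 3],
    [4, 4, 3, 5, 2, 5, 4, 2],
    [5, 4, 2, 4, 3, 5, 3, 3],
]
alpha = cronbach_alpha(attitudinal_items)
```

An alpha above roughly 0.7 is a common (if debated) threshold for treating the items as a coherent scale; dimensions failing the check would be revisited before estimating the second-order model.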

Keywords: hydrogen homes, social acceptance, consumer heterogeneity, heat decarbonization

Procedia PDF Downloads 87
48 Accountability of Artificial Intelligence: An Analysis Using Edgar Morin’s Complex Thought

Authors: Sylvie Michel, Sylvie Gerbaix, Marc Bidan

Abstract:

Can artificial intelligence (AI) be held accountable for its detrimental impacts? This question gains heightened relevance given AI's pervasive reach across various domains, magnifying its power and potential. The expanding influence of AI raises fundamental ethical inquiries, primarily centering on biases, responsibility, and transparency. This encompasses discriminatory biases arising from algorithmic criteria or data, accidents attributed to autonomous vehicles or other systems, and the imperative of transparent decision-making. This article aims to stimulate reflection on AI accountability, denoting the necessity to elucidate the effects it generates. Accountability comprises two integral aspects: adherence to legal and ethical standards and the imperative to elucidate the underlying operational rationale. The objective is to initiate a reflection on the obstacles to this "accountability," given the complexity of AI systems and their effects. This article then proposes to mobilize Edgar Morin's complex thought to encompass and face the challenges of this complexity. The first contribution is to point out the challenges posed by the complexity of AI, with accountability fragmented across a myriad of human and non-human actors, such as software and equipment, which ultimately contribute to the decisions taken and which multiply in the case of AI. Accountability faces three challenges resulting from the complexity of the ethical issues combined with the complexity of AI. The challenge of the non-neutrality of algorithmic systems as fully ethically non-neutral actors is put forward by a revealing ethics approach that calls for assigning responsibilities to these systems. The challenge of the dilution of responsibility is induced by the multiplicity of actors and the distance between them. 
Thus, a dilution of responsibility is induced by a split in decision-making between developers, who feel they fulfill their duty by strictly respecting the requests they receive, and management, which does not consider itself responsible for technology-related flaws. Accountability is also confronted with the challenge of the transparency of complex and scalable algorithmic systems: non-human actors that self-learn via big data. A second contribution involves leveraging E. Morin's principles, providing a framework to grasp the multifaceted ethical dilemmas and subsequently paving the way for establishing accountability in AI. When addressing the ethical challenge of biases, the "hologrammatic" principle underscores the imperative of acknowledging the non-ethical neutrality of algorithmic systems inherently imbued with the values and biases of their creators and society. The "dialogic" principle advocates for the responsible consideration of ethical dilemmas, encouraging the integration of complementary and contradictory elements in solutions from the very inception of the design phase. The principle of organizing recursiveness, akin to the "transparency" of the system, promotes a systemic analysis to account for induced effects and guides the incorporation of modifications into the system to rectify its drifts. In conclusion, this contribution serves as a starting point for contemplating the accountability of "artificial intelligence" systems despite the evident ethical implications and potential deviations. Edgar Morin's principles, providing a lens to contemplate this complexity, offer valuable perspectives to address these challenges concerning accountability.

Keywords: accountability, artificial intelligence, complexity, ethics, explainability, transparency, Edgar Morin

Procedia PDF Downloads 40
47 Industrial Production of the Saudi Future Dwelling: A Saudi Volumetric Solution for Single Family Homes, Leveraging Industry 4.0 with Scalable Automation, Hybrid Structural Insulated Panels Technology and Local Materials

Authors: Bandar Alkahlan

Abstract:

The King Abdulaziz City for Science and Technology (KACST) created the Saudi Future Dwelling (SFD) initiative to identify, localize and commercialize a scalable home manufacturing technology suited to deployment across the Kingdom of Saudi Arabia (KSA). This paper outlines the journey, the creation of the international project delivery team, the product design, the selection of the process technologies, and the outcomes. A target was set to remove 85% of the construction and finishing processes from the building site as these activities could be more efficiently completed in a factory environment. Therefore, integral to the SFD initiative is the successful industrialization of the home building process using appropriate technologies, automation, robotics, and manufacturing logistics. The technologies proposed for the SFD housing system are designed to be energy efficient, economical, fit for purpose from a Saudi cultural perspective, and will minimize the use of concrete, relying mainly on locally available Saudi natural materials derived from the local resource industries. To this end, the building structure is comprised of a hybrid system of structural insulated panels (SIP), combined with a light gauge steel framework manufactured in a large format panel system. The paper traces the investigative process and steps completed by the project team during the selection process. As part of the SFD Project, a pathway was mapped out to include a proof-of-concept prototype housing module and the set-up and commissioning of a lab-factory complete with all production machinery and equipment necessary to simulate a full-scale production environment. The prototype housing module was used to validate and inform current and future product design as well as manufacturing process decisions. A description of the prototype design and manufacture is outlined along with valuable learning derived from the build and how these results were used to enhance the SFD project. 
The industrial engineering concepts and lab-factory detailed design and layout are described in the paper, along with the shop floor I.T. management strategy. Special attention was paid to showcasing all technologies within the lab-factory as part of the engagement strategy with private investors to leverage the SFD project with large-scale factories throughout the Kingdom. A detailed analysis is included of the process surrounding the design, specification, and procurement of the manufacturing machinery, equipment, and logistical manipulators required to produce the SFD housing modules. The manufacturing machinery comprised a combination of standardized and bespoke equipment from a wide range of international suppliers. The paper describes the selection process, pre-ordering trials and studies, and, in some cases, the requirement for additional research and development by the equipment suppliers in order to achieve the SFD objectives. A set of conclusions is drawn describing the results achieved thus far, along with a list of recommended ongoing operational tests, enhancements, research, and development aimed at achieving full-scale engagement with private sector investment and roll-out of the SFD project across the Kingdom.

Keywords: automation, dwelling, manufacturing, product design

Procedia PDF Downloads 102
46 i2kit: A Tool for Immutable Infrastructure Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservice architectures are increasingly common in distributed cloud applications due to their advantages in software composition, development speed, release cycle frequency, and business logic time to market. On the other hand, these architectures also introduce some challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution, and isolation of processes. However, there are other issues that remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing, or data persistence (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos, or Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable infrastructure approach since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error occurs in the control layer, affecting running applications, specific expertise is required to perform ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable infrastructure pattern, where the virtual machine is the unit of deployment. The input for i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers. 
Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is set into other microservices using an environment variable, providing service discovery. The toolkit i2kit reuses the best ideas from container technology to solve problems like reproducible environments, process isolation, and software distribution, and at the same time relies on mature, proven cloud vendor technology for networking, load balancing, and persistence. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools have more flexibility for resource allocation optimization, we argue that adding a new control layer implies more significant disadvantages. Resource allocation is greatly improved by using linuxkit, which introduces a very small footprint (around 35MB). Also, the system is more secure since linuxkit installs the minimum set of dependencies required to run containers. The toolkit i2kit is currently under development at the IMDEA Software Institute.
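The definition-to-template transformation described above can be sketched as follows. The input format and the emitted resources are hypothetical (the paper does not specify i2kit's actual syntax or the exact CloudFormation resources it generates); the sketch only illustrates the shape of the conversion: one group of identical VMs per microservice, fronted by a load balancer:

```python
# Hypothetical microservice definition; field names are illustrative,
# not i2kit's actual syntax.
service = {
    "name": "api",
    "replicas": 2,
    "containers": [{"image": "example/api:1.0", "port": 8080}],
}

def to_cloudformation(svc):
    """Sketch of turning a declarative service definition into a
    CloudFormation-style template: an auto-scaling group of VMs built
    from a (linuxkit-generated) machine image, behind a load balancer."""
    name = svc["name"]
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            f"{name}LaunchConfig": {
                "Type": "AWS::AutoScaling::LaunchConfiguration",
                # ImageId would reference the AMI built on the fly by linuxkit.
                "Properties": {"ImageId": "ami-PLACEHOLDER"},
            },
            f"{name}Group": {
                "Type": "AWS::AutoScaling::AutoScalingGroup",
                "Properties": {"MinSize": svc["replicas"],
                               "MaxSize": svc["replicas"]},
            },
            f"{name}LoadBalancer": {
                "Type": "AWS::ElasticLoadBalancing::LoadBalancer",
                "Properties": {
                    # One listener per exposed container port.
                    "Listeners": [
                        {"LoadBalancerPort": c["port"],
                         "InstancePort": c["port"],
                         "Protocol": "TCP"}
                        for c in svc["containers"]
                    ],
                },
            },
        },
    }

template = to_cloudformation(service)
```

Because each deployment produces fresh machine images and VMs rather than mutating running servers, rollback reduces to pointing the load balancer back at the previous group, which is the essence of the immutable infrastructure pattern.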

Keywords: container, deployment, immutable infrastructure, microservice

Procedia PDF Downloads 157
45 Prospects of Acellular Organ Scaffolds for Drug Discovery

Authors: Inna Kornienko, Svetlana Guryeva, Natalia Danilova, Elena Petersen

Abstract:

Drug toxicity often goes undetected until clinical trials, the most expensive and dangerous phase of drug development. Both human cell culture and animal studies have limitations that cannot be overcome by improvements in drug testing protocols. Tissue engineering is an emerging alternative approach to creating models of human malignant tumors for experimental oncology, personalized medicine, and drug discovery studies. This new generation of bioengineered tumors provides an opportunity to control and explore the role of every component of the model system, including cell populations, supportive scaffolds, and signaling molecules. An area that could greatly benefit from these models is cancer research. Recent advances in tissue engineering have demonstrated that decellularized tissue is an excellent scaffold for tissue engineering. Decellularization of donor organs such as heart, liver, and lung can provide an acellular, naturally occurring three-dimensional biologic scaffold material that can then be seeded with selected cell populations. Preliminary studies in animal models have provided encouraging results for the proof of concept. Decellularized organs preserve the organ microenvironment, which is critical for cancer metastasis. Utilizing 3D tumor models brings the morphological characteristics of a cultured model closer to its in vivo counterpart and allows more accurate simulation of the processes within a functioning tumor and its pathogenesis. 3D models also allow the study of migration processes and cell proliferation with higher reliability. Moreover, cancer cells in a 3D model bear closer resemblance to living conditions in terms of gene expression, cell surface receptor expression, and signaling. 2D cell monolayers do not provide the geometrical and mechanical cues of tissues in vivo and are, therefore, not suitable to accurately predict the responses of living organisms. 
3D models can provide several levels of complexity, from simple monocultures of cancer cell lines in a liquid environment comprising oxygen and nutrient gradients and cell-cell interaction, to more advanced models that include co-culturing with other cell types, such as endothelial and immune cells. Following this reasoning, spheroids cultivated from one or multiple patient-derived cell lines can be utilized to seed the matrix rather than monolayer cells. This approach furthers the progress towards personalized medicine. As an initial step in creating a new ex vivo tissue-engineered model of a cancer tumor, optimized protocols have been designed to obtain organ-specific acellular matrices and evaluate their potential as tissue-engineered scaffolds for cultures of normal and tumor cells. Decellularized biomatrix was prepared from animal kidneys, urethras, lungs, hearts, and livers by two decellularization methods: perfusion in a bioreactor system and immersion-agitation on an orbital shaker, using various detergents (SDS, Triton X-100) in different concentrations, as well as freezing. Acellular scaffolds and tissue-engineered constructs have been characterized and compared using morphological methods. Models using decellularized matrix have certain advantages, such as maintaining native extracellular matrix properties and a biomimetic microenvironment for cancer cells; compatibility with multiple cell types for cell culture and drug screening; and suitability for culturing patient-derived cells in vitro to evaluate different anticancer therapeutics for the development of personalized medicines.

Keywords: 3D models, decellularization, drug discovery, drug toxicity, scaffolds, spheroids, tissue engineering

Procedia PDF Downloads 280
44 Unique Interprofessional Mental Health Education Model: A Pre/Post Survey

Authors: Michele L. Tilstra, Tiffany J. Peets

Abstract:

Interprofessional collaboration in behavioral healthcare education is increasingly recognized for its value in training students to address diverse client needs. While interprofessional education (IPE) is well-documented in occupational therapy education to address physical health, limited research exists on collaboration with counselors to address mental health concerns and the psychosocial needs of individuals receiving care. Counseling education literature primarily examines the collaboration of counseling students with psychiatrists, psychologists, social workers, and marriage and family therapists. This pretest/posttest survey research study explored changes in attitudes toward interprofessional teams among 56 Master of Occupational Therapy (MOT) (n = 42) and Counseling and Human Development (CHD) (n = 14) students participating in the Counselors and Occupational Therapists Professionally Engaged in the Community (COPE) program. The COPE program was designed to strengthen the behavioral health workforce in high-need and high-demand areas. Students accepted into the COPE program were divided into small MOT/CHD groups to complete multiple interprofessional multicultural learning modules using videos, case studies, and online discussion board posts. The online modules encouraged reflection on various behavioral healthcare roles, benefits of team-based care, cultural humility, current mental health challenges, personal biases, power imbalances, and advocacy for underserved populations. Using the Student Perceptions of Interprofessional Clinical Education- Revision 2 (SPICE-R2) scale, students completed pretest and posttest surveys using a 5-point Likert scale (Strongly Agree = 5 to Strongly Disagree = 1) to evaluate their attitudes toward interprofessional teamwork and collaboration. 
The SPICE-R2 measured three different factors: interprofessional teamwork and team-based practice (Team), roles/responsibilities for collaborative practice (Roles), and patient outcomes from collaborative practice (Outcomes). The mean total scores for all students improved from 4.25 (pretest) to 4.43 (posttest), Team from 4.66 to 4.58, Roles from 3.88 to 4.30, and Outcomes from 4.08 to 4.36. A paired t-test analysis for the total mean scores resulted in a t-statistic of 2.54, which exceeded both one-tail and two-tail critical values, indicating statistical significance (p = .001). When the factors of the SPICE-R2 were analyzed separately, only the Roles (t Stat=4.08, p =.0001) and Outcomes (t Stat=3.13, p = .002) were statistically significant. The item ‘I understand the roles of other health professionals’ showed the most improvement from a mean score for all students of 3.76 (pretest) to 4.46 (posttest). The significant improvement in students' attitudes toward interprofessional teams suggests that the unique integration of OT and CHD students in the COPE program effectively develops a better understanding of the collaborative roles necessary for holistic client care. These results support the importance of IPE through structured, engaging interprofessional experiences. These experiences are essential for enhancing students' readiness for collaborative practice and align with accreditation standards requiring interprofessional education in OT and CHD programs to prepare practitioners for team-based care. The findings contribute to the growing body of evidence supporting the integration of IPE in behavioral healthcare curricula to improve holistic client care and encourage students to engage in collaborative practice across healthcare settings.
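The paired t-test reported above can be sketched in a few lines; the per-student pretest/posttest scores below are hypothetical stand-ins for illustration, not the COPE study's data.

```python
import math

# Hypothetical per-student pretest/posttest SPICE-R2 means -- illustrative
# stand-ins, not the COPE study's actual data.
pretest = [4.1, 4.3, 3.9, 4.5, 4.2, 4.0, 4.4, 4.1]
posttest = [4.4, 4.5, 4.2, 4.6, 4.4, 4.3, 4.5, 4.4]

# Paired t-test: each student serves as their own control, so the test is
# run on the per-student differences.
diffs = [post - pre for pre, post in zip(pretest, posttest)]
n = len(diffs)
mean_d = sum(diffs) / n
var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
t_stat = mean_d / math.sqrt(var_d / n)  # compared to t critical value, df = n - 1
print(f"mean difference = {mean_d:.3f}, t = {t_stat:.2f}")
```

The resulting t statistic is compared against the critical value for n − 1 degrees of freedom, as in the significance tests reported for the Roles and Outcomes factors.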

Keywords: behavioral healthcare, counseling education, interprofessional education, mental health education, occupational therapy education

Procedia PDF Downloads 19
43 Unveiling the Dynamics of Preservice Teachers’ Engagement with Mathematical Modeling through Model Eliciting Activities: A Comprehensive Exploration of Acceptance and Resistance Towards Modeling and Its Pedagogy

Authors: Ozgul Kartal, Wade Tillett, Lyn D. English

Abstract:

Despite its global significance in curricula, mathematical modeling encounters persistent disparities in recognition and emphasis within regular mathematics classrooms and teacher education across countries with diverse educational and cultural traditions, including variations in the perceived role of mathematical modeling. Over the past two decades, increased attention has been given to the integration of mathematical modeling into national curriculum standards in the U.S. and other countries. Therefore, the mathematics education research community has dedicated significant efforts to investigate various aspects associated with the teaching and learning of mathematical modeling, primarily focusing on exploring the applicability of modeling in schools and assessing students', teachers', and preservice teachers' (PTs) competencies and engagement in modeling cycles and processes. However, limited attention has been directed toward examining potential resistance hindering teachers and PTs from effectively implementing mathematical modeling. This study focuses on how PTs, without prior modeling experience, resist and/or embrace mathematical modeling and its pedagogy as they learn about models and modeling perspectives, navigate the modeling process, design and implement their modeling activities and lesson plans, and experience the pedagogy enabling modeling. Model eliciting activities (MEAs) were employed due to their high potential to support the development of mathematical modeling pedagogy. The mathematical modeling module was integrated into a mathematics methods course to explore how PTs embraced or resisted mathematical modeling and its pedagogy. The module design included reading, reflecting, engaging in modeling, assessing models, creating a modeling task (MEA), and designing a modeling lesson employing an MEA. Twelve senior undergraduate students participated, and data collection involved video recordings, written prompts, lesson plans, and reflections. 
An open coding analysis revealed acceptance and resistance toward teaching mathematical modeling. The study identified four overarching themes, including both acceptance and resistance: pedagogy, affordance of modeling (tasks), modeling actions, and adjusting modeling. In the category of pedagogy, PTs displayed acceptance based on potential pedagogical benefits and resistance due to various concerns. The affordance of modeling (tasks) category emerged from instances when PTs showed acceptance or resistance while discussing the nature and quality of modeling tasks, often debating whether modeling is considered mathematics. PTs demonstrated both acceptance and resistance in their modeling actions, engaging in modeling cycles as students and designing/implementing MEAs as teachers. The adjusting modeling category captured instances where PTs accepted or resisted maintaining the qualities and nature of the modeling experience or converted modeling into a typical structured mathematics experience for students. While PTs displayed a mix of acceptance and resistance in their modeling actions, limitations were observed in embracing complexity and adhering to model principles. The study provides valuable insights into the challenges and opportunities of integrating mathematical modeling into teacher education, emphasizing the importance of addressing pedagogical concerns and providing support for effective implementation. In conclusion, this research offers a comprehensive understanding of PTs' engagement with modeling, advocating for a more focused discussion on the distinct nature and significance of mathematical modeling in the broader curriculum to establish a foundation for effective teacher education programs.

Keywords: mathematical modeling, model eliciting activities, modeling pedagogy, secondary teacher education

Procedia PDF Downloads 40
42 Extension of Moral Agency to Artificial Agents

Authors: Sofia Quaglia, Carmine Di Martino, Brendan Tierney

Abstract:

Artificial Intelligence (A.I.) permeates many aspects of modern life, from the machine learning algorithms predicting stocks on Wall Street to the killing of belligerents and innocents alike on the battlefield. Moreover, the end goal is to create autonomous A.I.; this means that humans will be absent from the decision-making process. The question comes naturally: when an A.I. does something wrong, when its behavior is harmful to the community and its actions go against the law, who is to be held responsible? This research’s subject matter, within A.I. and robot ethics, focuses mainly on robot rights, and its ultimate objective is to answer the questions: (i) What is the function of rights? (ii) Who is a right holder, what is personhood, and what are the requirements needed to be a moral agent (and therefore accountable)? (iii) Can an A.I. be a moral agent (ontological requirements)? And finally, (iv) ought it to be one (ethical implications)? To answer these questions, this research project was carried out as a collaboration between the School of Computer Science at the Technical University of Dublin, which oversaw the technical aspects of the work, and the Department of Philosophy at the University of Milan, which supervised the philosophical framework and argumentation of the project. Firstly, it was found that all rights are positive and based on consensus; they change over time based on circumstances. Their function is to protect the social fabric and avoid dangerous situations. The same goes for the requirements considered necessary to be a moral agent: these are not absolute; in fact, they are constantly redesigned. Hence, the next logical step was to identify which requirements are regarded as fundamental in real-world judicial systems, comparing them to those used in philosophy. 
Autonomy, free will, intentionality, consciousness, and responsibility were identified as the requirements for being considered a moral agent. The work went on to build a symmetrical comparison between personhood and A.I. to let the ontological differences between the two emerge. Each requirement is introduced, explained through the most relevant theories of contemporary philosophy, and observed in its manifestation in A.I. Finally, after completing the philosophical and technical analysis, conclusions were drawn. As underlined in the research questions, there are two issues regarding the assignment of moral agency to an artificial agent: first, whether all the ontological requirements are present, and second, whether, present or not, an A.I. ought to be considered an artificial moral agent. From an ontological point of view, it is very hard to prove that an A.I. could be autonomous, free, intentional, conscious, and responsible. The philosophical accounts are often very theoretical and inconclusive, making it difficult to fully detect these requirements at an experimental level of demonstration. However, from an ethical point of view, it makes sense to consider some A.I. as artificial moral agents, hence responsible for their own actions. When artificial agents are considered responsible, already existing norms from our judicial system can be applied, such as removing them from society and re-educating them in order to re-introduce them to society. This is in line with how the highest-profile correctional facilities ought to work. Noticeably, this is a provisional conclusion, and research must continue further. Nevertheless, the strength of the presented argument lies in its immediate applicability to real-world scenarios. To refer to the aforementioned incidents involving the killing of innocents: when this thesis is applied, it is possible to hold an A.I. accountable and responsible for its actions. 
This implies removing it from society by virtue of its unusability, re-programming it, and, only when it is properly functioning, re-introducing it successfully.

Keywords: artificial agency, correctional system, ethics, natural agency, responsibility

Procedia PDF Downloads 161
41 Identification Strategies for Unknown Victims from Mass Disasters and Unknown Perpetrators from Violent Crime or Terrorist Attacks

Authors: Michael Josef Schwerer

Abstract:

Background: The identification of unknown victims from mass disasters, violent crimes, or terrorist attacks is frequently facilitated through information from missing persons lists, portrait photos, old or recent pictures showing unique characteristics of a person such as scars or tattoos, or simply reference samples from blood relatives for DNA analysis. In contrast, the identification or at least the characterization of an unknown perpetrator from criminal or terrorist actions remains challenging, particularly in the absence of material or data for comparison, such as fingerprints, which had been previously stored in criminal records. In scenarios that result in high levels of destruction of the perpetrator’s corpse, for instance, blast or fire events, the chance for a positive identification using standard techniques is further impaired. Objectives: This study shows the forensic genetic procedures in the Legal Medicine Service of the German Air Force for the identification of unknown individuals, including such cases in which reference samples are not available. Scenarios requiring such efforts predominantly involve aircraft crash investigations, which are routinely carried out by the German Air Force Centre of Aerospace Medicine as one of the Institution’s essential missions. Further, casework by military police or military intelligence is supported based on administrative cooperation. In the talk, data from study projects, as well as examples from real casework, will be demonstrated and discussed with the audience. Methods: Forensic genetic identification in our laboratories involves the analysis of Short Tandem Repeats and Single Nucleotide Polymorphisms in nuclear DNA along with mitochondrial DNA haplotyping. Extended DNA analysis involves phenotypic markers for skin, hair, and eye color together with the investigation of a person’s biogeographic ancestry. 
Assessment of the biological age of an individual employs CpG-island methylation analysis using bisulfite-converted DNA. Forensic Investigative Genealogy assessment allows the detection of an unknown person’s blood relatives in reference databases. Technically, end-point PCR, real-time PCR, capillary electrophoresis, and pyrosequencing, as well as next-generation sequencing using flow-cell-based and chip-based systems, are used. Results and Discussion: Optimization of DNA extraction from various sources, including difficult matrices like formalin-fixed, paraffin-embedded tissues and degraded specimens from decomposed bodies or from decedents exposed to blast or fire events, provides the foundation for successful PCR amplification and subsequent genetic profiling. For cases with extremely low yields of extracted DNA, whole-genome preamplification protocols are successfully used, particularly for genetic phenotyping. Improved primer design for CpG-methylation analysis, together with validated sampling strategies for the analyzed substrates from, e.g., lymphocyte-rich organs, allows successful biological age estimation even in bodies with highly degraded tissue material. Conclusions: Successful identification of unknown individuals, or at least their phenotypic characterization using pigmentation markers together with age-informative methylation profiles, possibly supplemented by family-tree searches employing Forensic Investigative Genealogy, can be provided in specialized laboratories. However, standard laboratory procedures must be adapted to work with difficult and highly degraded sample materials.
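Age-informative methylation profiles of the kind described are commonly modeled as a linear combination of methylation levels at selected CpG sites. A minimal sketch, with invented site names and coefficients rather than any validated clock:

```python
# Invented CpG sites and weights -- a real methylation clock is fitted on
# reference cohorts; these coefficients are purely illustrative.
INTERCEPT = 12.0
WEIGHTS = {"cg_site_A": 55.0, "cg_site_B": -30.0, "cg_site_C": 41.0}

def estimate_age(betas):
    """Predict biological age from beta values (0 = unmethylated, 1 = fully methylated)."""
    return INTERCEPT + sum(WEIGHTS[site] * betas[site] for site in WEIGHTS)

# Beta values as would be read from bisulfite sequencing of one sample.
sample = {"cg_site_A": 0.62, "cg_site_B": 0.35, "cg_site_C": 0.48}
print(f"estimated age: {estimate_age(sample):.1f} years")
```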

Keywords: identification, forensic genetics, phenotypic markers, CpG methylation, biological age estimation, forensic investigative genealogy

Procedia PDF Downloads 25
40 Developing Primal Teachers beyond the Classroom: The Quadrant Intelligence (Q-I) Model

Authors: Alexander K. Edwards

Abstract:

Introduction: The moral dimension of teacher education globally has assumed a new paradigm of thinking based on public gain (return on investment), value creation (quality), professionalism (practice), and business strategies (innovation). Abundant literature reveals an interesting revolutionary trend in complementing the raising of teachers and academic performance. Because of global competition in knowledge creation and service areas, the C21st teacher at all levels is expected to be a resourceful, strategic thinker who is socially intelligent, skilled in relationships, and entrepreneurially astute. This study is a significant contribution to practice and innovation in raising exemplary, or primal, teachers. In this study, the qualities needed were framed as a ‘Quadrant Intelligence (Q-i)’ model for primal teacher leadership beyond the classroom. The researcher started by examining the issue that the majority of teachers in the Ghana Education Service (GES) need this Q-i to be effective and efficient. The conceptual framing identified the determinants of such Q-i. This is significant for global employability and versatility in teacher education to create premium and primal teacher leadership, which is again gaining high attention in scholarship due to failing schools. The moral aspect of teachers failing learners is a highly important discussion. In the GES, some schools score zero percent in the Basic Education Certificate Examination (BECE). The question is: what will make any professional teacher highly productive, marketable, and entrepreneurial? What will give teachers the moral consciousness to do their best to succeed? Method: This study set out to develop a model for primal teachers in the GES as an innovative way to highlight a premium development for C21st business-education acumen through desk reviews. 
The study is conceptually framed by examining certain skill sets, such as strategic thinking, social intelligence, relational and emotional intelligence, and entrepreneurship, to answer three main burning questions and other hypotheses. The study then applied a causal-comparative methodology with a purposive sampling technique (N = 500) drawing from CoE, GES, NTVI, and other teacher associations. Participants responded to a 30-item, researcher-developed questionnaire. Data were analyzed on the quadrant constructs and reported as ex post facto analyses of variance and regressions. Multiple associations were tested for statistical significance (p = 0.05). Causes and effects are postulated for scientific discussion. Findings: It was found that these quadrants are very significant in teacher development. There were significant variations across demographic groups. However, most teachers lack considerable skills in entrepreneurship, leadership in teaching and learning, and business thinking strategies. These gaps have a significant effect on practices and outcomes. Conclusion and Recommendations: It is therefore quite conclusive that GES teachers may need further instruction in innovation and creativity to transform knowledge creation into business ventures. In-service training (INSET) has to be comprehensive. Teacher education curricula at colleges may have to be revisited. Teachers have the potential to raise their social capital, to become entrepreneurs, and to exhibit professionalism beyond their community services. Their primal leadership focus will benefit many clienteles, including students and social circles. The recommendations examine the policy implications for curriculum design, practice, innovation, and educational leadership.
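The variance analyses described can be illustrated with a one-way ANOVA F statistic computed directly; the three respondent groups and their scores below are invented for illustration, not the survey data.

```python
# One-way ANOVA F statistic in plain Python: ratio of between-group to
# within-group mean squares.
def f_statistic(groups):
    k = len(groups)                  # number of groups
    n = sum(len(g) for g in groups)  # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical quadrant-construct scores for three respondent groups.
coe, ges, ntvi = [3.8, 4.0, 3.9, 4.1], [3.2, 3.4, 3.3, 3.1], [3.6, 3.7, 3.5, 3.8]
print(f"F = {f_statistic([coe, ges, ntvi]):.2f}")
```

The F value is then compared against the critical value for (k − 1, n − k) degrees of freedom at the stated p = 0.05 level.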

Keywords: emotional intelligence, entrepreneurship, leadership, quadrant intelligence (q-i), primal teacher leadership, strategic thinking, social intelligence

Procedia PDF Downloads 285
39 A Computational Investigation of Potential Drugs for Cholesterol Regulation to Treat Alzheimer’s Disease

Authors: Marina Passero, Tianhua Zhai, Zuyi (Jacky) Huang

Abstract:

Alzheimer’s disease has become a major public health issue, as indicated by the increasing number of Americans living with it. After decades of extensive research into Alzheimer’s disease, only seven drugs have been approved by the Food and Drug Administration (FDA) to treat it. Five of these drugs were designed to treat the dementia symptoms, and only two (i.e., Aducanumab and Lecanemab) target the progression of Alzheimer’s disease, especially the accumulation of amyloid-beta plaques. However, the accelerated approvals of both Aducanumab and Lecanemab drew controversy, especially over the safety and side effects of these two drugs. There is still an urgent need for further drug discovery targeting the biological processes involved in the progression of Alzheimer’s disease. Excessive cholesterol has been found to accumulate in the brains of those with Alzheimer’s disease. Cholesterol can be synthesized in both the blood and the brain, but the majority of biosynthesis in the adult brain takes place in astrocytes; the cholesterol is then transported to the neurons via ApoE. The blood-brain barrier separates cholesterol metabolism in the brain from that in the rest of the body. Various proteins contribute to the metabolism of cholesterol in the brain, which offers potential targets for Alzheimer’s treatment. In the astrocytes, SREBP cleavage-activating protein (SCAP) binds to Sterol Regulatory Element-binding Protein 2 (SREBP2) in order to transport the complex from the endoplasmic reticulum to the Golgi apparatus. Cholesterol is secreted out of the astrocytes by the ATP-Binding Cassette A1 (ABCA1) transporter. Lipoprotein receptors such as triggering receptor expressed on myeloid cells 2 (TREM2) internalize cholesterol into the microglia, while lipoprotein receptors such as low-density lipoprotein receptor-related protein 1 (LRP1) internalize cholesterol into the neuron. 
Cytochrome P450 Family 46 Subfamily A Member 1 (CYP46A1) converts excess cholesterol to 24S-hydroxycholesterol (24S-OHC). Cholesterol has been shown to have a direct effect on the production of amyloid-beta and tau proteins. The addition of cholesterol to the brain promotes the activity of beta-site amyloid precursor protein cleaving enzyme 1 (BACE1), secretase, and amyloid precursor protein (APP), which all aid in amyloid-beta production. The reduction of cholesterol esters in the brain has been found to reduce phosphorylated tau levels in mice. In this work, a computational pipeline was developed to identify the protein targets involved in cholesterol regulation in the brain and, further, to identify chemical compounds as inhibitors of a selected protein target. Since extensive evidence shows a strong correlation between brain cholesterol regulation and Alzheimer’s disease, a detailed literature review of genes and pathways related to brain cholesterol synthesis and regulation was first conducted. An interaction network was then built for those genes so that the top gene targets could be identified. The involvement of these genes in Alzheimer’s disease progression was discussed, followed by an investigation of existing clinical trials for those targets. A ligand-protein docking program was then developed to screen 1.5 million chemical compounds against the selected protein target. A machine learning program was developed to evaluate and predict the binding interaction between chemical compounds and the protein target. The results from this work pave the way for further drug discovery to regulate brain cholesterol to combat Alzheimer’s disease.
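The gene-network step of the pipeline can be sketched as ranking genes by connectivity; the edges below are assumptions made for illustration over the genes named in the abstract, not curated interactions from the study.

```python
# Toy interaction network over the cholesterol-handling genes named above;
# each pair is an assumed (illustrative) interaction, not a curated one.
edges = [
    ("SREBP2", "SCAP"), ("SREBP2", "ABCA1"), ("ABCA1", "APOE"),
    ("APOE", "LRP1"), ("APOE", "TREM2"), ("CYP46A1", "SREBP2"),
    ("LRP1", "BACE1"),
]

# Degree centrality: count interactions per gene and rank -- a simple
# stand-in for the "top gene targets" selection step.
degree = {}
for a, b in edges:
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1

ranked = sorted(degree, key=degree.get, reverse=True)
print("top targets:", ranked[:3])
```

In practice this step would draw on a curated interaction database and richer centrality measures; the sketch only shows the ranking logic.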

Keywords: Alzheimer’s disease, drug discovery, ligand-protein docking, gene-network analysis, cholesterol regulation

Procedia PDF Downloads 46
38 Deciphering Information Quality: Unraveling the Impact of Information Distortion in the UK Aerospace Supply Chains

Authors: Jing Jin

Abstract:

The incorporation of artificial intelligence (AI) and machine learning (ML) in aircraft manufacturing and aerospace supply chains leads to the generation of a substantial amount of data among various tiers of suppliers and OEMs. Identifying high-quality information is a challenge for decision-makers. The application of AI/ML models necessitates access to ‘high-quality’ information to yield the desired outputs. However, the process of information sharing introduces complexities, including distortion through various communication channels and biases introduced by both human and AI entities. This phenomenon significantly influences the quality of information, impacting decision-makers engaged in configuring supply chain systems. Traditionally, distorted information is categorized as ‘low-quality’; however, this study challenges this perception, positing that distorted information that contributes to stakeholder goals can be deemed high-quality within supply chains. The main aim of this study is to identify and evaluate the dimensions of information quality crucial to the UK aerospace supply chain. Guided by a central research question, "What information quality dimensions are considered when defining information quality in the UK aerospace supply chain?", the study delves into the intricate dynamics of information quality in the aerospace industry. Additionally, the research explores the nuanced impact of information distortion on stakeholders’ decision-making processes, addressing the question, "How does the information distortion phenomenon influence stakeholders’ decisions regarding information quality in the UK aerospace supply chain system?" This study employs deductive methodologies rooted in positivism, utilizing a cross-sectional approach and a mono-quantitative method, a questionnaire survey. Data were systematically collected from diverse tiers of supply chain stakeholders, encompassing end-customers, OEMs, Tier 0.5, Tier 1, and Tier 2 suppliers. 
Employing robust statistical data analysis methods, including mean values, mode values, standard deviation, one-way analysis of variance (ANOVA), and Pearson’s correlation analysis, the study interprets and extracts meaningful insights from the gathered data. Initial analyses challenge conventional notions, revealing that information distortion positively influences the definition of information quality, disrupting the established perception of distorted information as inherently low-quality. Further exploration through correlation analysis unveils the varied perspectives of different stakeholder tiers on the impact of information distortion on specific information quality dimensions. For instance, Tier 2 suppliers demonstrate strong positive correlations between information distortion and dimensions like access security, accuracy, interpretability, and timeliness. Conversely, Tier 1 suppliers emphasise strong negative influences on the security of accessing information and negligible impact on information timeliness. Tier 0.5 suppliers showcase very strong positive correlations with dimensions like conciseness and completeness, while OEMs exhibit limited interest in considering information distortion within the supply chain. Introducing social network analysis (SNA) provides a structural understanding of the relationships between information distortion and quality dimensions. The moderately high density of ‘information distortion-by-information quality’ underscores the interconnected nature of these factors. In conclusion, this study offers a nuanced exploration of information quality dimensions in the UK aerospace supply chain, highlighting the significance of individual perspectives across different tiers. The positive influence of information distortion challenges prevailing assumptions, fostering a more nuanced understanding of information's role in the Industry 4.0 landscape.
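Pearson's correlation analysis, used above to relate distortion to each quality dimension, reduces to a short computation; the tier ratings below are invented illustrations, not survey responses.

```python
import math

def pearson_r(x, y):
    """Pearson's product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical Tier 2 ratings: information distortion vs. perceived timeliness.
distortion = [2, 3, 3, 4, 5, 5]
timeliness = [2, 2, 3, 4, 4, 5]
print(f"r = {pearson_r(distortion, timeliness):.2f}")  # near +1: strong positive correlation
```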

Keywords: information distortion, information quality, supply chain configuration, UK aerospace industry

Procedia PDF Downloads 37
37 Using the UK as a Case Study to Assess the Current State of Large Woody Debris Restoration as a Tool for Improving the Ecological Status of Natural Watercourses Globally

Authors: Isabelle Barrett

Abstract:

Natural watercourses provide a range of vital ecosystem services, notably freshwater provision. They also offer highly heterogeneous habitat which supports an extreme diversity of aquatic life. Exploitation of rivers, changing land use and flood prevention measures have led to habitat degradation and subsequent biodiversity loss; indeed, freshwater species currently face a disproportionate rate of extinction compared to their terrestrial and marine counterparts. Large woody debris (LWD) encompasses the trees, large branches and logs which fall into watercourses, and is responsible for important habitat characteristics. Historically, natural LWD has been removed from streams under the assumption that it is not aesthetically pleasing and is thus ecologically unfavourable, despite extensive evidence contradicting this. Restoration efforts aim to replace lost LWD in order to reinstate habitat heterogeneity. This paper aims to assess the current state of such restoration schemes for improving fluvial ecological health in the UK. A detailed review of the scientific literature was conducted alongside a meta-analysis of 25 UK-based projects involving LWD restoration. Projects were chosen for which sufficient information was attainable for analysis, covering a broad range of budgets and scales. The most effective strategies for river restoration encompass ecological success, stakeholder engagement and scientific advancement, however few projects surveyed showed sensitivity to all three; for example, only 32% of projects stated biological aims. Focus tended to be on stakeholder engagement and public approval, since this is often a key funding driver. Consequently, there is a tendency to focus on the aesthetic outcomes of a project, however physical habitat restoration does not necessarily lead to direct biodiversity increases. 
This highlights the significance of rivers as highly heterogeneous environments with multiple interlinked processes, and emphasises a need for a stronger scientific presence in project planning. Poor scientific rigour means monitoring is often lacking, with varying, if any, definitions of success which are rarely pre-determined. A tendency to overlook negative or neutral results was apparent, with unjustified focus often put on qualitative results. The temporal scale of monitoring is typically inadequate to facilitate scientific conclusions, with only 20% of projects surveyed reporting any pre-restoration monitoring. Furthermore, monitoring is often limited to a few variables, with biotic monitoring often fish-focussed. Due to their longer life cycles and dispersal capability, fish are usually poor indicators of environmental change, making it difficult to attribute any changes in ecological health to restoration efforts. Although the potential impact of LWD restoration may be positive, this method of restoration could simply be making short-term, small-scale improvements; without addressing the underlying symptoms of degradation, for example water quality, the issue cannot be fully resolved. Promotion of standardised monitoring for LWD projects could help establish a deeper understanding of the ecology surrounding the practice, supporting movement towards adaptive management in which scientific evidence feeds back to practitioners, enabling the design of more efficient projects with greater ecological success. By highlighting LWD, this study hopes to address the difficulties faced within river management, and emphasise the need for a more holistic international and inter-institutional approach to tackling problems associated with degradation.

Keywords: biological monitoring, ecological health, large woody debris, river management, river restoration

Procedia PDF Downloads 187
36 Evaluation of Coal Quality and Geomechanical Moduli Using Core and Geophysical Logs: Study from Middle Permian Barakar Formation of Gondwana Coalfield

Authors: Joyjit Dey, Souvik Sen

Abstract:

The Middle Permian Barakar Formation is the major economic coal-bearing unit of the vast east-west trending Damodar Valley basin of the Gondwana coalfield. Primary sedimentary structures were studied from the core holes, which represent four major facies groups: sandstone-dominated facies, sandstone-shale heterolith facies, shale facies and coal facies. A total of eight major coal seams have been identified, with the bottom-most seam being the thickest. Laterally continuous coal seams were deposited in the calm and quiet environment of extensive floodplain swamps. Channel sinuosity and lateral channel migration/avulsion result in lateral facies heterogeneity and coal splitting. Geophysical well logs (gamma-resistivity-density logs) have been used to establish the vertical and lateral correlation of the various litho-units field-wide, which reveals the predominance of repetitive fining-upward cycles. Well log data, being a permanent record, offer a strong foundation for log-based property evaluation and help in characterizing depositional units in terms of lateral and vertical heterogeneity. Low gamma, high resistivity and low density are the typical coal seam signatures in geophysical logs. Here, we have used a density cutoff of 1.6 g/cc as the primary discriminator of coal, and the same has been employed to compute various coal assay parameters, namely ash, fixed carbon, moisture, volatile content, cleat porosity and vitrinite reflectance (VRo%), which were calibrated against laboratory-based measurements. The study shows that ash content and VRo% increase from west to east (towards the basin margin), while fixed carbon, moisture and volatile content increase towards the west, depicting improved coal quality westwards. Seam-wise cleat porosity decreases from east to west; this would be an effect of overburden, as overburden pressure increases westward with the deepening of the basin, a thicker sediment package having been deposited on the western side of the study area.
Coal is a porous, viscoelastic material in which velocity and strain both change nonlinearly with stress, especially for stress applied perpendicular to the bedding plane. Usually, a coal seam has a high velocity contrast relative to its neighboring layers. Despite extensive discussion of the maceral and chemical properties of coal, its elastic characteristics have received comparatively little attention. The measurement of the elastic constants of coal presents many difficulties: sample-to-sample inhomogeneity, fragility, and velocity dependence on stress, orientation, humidity, and chemical content. In this study, the empirical relation VS = 0.80VP - 0.86 has been used to model shear velocity from compressional velocity, and the modeled shear velocity has in turn been used to compute the geomechanical moduli. The analysis yields a Poisson ratio of 0.348 against the coals. The average bulk modulus value is 3.97 GPa, while the average shear modulus and Young's modulus values are 1.34 and 3.59 GPa, respectively. These Middle Permian Barakar coals show an average uniaxial compressive strength (UCS) of 23.84 MPa, with 4.97 MPa cohesive strength and a friction coefficient of 0.46. The log-based proximate parameters and geomechanical moduli suggest a medium volatile bituminous grade for the studied coal seams, consistent with the laboratory-based core study.
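The moduli reported above follow from standard isotropic elasticity once VP, the modeled VS, and bulk density are known. The sketch below is illustrative only (the function name and input values are ours, not the authors'); it assumes velocities in km/s and density in g/cc, so that rho*V^2 comes out directly in GPa:

```python
def coal_moduli(vp, rho):
    """Estimate shear velocity and elastic moduli from compressional
    velocity vp (km/s) and bulk density rho (g/cc), using the paper's
    empirical relation VS = 0.80 VP - 0.86. With these units,
    rho * v**2 yields moduli in GPa."""
    vs = 0.80 * vp - 0.86                             # modeled shear velocity
    mu = rho * vs**2                                  # shear modulus (GPa)
    k = rho * (vp**2 - (4.0 / 3.0) * vs**2)           # bulk modulus (GPa)
    nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))  # Poisson's ratio
    e = 2 * mu * (1 + nu)                             # Young's modulus (GPa)
    return {"Vs": vs, "mu": mu, "K": k, "nu": nu, "E": e}

# hypothetical log sample: VP = 2.4 km/s, density = 1.45 g/cc
print(coal_moduli(2.4, 1.45))
```

In practice each quantity would be evaluated sample-by-sample along the logged interval and then averaged per seam, as in the values quoted above.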

Keywords: core analysis, coal characterization, geophysical log, geo-mechanical moduli

Procedia PDF Downloads 207
35 Glocalization of Journalism and Mass Communication Education: Best Practices from an International Collaboration on Curriculum Development

Authors: Bellarmine Ezumah, Michael Mawa

Abstract:

Glocalization is often defined as the practice of conducting business according to both local and global considerations; this epitomizes the curriculum co-development collaboration between a journalism and mass communications professor from a university in the United States and Uganda Martyrs University in Uganda, where a brand new journalism and mass communications program was recently co-developed. This paper presents the experiences and research results of this initiative, which was funded through the Institute of International Education (IIE) under the umbrella of the Carnegie African Diaspora Fellowship Program (CADFP). Vital international and national concerns were addressed. On a global level, scholars have questioned and criticized the Western model ingrained in journalism and mass communication curricula and proposed a decolonization of journalism curricula. Another major criticism is the practice of Western-based educators transplanting their curricula verbatim to other regions of the world without paying adequate attention to local needs. To address these two global concerns, an extensive assessment of local needs was conducted prior to the conceptualization of the new program. The needs assessment adopted a participatory action model and captured the knowledge and narratives of both internal and external stakeholders. This involved reviewing pertinent documents, including the nation's constitution and governmental briefs and promulgations; interviewing government officials, media and journalism educators, media practitioners, and students; and benchmarking the curricula of other tertiary institutions in the nation. Information gathered through this process served as a blueprint and frame of reference for all design decisions. In the area of local needs, four key factors were addressed. First, the realization that most media personnel in Uganda lack both academic and professional qualifications.
Second, practitioners with academic training were found lacking in experience. Third, the current curricula offered at several tertiary institutions are not comprehensive and lack local relevance. The project addressed these problems as follows. First, the program was designed to cater to both traditional and non-traditional students, offering opportunities for unqualified media practitioners to obtain formal training through evening and weekender programs. Second, the challenge of inexperienced graduates was mitigated by designing the program around the experiential learning approach, which many refer to as the 'Teaching Hospital Model'. This entails integrating practice with theory, similar to the way medical students engage in hands-on practice under the supervision of a mentor. The university signed a Memorandum of Understanding (MoU) with reputable media houses for students and faculty to use their studios for hands-on experience and for seasoned media practitioners to guest-teach some courses. Given the convergence of functions in today's media industry, graduates should be trained to have adequate knowledge of other disciplines; therefore, the curriculum integrated cognate courses that would render graduates versatile. Ultimately, this research serves as a template for African colleges and universities to follow in their quest to glocalize their curricula. While the general concept of journalism may remain Western, journalism curriculum developers in Africa, through extensive needs assessment and a focus on those needs and other societal particularities, can adjust the Western model to fit their local needs.

Keywords: curriculum co-development, glocalization of journalism education, international journalism, needs assessment

Procedia PDF Downloads 114
34 A Modular Solution for Large-Scale Critical Industrial Scheduling Problems with Coupling of Other Optimization Problems

Authors: Ajit Rai, Hamza Deroui, Blandine Vacher, Khwansiri Ninpan, Arthur Aumont, Francesco Vitillo, Robert Plana

Abstract:

Large-scale critical industrial scheduling problems are based on the Resource-Constrained Project Scheduling Problem (RCPSP) and necessitate integration with other optimization problems (e.g., vehicle routing, supply chain, or unique industrial ones), thus requiring practical solutions (i.e., modular and computationally efficient, with feasible solutions). To the best of our knowledge, the current industrial state of the art does not address this holistic problem. We propose an original modular solution that addresses the issues arising in the delivery of complex projects. With three interlinked entities (project, tasks, resources), each with its own constraints, it uses a greedy heuristic with a dynamic cost function for each task and a situational assessment at each time step. It handles large-scale data and can be easily integrated with other optimization problems, existing industrial tools, and unique constraints as required by the use case. The solution has been tested and validated by domain experts on three use cases: outage management in Nuclear Power Plants (NPPs), planning of a future NPP maintenance operation, and application in the defense industry to supply chain and factory relocation. In the first use case, the solution, in addition to resource availability and the logical relationships between tasks, also integrates several project-specific constraints for outage management, such as handling resource incompatibility, updating task priorities, pausing tasks in specific circumstances, and adjusting dynamic units of resources. With more than 20,000 tasks and multiple constraints, the solution provides a feasible schedule within 10-15 minutes on a standard computer. This time-effectiveness matches the nature of the problem, which requires several scenarios (30-40 simulations) to be explored before the schedules are finalized.
The second use case is a factory relocation project where production lines must be moved to a new site while ensuring the continuity of their production. This generates the challenge of merging job shop scheduling and the RCPSP with location constraints. Our solution allows the automation of the production tasks while considering the expected production rate. The simulation algorithm manages the use and movement of resources and products to respect a given relocation scenario. The last use case concerns a future maintenance operation in an NPP. The project contains complex and hard constraints, such as strict Finish-Start precedence relationships (i.e., successor tasks have to start immediately after their predecessors while respecting all constraints), shareable coactivity for managing workspaces, and the requirement that "cyclic" resources be in a specific state (they can take several possible states, but only one at a time) to perform tasks (which can require unique combinations of several cyclic resources). Our solution satisfies the requirement of minimizing the state changes of cyclic resources coupled with makespan minimization. It solves an instance with 80 cyclic resources and 50 incompatibilities between levels in less than a minute. In conclusion, we propose a fast and feasible modular approach to various industrial scheduling problems, validated by domain experts and compatible with existing industrial tools. This approach can be further enhanced by the use of machine learning techniques on historically repeated tasks to gain further insights for delay risk mitigation measures.
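The greedy-heuristic idea can be illustrated at toy scale. The sketch below is our own minimal serial version for a single renewable resource, with a static priority standing in for the paper's dynamic cost function; all task names and data are invented:

```python
def greedy_schedule(tasks, capacity):
    """Serial greedy heuristic sketch for a single-resource RCPSP.

    tasks: dict name -> {"duration": int, "demand": int,
                         "preds": list of names, "priority": number}
    At each pass, the eligible task (all predecessors scheduled) with
    the lowest cost is placed at its earliest resource-feasible start.
    Returns dict name -> (start, finish)."""
    schedule = {}   # name -> (start, finish)
    usage = {}      # time step -> resource units in use

    def fits(start, dur, demand):
        return all(usage.get(t, 0) + demand <= capacity
                   for t in range(start, start + dur))

    while len(schedule) < len(tasks):
        eligible = [n for n in tasks if n not in schedule
                    and all(p in schedule for p in tasks[n]["preds"])]
        # a real system would recompute a dynamic cost here each pass
        name = min(eligible, key=lambda n: tasks[n]["priority"])
        dur, dem = tasks[name]["duration"], tasks[name]["demand"]
        start = max((schedule[p][1] for p in tasks[name]["preds"]), default=0)
        while not fits(start, dur, dem):
            start += 1
        for t in range(start, start + dur):
            usage[t] = usage.get(t, 0) + dem
        schedule[name] = (start, start + dur)
    return schedule

jobs = {
    "A": {"duration": 2, "demand": 1, "preds": [], "priority": 1},
    "B": {"duration": 3, "demand": 2, "preds": ["A"], "priority": 2},
    "C": {"duration": 2, "demand": 2, "preds": [], "priority": 3},
}
print(greedy_schedule(jobs, capacity=3))
```

The industrial solution additionally layers on resource incompatibilities, task pausing, and cyclic-resource states, but the pick-cheapest-eligible-task loop is the common skeleton.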

Keywords: deterministic scheduling, optimization coupling, modular scheduling, RCPSP

Procedia PDF Downloads 166
33 Neologisms and Word-Formation Processes in Board Game Rulebook Corpus: Preliminary Results

Authors: Athanasios Karasimos, Vasiliki Makri

Abstract:

This research focuses on the design and development of the first text corpus based on Board Game Rulebooks (BGRC), with direct application to the morphological analysis of neologisms and tendencies in word-formation processes. Corpus linguistics is a dynamic field that examines language through the lens of vast collections of texts. These corpora consist of diverse written and spoken materials, ranging from literature and newspapers to transcripts of everyday conversations. By morphologically analyzing such extensive datasets, morphologists can gain valuable insights into how language functions and evolves, as these datasets reflect the byproducts of inflection, derivation, blending, clipping, compounding, and neology. This entails scrutinizing how words are created, modified, and combined to convey meaning in a corpus of challenging, creative, and straightforward texts that include rules, examples, tutorials, and tips. Board games teach players how to strategize, consider alternatives, and think flexibly, which are critical elements in language learning. Their rulebooks reflect not only their weight (complexity) but also the language properties of each genre and subgenre of these games. Board games are a captivating realm where strategy, competition, and creativity converge. Beyond the excitement of gameplay, board games also spark the art of word creation. Word games, like Scrabble, Codenames, Bananagrams, Wordcraft, Alice in the Wordland, and Once Upon a Time, challenge players to construct words from a pool of letters, thus encouraging linguistic ingenuity and vocabulary expansion. These games foster a love for language, motivating players to unearth obscure words and devise clever combinations.
On the other hand, the designers and creators produce rulebooks in which they express their joy of discovering the hidden potential of language, igniting the imagination, and playing with the beauty of words, making these games a delightful fusion of linguistic exploration and leisurely amusement. In this research, more than 150 rulebooks in English from all types of modern board games, whether language-independent or language-dependent, are used to create the BGRC. A representative sample of each genre (family, party, worker placement, deckbuilding, dice and chance games, strategy, eurogames, thematic, and role-playing, among others) was selected based on BoardGameGeek scores, the size of the texts, and the level of complexity (weight) of the game. A morphological model with morphological networks, multi-word expressions, and word-creation mechanics based on the complexity of the textual structure, difficulty, and board game category will be presented. By enabling the identification of patterns, trends, and variations in word formation and other morphological processes, this research aspires to take advantage of this creative yet strict text genre so as to (a) give invaluable insight into the morphological creativity and innovation that (re)shape the lexicon of the English language and (b) test morphological theories. Overall, it is shown that corpus linguistics empowers us to explore the intricate tapestry of language, and of morphology in particular, revealing its richness, flexibility, and adaptability in the ever-evolving landscape of human expression.
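As a first-pass illustration of how such a corpus can surface candidate neologisms, one common approach is to compare tokenised rulebook text against a reference lexicon. Everything below (the tiny lexicon and the sample sentence) is invented for the example; a real pipeline would add lemmatisation and manual vetting:

```python
import re
from collections import Counter

def candidate_neologisms(text, lexicon):
    """Tokenise a rulebook passage and count word forms absent from a
    reference lexicon. Hyphenated forms are kept whole, since blends
    and compounds are among the word-formation processes of interest."""
    tokens = re.findall(r"[a-z]+(?:-[a-z]+)*", text.lower())
    return Counter(t for t in tokens if t not in lexicon)

rules = "Each player drafts a meeple. Deckbuilding: shuffle your draw pile."
known = {"each", "player", "drafts", "a", "shuffle", "your", "draw", "pile"}
print(candidate_neologisms(rules, known))
```

Frequency counts of the surviving forms then feed the qualitative analysis of which word-formation process (compounding, conversion, borrowing, etc.) produced each candidate.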

Keywords: board game rulebooks, corpus design, morphological innovations, neologisms, word-formation processes

Procedia PDF Downloads 65
32 Understanding the Impact of Spatial Light Distribution on Object Identification in Low Vision: A Pilot Psychophysical Study

Authors: Alexandre Faure, Yoko Mizokami, Éric Dinet

Abstract:

In recent years, several studies have demonstrated the potential of light to assist visually impaired people in their indoor mobility. Implementing smart lighting systems for selective visual enhancement, especially designed for low-vision people, is an approach that breaks with existing visual aids. The appearance of the surface of an object is significantly influenced by the lighting conditions and by the constituent materials of the object, so objects may appear different from expectations. Lighting conditions therefore play an important part in accurate material recognition. The main objective of this work was to investigate the effect of the spatial distribution of light on object identification in the context of low vision. The purpose was to determine whether, and which, specific lighting approaches should be preferred for visually impaired people. A psychophysical experiment was designed to study the ability of individuals to identify the smaller cube of a pair under different lighting diffusion conditions. Participants were divided into two distinct groups: a reference group of observers with normal or corrected-to-normal visual acuity, and a test group, in which observers were required to wear visual impairment simulation glasses. All participants were presented with pairs of cubes in a "miniature room" and were instructed to estimate the relative size of the two cubes. The miniature room replicates real-life settings, adorned with decorations and separated from external light sources by black curtains. The correlated color temperature was set to 6000 K, and the horizontal illuminance at object level to approximately 240 lux. The objects presented for comparison consisted of 11 white cubes and 11 black cubes of different sizes, manufactured with a 3D printer. Participants were seated 60 cm away from the objects. Two different levels of light diffuseness were implemented.
After receiving instructions, participants were asked to judge whether the two presented cubes were the same size or whether one was smaller. They provided one of five possible answers: "Left one is smaller," "Left one is smaller but unsure," "Same size," "Right one is smaller," or "Right one is smaller but unsure." The method of constant stimuli was used, presenting stimulus pairs in a random order to prevent learning and expectation biases. Each pair consisted of a comparison stimulus and a reference cube. A psychometric function was constructed to link stimulus value with the frequency of correct detection, aiming to determine the 50% correct detection threshold. Collected data were analyzed through graphs illustrating participants' responses to stimuli, with accuracy increasing as the size difference between cubes grew. Statistical analyses, including two-way ANOVA tests, showed that light diffuseness had no significant impact on the difference threshold, whereas object color had a significant influence in low-vision scenarios. The first results and trends derived from this pilot experiment strongly suggest that future investigations should explore extreme diffusion conditions to comprehensively assess the impact of diffusion on object identification. For example, the first findings related to light diffuseness may be attributed to the limited range of manipulation, emphasizing the need to explore how other lighting-related factors interact with diffuseness.
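The 50% threshold described above can be read off the constant-stimuli data. As a minimal illustration (with invented size differences and proportions, and simple linear interpolation standing in for a fitted logistic or cumulative Gaussian):

```python
def threshold(levels, prop_correct, criterion=0.5):
    """Interpolate the stimulus level (e.g. cube-size difference) at
    which the proportion of correct detections crosses `criterion`.
    A full analysis would fit a psychometric function by maximum
    likelihood; this is the minimal linear-interpolation version."""
    pairs = sorted(zip(levels, prop_correct))
    for (x0, p0), (x1, p1) in zip(pairs, pairs[1:]):
        if p0 <= criterion <= p1:
            return x0 + (criterion - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("criterion not crossed within the tested range")

# hypothetical size differences (mm) and observed proportions correct
print(threshold([1, 2, 3, 4], [0.10, 0.40, 0.70, 0.95]))
```

Computing one such threshold per participant and condition yields the difference thresholds compared in the two-way ANOVA.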

Keywords: lighting, low vision, visual aid, object identification, psychophysical experiment

Procedia PDF Downloads 42
31 Shifting Contexts and Shifting Identities: Campus Race-related Experiences, Racial Identity, and Achievement Motivation among Black College Students during the Transition to College

Authors: Tabbye Chavous, Felecia Webb, Bridget Richardson, Gloryvee Fonseca-Bolorin, Seanna Leath, Robert Sellers

Abstract:

There has been recent renewed attention to Black students' experiences at predominantly White U.S. universities (PWIs), e.g., the #BBUM ("Being Black at the University of Michigan") and "I, too, am Harvard" social media campaigns and subsequent student protest activities nationwide. These campaigns illuminate how many minority students encounter challenges to their racial/ethnic identities as they enter PWI contexts. Students routinely report experiences such as being ignored or treated as a token in classes, receiving messages of low academic expectations from faculty and peers, being questioned about their academic qualifications or belonging, being excluded from academic and social activities, and being racially profiled and harassed in the broader campus community. Researchers have linked such racial marginalization and stigma experiences to student motivation and achievement. One potential mechanism is the impact of college experiences on students' identities, given the relevance of the college context for students' personal identity development, including personal belief systems around social identities salient in this context. However, little research examines the impact of the college context on Black students' racial identities. This study examined change in Black college students' (N=329) racial identity beliefs over the freshman year at three predominantly White U.S. universities. Using cluster analyses, we identified profile groups reflecting different patterns of stability and change in students' racial centrality (importance of race to overall self-concept), private regard (personal group affect/group pride), and public regard (perceptions of societal views of Blacks) from the beginning of the year (Time 1) to the end of the year (Time 2).
Multinomial logit regression analyses indicated that the racial identity change clusters were predicted by pre-college background (racial composition of high school and neighborhood) as well as by college-based experiences (racial discrimination, interracial friendships, and perceived campus racial climate). In particular, experiencing campus racial discrimination was related to high, stable centrality and to decreases in private regard and public regard. Perceiving campus racial climate norms of institutional support for intergroup interactions was related to maintaining low, and decreasing, private and public regard. Multivariate analysis of variance results showed change-cluster effects on achievement motivation outcomes at the end of students' academic year. Having high, stable centrality and high private regard was related to more positive outcomes overall (academic competence, positive academic affect, academic curiosity, and persistence). Students decreasing in private regard and public regard were particularly vulnerable to negative motivation outcomes. Findings support scholarship indicating both stability in racial identity beliefs and the importance of critical context transitions in racial identity development and adjustment outcomes among emerging adults. Findings are also consistent with research suggesting promotive effects of a strong, positive racial identity on student motivation, as well as research linking awareness of racial stigma to decreased academic engagement.
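The abstract does not specify the clustering algorithm used, but the profile-group idea can be sketched with a minimal k-means over Time 1/Time 2 belief scores. The data points and two-feature layout below are invented purely for illustration:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: group observations (e.g. tuples of Time 1 and
    Time 2 identity-belief scores) into k profile clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            groups[nearest].append(p)
        # recompute each center as the mean of its members
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g
                   else centers[i] for i, g in enumerate(groups)]
    return centers, groups

# invented (centrality_T1, centrality_T2) profiles: two "high stable"
# students and two "low stable" students
profiles = [(5.0, 5.0), (5.1, 4.9), (1.0, 1.0), (0.9, 1.1)]
centers, groups = kmeans(profiles, k=2)
print(sorted(centers))
```

In the actual study each student would contribute all six scores (centrality, private regard, public regard at Times 1 and 2), and cluster membership would then serve as the outcome in the multinomial logit models.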

Keywords: diversity, motivation, learning, ethnic minority achievement, higher education

Procedia PDF Downloads 492
30 Critiquing Israel as Child Abuse: How Colonial White Feminism Disrupts Critical Pedagogies of Culturally Responsive and Relevant Practices and Inclusion through Ongoing and Historical Maternalism and Neoliberal Settler Colonialism

Authors: Wafaa Hasan

Abstract:

In May of 2022, Palestinian parents in Toronto, Canada, became aware that educators and staff in the Toronto District School Board (TDSB), the largest school board in Canada, were attempting to include the International Holocaust Remembrance Alliance (IHRA) definition of antisemitism in the board's Child Abuse and Neglect Policy. The idea was that if students were to express any form of antisemitism, as defined by the IHRA, then an investigation could follow with Child Protective Services (CPS). That is, the student's parents could be reported to the state and investigated for custodial rights to their children. The TDSB has set apparent goals for "Decolonizing Pedagogy" ("TDSB Equity Leadership Competencies"), Culturally Responsive and Relevant Practices (CRRP), and inclusive education. These goals promote the centering of colonized, racialized, and marginalized voices. CRRP cannot be effective without the application of anti-racist and settler colonial analyses. For CRRP to be effective, school boards need a comprehensive understanding of the ways in which the vilification of Palestinians operates through anti-Indigenous and white supremacist systems and logics. Otherwise, their inclusion will always be in tension with the inclusion of settler colonial agendas and worldviews. Feminist maternalism frames racial mothering as degenerate (viewing the contributions of racialized students and their parents as products of primitive and violent cultures) and also indirectly inhibits the actualization of the tenets of CRRP and inclusive education through its extensions into the welfare state and public education. The contradiction between the tenets of CRRP and settler colonial systems of erasure and repression is resolved by the continuation of tactics to 1) force assimilation, 2) punish those who push back against that assimilation, and 3) literally fragment the familial and community structures of racialized students, educators, and parents.
This paper draws on interdisciplinary (history, philosophy, anthropology) critiques of white feminist "maternalism" from the 19th century onwards in North America and Europe (Jacobs, Weber), as well as "anti-racist education" theory (Dei) and, more specifically, "culturally responsive learning" (Muhammad) and "bandwidth" pedagogy theory (Verschelden), to make its claims. This research contributes to vibrant debates about anti-racist and decolonial pedagogies in public education systems globally. The paper also documents first-hand interviews with, and experiences of, diasporic Palestinian mothers and motherhoods, and situates their experiences within longstanding histories of white feminist maternalist (and eugenicist) politics. This informal qualitative data from "participatory conversations" (Swain) is situated within a set of formal interview data collected from Palestinian women in the West Bank (approved by the McMaster University Humanities Research Ethics Board) relating to white feminist maternalism in the peace and dialogue industry.

Keywords: decolonial feminism, maternal feminism, anti-racist pedagogies, settler colonial studies, motherhood studies, pedagogy theory, cultural theory

Procedia PDF Downloads 45
29 Coastal Foodscapes as Nature-Based Coastal Regeneration Systems

Authors: Gulce Kanturer Yasar, Hayriye Esbah Tuncay

Abstract:

Cultivated food production systems have coexisted harmoniously with nature for thousands of years through ancient techniques. Built on this experience, experimentation, and discovery, these culturally embedded methods have evolved to sustain food production, restore ecosystems, and adapt to nature. In this era, as we seek solutions to food security challenges, enhancing and repairing our food production systems is crucial, making them more resilient to future disasters without harming the ecosystem. Instead of unsustainable conventional systems with ongoing destructive effects, we must investigate innovative and restorative production systems that integrate ancient wisdom and technology. Whether we consider agricultural fields, pastures, forests, coastal wetland ecosystems, or lagoons, it is crucial to harness the potential of these natural resources in addressing future global challenges, fostering both socio-economic resilience and ecological sustainability through the strategic organization of food production. When thoughtfully designed and managed, marine-based food production has the potential to function as a living infrastructure system that addresses social and environmental challenges, despite its known adverse impacts on the environment and local economies. These areas are also stages of daily life: vibrant hubs where local culture is produced and shared, contributing to the distinctive rural character of coastal settlements and exhibiting numerous spatial expressions of a public nature. Throughout human history, indigenous communities have engaged in sustainable production practices of this kind, providing goods for food, trade, culture, and the environment for ages. Ecosystem restoration and socio-economic resilience can be achieved by combining production techniques based on the ecological knowledge developed by indigenous societies with modern technologies.
Coastal lagoons are highly productive coastal features that provide various natural services and societal values. They are especially vulnerable to the severe physical, ecological, and social impacts of changing, challenging global conditions because of their placement within the coastal landscape. Coastal lagoons are crucial in sustaining fisheries productivity, providing storm protection, supporting tourism, and offering other natural services that hold significant value for society. Although there is considerable literature on the physical and ecological dimensions of lagoons, much less focuses on their economic and social values. This study discusses the potential for coastal lagoons to become both ecologically sustainable and socio-economically resilient while maintaining their productivity, by combining local techniques with modern technologies. The case study presents Turkey's traditional aquaculture method, the "dalyan," predominantly operated by small-scale farmers in coastal lagoons. Due to human, ecological, and economic factors, dalyans are losing their landscape characteristics and efficiency. These 1000-year-old techniques, rooted in centuries of traditional and agroecological knowledge, are under threat from tourism, urbanization, and unsustainable agricultural practices. As a result, the number of active dalyans has diminished from 29 to approximately 4-5. To address the adverse socio-economic and ecological consequences for Turkey's coastal areas, it is essential to conserve dalyans by protecting their indigenous practices while incorporating contemporary methods. This study seeks to generate scenarios that envision the potential ways protection and development can manifest within the case study areas.

Keywords: coastal foodscape, lagoon aquaculture, regenerative food systems, watershed food networks

Procedia PDF Downloads 44
28 Development of Chitosan/Dextran Gelatin Methacrylate Core/Shell 3D Scaffolds and Protein/Polycaprolactone Melt Electrowriting Meshes for Tissue Regeneration Applications

Authors: J. D. Cabral, E. Murray, P. Turner, E. Hewitt, A. Ali, M. McConnell

Abstract:

Worldwide demand for organ replacement and tissue regeneration is progressively increasing. Three-dimensional (3D) bioprinting, where a physical construct is produced using computer-aided design, is a promising tool to advance the tissue engineering and regenerative medicine fields. In this paper, we describe two different approaches to developing 3D bioprinted constructs for use in tissue regeneration. Bioink development is critical to the 3D biofabrication of functional, regenerative tissues. Hydrogels, cross-linked macromolecules that absorb large amounts of water, have received widespread interest as bioinks due to their relevant soft tissue mechanics, biocompatibility, and tunability. In turn, not only is bioink optimisation crucial, but the creation of vascularized tissues remains a key challenge for the successful fabrication of thicker, more clinically relevant bioengineered tissues. Among the various methodologies, cell-laden hydrogels are regarded as a favorable approach, and when combined with novel core/shell 3D bioprinting technology, an innovative strategy towards creating new vessel-like structures. In this work, we investigate this cell-based approach by using human umbilical vein endothelial cells (HUVECs) entrapped in a viscoelastic chitosan/dextran (CD)-based core hydrogel, printed simultaneously along with a gelatin methacrylate (GelMA) shell. We have expanded beyond our previously reported FDA-approved, commercialised, post-surgical CD hydrogel, Chitogel®, by functionalizing it with cell adhesion and proteolytic peptides in order to promote the growth of bone marrow-derived mesenchymal stem cells (immortalized BMSC cell line, hTERT) and HUVECs. The biocompatibility and biodegradability of these cell lines in a 3D bioprinted construct are demonstrated. Our studies show that particular peptide combinations crosslinked within the CD hydrogel were found to increase in vitro growth of BMSCs and HUVECs by more than two-fold.
These gels were then used as a core bioink combined with the more mechanically robust, UV-irradiated GelMA shell bioink to create 3D regenerative, vessel-like scaffolds with high print fidelity. In addition, a novel 3D printing technique, melt electrowriting (MEW), which allows fabrication of micrometer fibre meshes, was used to 3D print polycaprolactone (PCL) scaffolds blended with the bioactive milk proteins lactoferrin (LF) and whey protein (WP) for potential skin regeneration applications. MEW milk protein/PCL scaffolds exhibited high porosity, low overall biodegradation, and rapid protein release. Human fibroblasts and keratinocytes were seeded onto the scaffolds. Scaffolds containing high concentrations of LF and combined proteins (LF+WP) showed improved cell viability over time and a significant increase in keratinocyte (HaCaT) and fibroblast (normal human dermal fibroblast, NHDF) cell migration and proliferation compared to PCL-only scaffolds. In conclusion, our studies indicate that a peptide-functionalized CD hydrogel bioink reinforced with a GelMA shell is biocompatible, biodegradable, and an appropriate cell delivery vehicle in the creation of regenerative 3D constructs. This research highlights two scaffolds, made using two different 3D printing techniques and combining natural and synthetic biomaterial components, as potential chronic wound treatments.

Keywords: biomaterials, hydrogels, regenerative medicine, 3D bioprinting

Procedia PDF Downloads 247
27 Familiarity with Intercultural Conflicts and Global Work Performance: Testing a Theory of Recognition Primed Decision-Making

Authors: Thomas Rockstuhl, Kok Yee Ng, Guido Gianasso, Soon Ang

Abstract:

Two meta-analyses show that intercultural experience is not related to intercultural adaptation or performance in international assignments. These findings have prompted calls for a deeper grounding of research on international experience in the phenomenon of global work. Two issues, in particular, may limit current understanding of the relationship between international experience and global work performance. First, intercultural experience may be too broad a construct to sufficiently capture the essence of global work, which to a large part involves sensemaking and managing intercultural conflicts. Second, the psychological mechanisms through which intercultural experience affects performance remain under-explored, resulting in a poor understanding of how experience is translated into learning and performance outcomes. Drawing on recognition primed decision-making (RPD) research, the current study advances a cognitive processing model to highlight the importance of intercultural conflict familiarity. Compared to intercultural experience, intercultural conflict familiarity is a more targeted construct that captures individuals’ previous exposure to dealing with intercultural conflicts. Drawing on RPD theory, we argue that individuals’ intercultural conflict familiarity enhances their ability to make accurate judgments and generate effective responses when intercultural conflicts arise. In turn, the ability to make accurate situation judgments and effective situation responses is an important predictor of global work performance. A relocation program within a multinational enterprise provided the context to test these hypotheses using a time-lagged, multi-source field study. Participants were 165 employees (46% female; with an average of 5 years of global work experience) from 42 countries who relocated from country offices to regional offices as part of a global restructuring program.
Within the first two weeks of transfer to the regional office, employees completed measures of their familiarity with intercultural conflicts, cultural intelligence, cognitive ability, and demographic information. They also completed an intercultural situational judgment test (iSJT) to assess their situation judgment and situation response. The iSJT comprised four validated multimedia vignettes of challenging intercultural work conflicts and prompted employees to provide protocols of their situation judgment and situation response. Two research assistants, trained in intercultural management but blind to the study hypotheses, coded the quality of employees’ situation judgment and situation response. Three months later, supervisors rated employees’ global work performance. Results using multilevel modeling (vignettes nested within employees) support the hypotheses that greater familiarity with intercultural conflicts is positively associated with better situation judgment, and that situation judgment mediates the effect of intercultural familiarity on situation response quality. Aggregated situation judgment and situation response quality also both predicted supervisor-rated global work performance. Theoretically, our findings first highlight the important but under-explored role of familiarity with intercultural conflicts, a shift in attention from the general nature of international experience assessed in terms of the number and length of overseas assignments. Second, our cognitive approach premised on RPD theory offers a new theoretical lens to understand the psychological mechanisms through which intercultural conflict familiarity affects global work performance. Third, and importantly, our study contributes to the global talent identification literature by demonstrating that the cognitive processes engaged in resolving intercultural conflicts predict actual performance in the global workplace.
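The hypothesized chain (familiarity improves situation judgment, which in turn improves situation response) can be sketched as a simple mediation decomposition. The following is an illustrative simulation on synthetic data only; the study itself used multilevel modeling with vignettes nested within employees, and every coefficient below is an invented assumption, not a reported estimate.

```python
import numpy as np

# Synthetic data mimicking the study's structure: familiarity -> judgment
# (a-path), judgment -> response (b-path). All effect sizes are invented.
rng = np.random.default_rng(0)
n = 165  # sample size reported in the abstract

familiarity = rng.normal(size=n)
judgment = 0.5 * familiarity + rng.normal(scale=0.5, size=n)   # a-path
response = 0.6 * judgment + rng.normal(scale=0.5, size=n)      # b-path

def slope(x, y):
    """OLS slope of y regressed on x (np.polyfit returns slope first)."""
    return np.polyfit(x, y, 1)[0]

a = slope(familiarity, judgment)   # familiarity -> judgment
b = slope(judgment, response)      # judgment -> response
c = slope(familiarity, response)   # total effect of familiarity
indirect = a * b                   # effect transmitted through judgment

# Under this data-generating process the total effect is carried almost
# entirely by the indirect (mediated) path, mirroring the reported pattern.
```

A single-level sketch like this ignores the vignette nesting; a faithful reanalysis would fit a mixed model with employee-level random effects instead of pooled OLS.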

Keywords: intercultural conflict familiarity, job performance, judgment and decision making, situational judgment test

Procedia PDF Downloads 153
26 Disabled Graduate Students’ Experiences and Vision of Change for Higher Education: A Participatory Action Research Study

Authors: Emily Simone Doffing, Danielle Kohfeldt

Abstract:

Disabled students are underrepresented in graduate-level degree enrollment and completion, and there is limited research on disabled students' progression during the pandemic. Disabled graduate students (DGS) face unique interpersonal and institutional barriers, yet little research explores these barriers, buffering facilitators, and aids to academic persistence. This study adopts an asset-based, embodied disability approach using the critical pedagogy theoretical framework instead of a deficit research approach. The Participatory Action Research (PAR) paradigm, the critical pedagogy theoretical framework, and emancipatory disability research share the same purpose: creating a socially just world through reciprocal learning. This study is one of few, if not the first, to center solely on DGS’ lived understanding using a PAR epistemology. With a PAR paradigm, participants and investigators work democratically as a research team at every stage of the research process. PAR has individual and systemic outcomes. PAR lessens the researcher-participant power gap and elevates a marginalized community’s knowledge as expertise for local change. PAR and critical pedagogy work toward enriching everyone involved with empowerment, civic engagement, knowledge proliferation, socio-cultural reflection, skills development, and active meaning-making. The PAR process unveils the tensions between disability and graduate school in policy and practice during the pandemic; likewise, institutional and ideological tensions influence the PAR process. This project is recruiting 10 DGS until September through purposive and snowball sampling. DGS will collectively practice praxis during four monthly focus groups in the fall 2023 semester. Participant researchers can attend a focus group or an interview, both with field notes. September will be our orientation and first monthly meeting.
It will include access needs check-ins, ice breakers, consent form review, a group agreement, PAR introduction, research ethics discussion, research goals, and potential research topics. October and November will be available for meetings for dialogues about lived experiences during our collaborative data collection. Our sessions can be semi-structured with “framing questions,” which would be revised together. Field notes include observations that cannot be captured through audio. December will focus on local social action planning and dissemination. Finally, in January, there will be a post-study focus group for students' reflections on their experiences of PAR. Iterative analysis methods include transcribed audio, reflexivity, memos, thematic coding, analytic triangulation, and member checking. This research follows qualitative rigor and quality criteria: credibility, transferability, confirmability, and psychopolitical validity. Results include potential tension points, social action, individual outcomes, and recommendations for conducting PAR. Tension points have three components: dubious practices, contestable knowledge, and conflict. The dissemination of PAR recommendations will aid and encourage researchers to conduct future PAR projects with the disabled community. Identified stakeholders will be informed of DGS’ insider knowledge to drive social sustainability.

Keywords: participatory action research, graduate school, disability, higher education

Procedia PDF Downloads 37
25 Anajaa-Visual Substitution System: A Navigation Assistive Device for the Visually Impaired

Authors: Juan Pablo Botero Torres, Alba Avila, Luis Felipe Giraldo

Abstract:

Independent navigation and mobility through unknown spaces pose a challenge for the autonomy of visually impaired people (VIP), who have relied on traditional assistive tools like the white cane and trained dogs. However, emerging visually assistive technologies (VAT) have proposed several human-machine interfaces (HMIs) that could improve VIP’s ability for self-guidance. Here, we introduce the design and implementation of a visually assistive device, Anajaa – Visual Substitution System (AVSS). This system integrates ultrasonic sensors with custom electronics and computer vision models (convolutional neural networks) in order to achieve a robust system that acquires information about the surrounding space and transmits it to the user in an intuitive and efficient manner. AVSS consists of two modules, the sensing module and the actuation module, which are fitted to a chest mount and belt that communicate via Bluetooth. The sensing module was designed for the acquisition and processing of proximity signals provided by an array of ultrasonic sensors. The distribution of these within the chest mount allows an accurate representation of the surrounding space, discretized into three levels of proximity ranging from 0 to 6 meters. Additionally, this module is fitted with an RGB-D camera used to detect potentially threatening obstacles, like staircases, using a convolutional neural network specifically trained for this purpose. Subsequently, the depth data is used to estimate the distance between the stairs and the user. The information gathered from this module is then sent to the actuation module, which creates an HMI by means of a 3x2 array of vibration motors that make up the tactile display and allow the system to deliver haptic feedback. The actuation module uses vibrational messages (tactones), varying in both amplitude and frequency, to deliver different awareness levels according to the proximity of the obstacle.
This enables the system to deliver an intuitive interface. Both modules were tested under lab conditions, and the HMI was additionally tested with a focus group of VIP. The lab testing was conducted in order to establish the processing speed of the computer vision algorithms. This experimentation determined that the model can process 0.59 frames per second (FPS); this is considered an adequate processing speed, taking into account that the walking speed of VIP is 1.439 m/s. In order to test the HMI, we conducted a focus group composed of two females and two males between the ages of 35 and 65 years. The subject selection was aided by the Colombian Cooperative of Work and Services for the Sightless (COOTRASIN). We analyzed the learning process of the haptic messages throughout five experimentation sessions using two metrics: message discrimination and localization success. These correspond to the ability of the subjects to recognize different tactones and to locate them within the tactile display; both were calculated as the mean across all subjects. Results show that the focus group achieved message discrimination of 70% and localization success of 80%, demonstrating how the proposed HMI leads to the appropriation and understanding of the feedback messages, enabling users’ awareness of their surrounding space.
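The sensing-to-actuation pipeline described above (distance reading, discretization into three proximity levels over 0-6 m, tactone selection) can be sketched as follows. This is not the authors' firmware: the level boundaries and the amplitude/frequency values per tactone are invented assumptions used only to illustrate the mapping logic.

```python
def proximity_level(distance_m: float) -> int:
    """Discretize a 0-6 m ultrasonic reading into levels 0 (near) to 2 (far).

    The 2 m / 4 m boundaries are hypothetical; the abstract only states
    that the 0-6 m range is split into three proximity levels.
    """
    if not 0 <= distance_m <= 6:
        raise ValueError("reading outside the 0-6 m sensing range")
    if distance_m < 2:
        return 0   # near: strongest warning
    if distance_m < 4:
        return 1   # intermediate
    return 2       # far: gentlest cue

def tactone(level: int) -> tuple[float, int]:
    """Return a hypothetical (amplitude 0-1, frequency in Hz) pair.

    Tactones vary in both amplitude and frequency so that closer
    obstacles produce stronger, faster vibration; the exact values
    here are placeholders.
    """
    table = {0: (1.0, 250), 1: (0.6, 175), 2: (0.3, 100)}
    return table[level]

# Example: an obstacle at 1.5 m falls in the nearest band and would be
# rendered with the strongest tactone on the 3x2 vibration-motor display.
amp, freq = tactone(proximity_level(1.5))
```

In a real implementation, the chosen tactone would then be routed to one of the six motors according to the obstacle's direction, which is what the localization-success metric evaluates.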

Keywords: computer vision on embedded systems, electronic travel aids, human-machine interface, haptic feedback, visual assistive technologies, vision substitution systems

Procedia PDF Downloads 60
24 A Low-Cost Disposable PDMS Microfluidic Cartridge with Reagent Storage Silicone Blisters for Isothermal DNA Amplification

Authors: L. Ereku, R. E. Mackay, A. Naveenathayalan, K. Ajayi, W. Balachandran

Abstract:

Over the past decade, the increase in sexually transmitted infections (STIs), especially in the developing world where medical testing is costly and often insufficient, has given rise to the need for a rapid, low-cost, disposable point-of-care medical diagnostic that, most significantly, reproduces results equivalent to those achieved in centralised laboratories. This paper presents the development of a disposable PDMS microfluidic cartridge incorporating blisters filled with the reagents required for isothermal DNA amplification in clinical diagnostics and point-of-care testing. To circumvent the need for complex external microfluidic pumps, on-chip pressurised fluid reservoirs are implemented using finger actuation and blister storage. The fabrication of the blisters takes into consideration three factors: material characteristics, fluid volume, and structural design. Silicone rubber is the chosen material due to its good chemical stability, considerable tear resistance, and moderate tension/compression strength. Fluid capacity and structural form go hand in hand: the reagent volume needed for the assay determines the volume of the blisters, whereas the structure has to be designed to require low compression stress when deformed for fluid expulsion. Furthermore, the top and bottom sections of the blisters are embedded with miniature magnets of opposite polarity at a defined parallel distance. These magnets are needed to lock or restrain the blisters when fully compressed so as to prevent unwanted backflow as a result of elasticity. The integrated chip is bonded onto a large microscope glass slide (50 mm x 75 mm). Each part is manufactured using a 3D printed mould designed using Solidworks software. Die-casting is employed, using 3D printed moulds, to form the deformable blisters by forcing a proprietary liquid silicone rubber through the positive mould cavity.
The set silicone rubber is removed from the cast, prefilled with liquid reagent, and then sealed with a thin (0.3 mm) burstable layer of recast silicone rubber. The main microfluidic cartridge is fabricated using classical soft lithographic techniques. The cartridge incorporates microchannel circuitry, a mixing chamber, an inlet port, an outlet port, a reaction chamber, and a waste chamber. Polydimethylsiloxane (PDMS, QSil 216) is mixed at a 10:1 ratio, degassed using a centrifuge, and then poured after the prefilled blisters are correctly positioned on the negative mould. Heat treatment at about 50 °C to 60 °C in the oven for about 3 hours is needed to achieve curing. The latter chip production stage involves bonding the cured PDMS to the glass slide. A plasma corona treater device, BD20-AC (Electro-Technic Products Inc., US), is used to activate the PDMS and glass slide before they are joined, adequately compressed together, and left in the oven overnight to ensure bonding. Two blisters in total are needed for experimentation: the first holds a wash buffer to remove any remaining cell debris and unbound DNA, while the second contains 100 µL of amplification reagents. This paper will present results of chemical cell lysis, extraction using a biopolymer paper membrane, and isothermal amplification on a low-cost platform using the finger-actuated blisters for reagent storage. The platform has been shown to detect 1×10⁵ copies of Chlamydia trachomatis using Recombinase Polymerase Amplification (RPA).

Keywords: finger actuation, point of care, reagent storage, silicone blisters

Procedia PDF Downloads 352