Search results for: computer game-based learning

97 Process of Production of an Artisanal Brewery in a City in the North of the State of Mato Grosso, Brazil

Authors: Ana Paula S. Horodenski, Priscila Pelegrini, Salli Baggenstoss

Abstract:

The brewing industry built around artisanal concepts serves a specific market with diversified production and has been gaining ground nationally in Brazil, including in the Amazon region. This growth is driven by more demanding consumers with diversified tastes who want to try new types of beer and enjoy products with new aromas and flavors, in contrast to what is widely offered by the big industrial brands. Using qualitative research methods, this study investigated how the production of a craft brewery in a city in the northern part of the State of Mato Grosso (Brazil) is managed, providing knowledge of production processes and strategies in the industry. With the efficient use of resources, it is possible to achieve the necessary quality, improve performance, and differentiate the company, as well as analyze the best management model. The research is descriptive with a qualitative approach through a case study. For data collection, a semi-structured interview was prepared covering three areas: characterization of the microbrewery, the artisanal beer production process, and the company's supply chain management. Production processes were also observed during technical visits. The study verified that the brewery develops preventive maintenance strategies for inputs, machines, and equipment so that the quality of the product and of the production process is maintained. It was observed that the distance from supply centers forces process and supply chain management to be planned over a longer horizon so that delivery of the final product remains satisfactory. The brewery's production line comprises machines and equipment that allow control over product quality; according to the manager, the available equipment meets demand given the industry's production capacity and consumer market. The study also highlights one of the challenges small breweries face against the market giants: legislation, which classifies microbreweries as producers of alcoholic beverages. As a result, the micro and small business segment is taxed like the major producers, which enjoy advantages in purchasing large batches of raw materials and receive tax incentives for being large employers and taxpayers. It was possible to observe that the supply chain management system relies on spreadsheets and notes kept manually, which could be simplified with a computer program to streamline procedures and reduce the risks and failures of the manual process, as sketched below. The control of waste and effluents produced by the industry is outsourced and meets current needs. Finally, the results showed that the brewery uses preventive maintenance as a production strategy, which enables better conditions for the production and quality of artisanal beer. Quality is directly related to the satisfaction of the final consumer and is pursued throughout the production process through the selection of better inputs, effective production processes, and good relationships with commercial partners.
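
The abstract's observation that the manually kept spreadsheets could be replaced by software invites a concrete illustration. The sketch below is purely hypothetical (the brewery's actual records and reorder rules are not described in the study): a minimal stock-control script that flags inputs for early reordering, reflecting the long supplier lead times noted above.

```python
from dataclasses import dataclass

@dataclass
class InputStock:
    name: str
    on_hand: float        # units currently in stock
    daily_use: float      # average units consumed per day
    lead_time_days: int   # delivery time from the (distant) supplier
    safety_days: int = 7  # extra buffer against delivery delays

    def reorder_point(self) -> float:
        # Stock level at which a new order must be placed so that it
        # arrives before the safety buffer is exhausted.
        return self.daily_use * (self.lead_time_days + self.safety_days)

    def needs_reorder(self) -> bool:
        return self.on_hand <= self.reorder_point()

# Hypothetical inputs, for illustration only.
inputs = [
    InputStock("pale malt (kg)", on_hand=120, daily_use=8, lead_time_days=14),
    InputStock("hops (kg)", on_hand=5, daily_use=0.4, lead_time_days=21),
]
for item in inputs:
    if item.needs_reorder():
        print(f"Reorder {item.name}: {item.on_hand} on hand, "
              f"reorder point {item.reorder_point():.0f}")
```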

Keywords: artisanal brewery, production management, production processes, supply chain

Procedia PDF Downloads 120
96 Accountability of Artificial Intelligence: An Analysis Using Edgar Morin’s Complex Thought

Authors: Sylvie Michel, Sylvie Gerbaix, Marc Bidan

Abstract:

Can artificial intelligence (AI) be held accountable for its detrimental impacts? This question gains heightened relevance given AI's pervasive reach across various domains, magnifying its power and potential. The expanding influence of AI raises fundamental ethical inquiries, primarily centering on biases, responsibility, and transparency. This encompasses discriminatory biases arising from algorithmic criteria or data, accidents attributed to autonomous vehicles or other systems, and the imperative of transparent decision-making. This article aims to stimulate reflection on AI accountability, denoting the necessity to elucidate the effects it generates. Accountability comprises two integral aspects: adherence to legal and ethical standards, and the imperative to elucidate the underlying operational rationale. The objective is to initiate a reflection on the obstacles to this "accountability" in the face of the complexity of AI systems and their effects, and then to mobilize Edgar Morin's complex thought to encompass and face the challenges of this complexity. The first contribution is to point out the challenges posed by the complexity of AI, with accountability fractured among a myriad of human and non-human actors, such as software and equipment, which ultimately contribute to the decisions taken and which are multiplied in the case of AI. Accountability faces three challenges resulting from the complexity of the ethical issues combined with the complexity of AI. The challenge of the non-neutrality of algorithmic systems, as fully ethically non-neutral actors, is put forward by a revealing-ethics approach that calls for assigning responsibilities to these systems. The challenge of the dilution of responsibility is induced by the multiplicity of, and distance between, the actors: decision-making is split between developers, who feel they fulfill their duty by strictly respecting the requests they receive, and management, which does not consider itself responsible for technology-related flaws. Finally, accountability is confronted with the challenge of the transparency of complex and scalable algorithmic systems, non-human actors that self-learn via big data. A second contribution involves leveraging E. Morin's principles, providing a framework to grasp the multifaceted ethical dilemmas and subsequently paving the way for establishing accountability in AI. When addressing the ethical challenge of biases, the "hologrammatic" principle underscores the imperative of acknowledging the non-ethical-neutrality of algorithmic systems, inherently imbued with the values and biases of their creators and of society. The "dialogic" principle advocates for the responsible consideration of ethical dilemmas, encouraging the integration of complementary and contradictory elements in solutions from the very inception of the design phase. The principle of organizational recursiveness, akin to the "transparency" of the system, promotes a systemic analysis that accounts for induced effects and guides the incorporation of modifications into the system to rectify its drifts. In conclusion, this contribution serves as an inception for contemplating the accountability of artificial intelligence systems despite the evident ethical implications and potential deviations. Edgar Morin's principles, providing a lens through which to contemplate this complexity, offer valuable perspectives to address these challenges concerning accountability.

Keywords: accountability, artificial intelligence, complexity, ethics, explainability, transparency, Edgar Morin

Procedia PDF Downloads 63
95 Industrial Production of the Saudi Future Dwelling: A Saudi Volumetric Solution for Single Family Homes, Leveraging Industry 4.0 with Scalable Automation, Hybrid Structural Insulated Panels Technology and Local Materials

Authors: Bandar Alkahlan

Abstract:

The King Abdulaziz City for Science and Technology (KACST) created the Saudi Future Dwelling (SFD) initiative to identify, localize, and commercialize a scalable home manufacturing technology suited to deployment across the Kingdom of Saudi Arabia (KSA). This paper outlines the journey, the creation of the international project delivery team, the product design, the selection of the process technologies, and the outcomes. A target was set to remove 85% of the construction and finishing processes from the building site, as these activities can be completed more efficiently in a factory environment. Integral to the SFD initiative, therefore, is the successful industrialization of the home building process using appropriate technologies, automation, robotics, and manufacturing logistics. The technologies proposed for the SFD housing system are designed to be energy efficient, economical, and fit for purpose from a Saudi cultural perspective, and to minimize the use of concrete, relying mainly on locally available Saudi natural materials derived from the local resource industries. To this end, the building structure comprises a hybrid system of structural insulated panels (SIP) combined with a light-gauge steel framework manufactured in a large-format panel system. The paper traces the investigative process and the steps completed by the project team during the selection process. As part of the SFD project, a pathway was mapped out that included a proof-of-concept prototype housing module and the set-up and commissioning of a lab-factory, complete with all production machinery and equipment necessary to simulate a full-scale production environment. The prototype housing module was used to validate and inform current and future product design as well as manufacturing process decisions. A description of the prototype design and manufacture is given, along with valuable lessons derived from the build and how these results were used to enhance the SFD project. The industrial engineering concepts and the lab-factory detailed design and layout are described in the paper, along with the shop floor I.T. management strategy. Special attention was paid to showcasing all technologies within the lab-factory as part of the engagement strategy with private investors, to leverage the SFD project into large-scale factories throughout the Kingdom. A detailed analysis is included of the process surrounding the design, specification, and procurement of the manufacturing machinery, equipment, and logistical manipulators required to produce the SFD housing modules. The manufacturing machinery comprised a combination of standardized and bespoke equipment from a wide range of international suppliers. The paper describes the selection process, pre-ordering trials and studies, and, in some cases, the requirement for additional research and development by the equipment suppliers in order to achieve the SFD objectives. A set of conclusions is drawn describing the results achieved thus far, along with a list of recommended ongoing operational tests, enhancements, research, and development aimed at achieving full-scale engagement with private sector investment and roll-out of the SFD project across the Kingdom.

Keywords: automation, dwelling, manufacturing, product design

Procedia PDF Downloads 121
94 i2kit: A Tool for Immutable Infrastructure Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservice architectures are increasingly common in distributed cloud applications due to their advantages for software composition, development speed, release cycle frequency, and time to market of business logic. On the other hand, these architectures also introduce challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution, and isolation of processes. However, other issues remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing, or data persistency (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos, or Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable infrastructure approach, since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error occurs in the control layer, affecting running applications, specific expertise is required for ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable infrastructure pattern, where the virtual machine is the unit of deployment. The input for i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers. Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is set in other microservices using an environment variable, providing service discovery. The toolkit i2kit reuses the best ideas from container technology to solve problems like reproducible environments, process isolation, and software distribution, and at the same time relies on mature, proven cloud vendor technology for networking, load balancing, and persistency. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open-source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools have more flexibility for resource allocation optimization, we argue that adding a new control layer carries more important disadvantages. Resource allocation is greatly improved by using linuxkit, which introduces a very small footprint (around 35MB). Also, the system is more secure, since linuxkit installs the minimum set of dependencies needed to run containers. The toolkit i2kit is currently under development at the IMDEA Software Institute.
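
As an illustration of the transformation the prototype performs, the sketch below maps a hypothetical declarative microservice definition (i2kit's real input syntax is not given in the abstract) to a skeleton CloudFormation template; the AMI placeholder stands in for the image built on the fly by linuxkit.

```python
# Minimal sketch with a hypothetical input format; the real i2kit syntax
# and generated template details are not specified in the abstract.
import json

service = {
    "name": "api",
    "replicas": 2,
    "containers": [  # a "pod": containers co-located on the same VM image
        {"image": "myorg/api:1.4.2", "port": 8080},
        {"image": "myorg/log-agent:0.9", "port": None},
    ],
}

def to_cloudformation(svc: dict) -> dict:
    """Map one microservice to a launch template + load balancer skeleton."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            f"{svc['name']}LaunchTemplate": {
                "Type": "AWS::EC2::LaunchTemplate",
                "Properties": {
                    # ImageId would be the AMI built on the fly by linuxkit.
                    "LaunchTemplateData": {"ImageId": "ami-PLACEHOLDER"},
                },
            },
            f"{svc['name']}LoadBalancer": {
                "Type": "AWS::ElasticLoadBalancingV2::LoadBalancer",
                "Properties": {"Type": "application"},
            },
        },
    }

print(json.dumps(to_cloudformation(service), indent=2))
```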

Keywords: container, deployment, immutable infrastructure, microservice

Procedia PDF Downloads 179
93 Solutions for Food-Safe 3D Printing

Authors: Geremew Geidare Kailo, Igor Gáspár, András Koris, Ivana Pajčin, Flóra Vitális, Vanja Vlajkov

Abstract:

Three-dimensional (3D) printing, a very popular additive manufacturing technology, has recently undergone rapid growth and replaced conventional technology from prototyping to the production of end-user parts and products. 3D printing involves a digital manufacturing machine that produces three-dimensional objects according to designs created by the user via 3D modeling or computer-aided design/manufacturing (CAD/CAM) software. The most popular 3D printing system is Fused Deposition Modeling (FDM), also called Fused Filament Fabrication (FFF). A 3D-printed object is considered food safe if it can have direct contact with food without any toxic effects, even after cleaning, storing, and reusing the object. This work analyzes the processing timeline of the filament (the material for 3D printing) from unboxing to extrusion through the nozzle. It is an important task to analyze the growth of bacteria on the 3D-printed surface and in the gaps between the layers. By default, a 3D-printed object is not food safe after longer usage and direct contact with food (even when printed from food-safe filaments), but there are solutions to this problem. The aim of this work was to evaluate 3D-printed objects from different perspectives of food safety. The first was testing antimicrobial 3D printing filaments from a food safety aspect, since a 3D-printed object in the food industry may have direct contact with food; the main purpose here is to reduce the microbial load on the surface of a 3D-printed part. Coating with epoxy resin was investigated, too, to see its effect on mechanical strength, thermal resistance, surface smoothness, and food safety (cleanability). Another aim of this study was to test new temperature-resistant filaments and the effect of high temperature on 3D-printed materials, to see whether they can be cleaned with boiling or similar high-temperature treatment. This work proved that all three methods can improve the food safety of a 3D-printed object, but the size of the effect varies. The best results were obtained by coating with epoxy resin: the object became as cleanable as any injection-molded plastic object with a smooth surface. Very good results were also obtained by boiling the objects, and it is encouraging that more and more special filaments now have food-safe certificates and can withstand boiling temperatures. Using antibacterial filaments reduced bacterial colonies to one fifth, but the biggest advantage of this method is that it requires no post-processing: the object is ready straight out of the 3D printer. Acknowledgements: The research was supported by the Hungarian and Serbian bilateral scientific and technological cooperation project funded by the Hungarian National Office for Research, Development and Innovation (NKFI, 2019-2.1.11-TÉT-2020-00249) and the Ministry of Education, Science and Technological Development of the Republic of Serbia. The authors acknowledge the Hungarian University of Agriculture and Life Sciences' Doctoral School of Food Science for its support of this study.

Keywords: food safety, 3D printing, filaments, microbial, temperature

Procedia PDF Downloads 142
92 The Soviet Union-Style of Urban Planning in China: Historical Review and Enlightenment from the Output Mode of Contemporary Cooperative Parks

Authors: Yifeng Shi, Xingping Wang

Abstract:

Soviet-style urban planning has had a broad and profound influence on China's urban planning system. Studying the continuity and development of Soviet planning in China helps to change the current embarrassing situation of 'first-hand planning practice, second-hand planning theory' and facilitates the establishment of China's own urban planning theory from the planning source, especially for the overseas cooperation parks rich in 'Chinese characteristics'. In practice, as the world's major infrastructure country, China is exporting to the world, especially to countries along 'the Belt and Road', a development model featuring cooperation parks with Chinese characteristics. It is therefore of great significance to evaluate and summarize objectively and rationally the experience of Soviet-style planning in China's development, removing ideological factors and extracting the positive ones to carry forward in overseas cooperation parks. This article briefly reviews the Soviet influence on urban planning after the founding of the People's Republic of China and divides it into four periods: guidance, internalization and absorption, selective learning, and decline. The influence includes production-oriented planning and planning concepts that continue to be implemented; the establishment of the basic framework of regional planning, master planning, and detailed planning; the homogenized cellular structure of space; as well as planning techniques and professional training. China, and indeed most socialist countries, still carries such planning genes. At present, in implementing 'the Belt and Road' strategy, the planning and construction of China's overseas cooperation parks generally encounter problems such as a lack of strategic and systematic planning, a lack of top-level design, uncoordinated planning and layout within parks, and redundant construction in some areas. After sifting the planning genes of Soviet-style urban planning for the development of socialist countries, especially the industrial planning system, this paper puts forward the following views on how to export and develop China's planning model and technology overseas. First, the future development of overseas cooperation parks should proceed from a rational planning point of view. Second, the government should not only allocate park resources rigidly and equitably but also integrate them closely with national economic plans or economic development strategies. Last, management departments should set development thresholds rationally and give full play to a pragmatic planning style in accordance with the local land and planning systems. Objectively evaluating the impact of Soviet-style urban planning on China and absorbing its beneficial components has an important guiding and reference role for the development of China's overseas cooperation parks under the 'go global' strategy. However, we should also recognize that cooperation parks and the urban industrial systems behind them are only part of urban development; more attention should be paid to local design and to the general rules of urban development so that the leading effect of cooperation parks is appropriate.
Foundation item: Under the auspices of the Specific Plan for Strategic International Cooperation in Scientific and Technological Innovation, the National Key Research and Development Plan 'Research Cooperation and Exemplary Application in Planning of Development of Overseas Industrial Parks' (No. 2016YFE0201000).

Keywords: China cooperative parks, history of urban planning, output mode, The Soviet Union

Procedia PDF Downloads 247
91 Unique Interprofessional Mental Health Education Model: A Pre/Post Survey

Authors: Michele L. Tilstra, Tiffany J. Peets

Abstract:

Interprofessional collaboration in behavioral healthcare education is increasingly recognized for its value in training students to address diverse client needs. While interprofessional education (IPE) is well documented in occupational therapy education for addressing physical health, limited research exists on collaboration with counselors to address mental health concerns and the psychosocial needs of individuals receiving care. The counseling education literature primarily examines the collaboration of counseling students with psychiatrists, psychologists, social workers, and marriage and family therapists. This pretest/posttest survey research study explored changes in attitudes toward interprofessional teams among 56 Master of Occupational Therapy (MOT) (n = 42) and Counseling and Human Development (CHD) (n = 14) students participating in the Counselors and Occupational Therapists Professionally Engaged in the Community (COPE) program. The COPE program was designed to strengthen the behavioral health workforce in high-need and high-demand areas. Students accepted into the COPE program were divided into small MOT/CHD groups to complete multiple interprofessional multicultural learning modules using videos, case studies, and online discussion board posts. The online modules encouraged reflection on various behavioral healthcare roles, the benefits of team-based care, cultural humility, current mental health challenges, personal biases, power imbalances, and advocacy for underserved populations. Using the Student Perceptions of Interprofessional Clinical Education - Revision 2 (SPICE-R2) scale, students completed pretest and posttest surveys on a 5-point Likert scale (Strongly Agree = 5 to Strongly Disagree = 1) to evaluate their attitudes toward interprofessional teamwork and collaboration. The SPICE-R2 measures three factors: interprofessional teamwork and team-based practice (Team), roles/responsibilities for collaborative practice (Roles), and patient outcomes from collaborative practice (Outcomes). The mean total score for all students improved from 4.25 (pretest) to 4.43 (posttest); Team changed from 4.66 to 4.58, Roles from 3.88 to 4.30, and Outcomes from 4.08 to 4.36. A paired t-test on the total mean scores resulted in a t-statistic of 2.54, which exceeded both the one-tailed and two-tailed critical values, indicating statistical significance (p = .001). When the factors of the SPICE-R2 were analyzed separately, only Roles (t = 4.08, p = .0001) and Outcomes (t = 3.13, p = .002) were statistically significant. The item 'I understand the roles of other health professionals' showed the most improvement, from a mean score across all students of 3.76 (pretest) to 4.46 (posttest). The significant improvement in students' attitudes toward interprofessional teams suggests that the unique integration of OT and CHD students in the COPE program effectively develops a better understanding of the collaborative roles necessary for holistic client care. These results support the importance of IPE through structured, engaging interprofessional experiences, which are essential for enhancing students' readiness for collaborative practice and align with accreditation standards requiring interprofessional education in OT and CHD programs to prepare practitioners for team-based care. The findings contribute to the growing body of evidence supporting the integration of IPE in behavioral healthcare curricula to improve holistic client care and encourage students to engage in collaborative practice across healthcare settings.
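
For readers unfamiliar with the test reported above: a paired t-test compares each student's pretest and posttest scores rather than treating the two surveys as independent samples. A minimal sketch with hypothetical scores (the study's raw data are not published in the abstract):

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post mean scores for a handful of students
# (illustration only; not the study's raw data).
pre = np.array([4.1, 4.3, 3.9, 4.5, 4.2, 4.0])
post = np.array([4.4, 4.5, 4.2, 4.6, 4.5, 4.1])

# Paired t-test: tests whether the mean within-student change differs from 0.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```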

Keywords: behavioral healthcare, counseling education, interprofessional education, mental health education, occupational therapy education

Procedia PDF Downloads 38
90 Unveiling the Dynamics of Preservice Teachers’ Engagement with Mathematical Modeling through Model Eliciting Activities: A Comprehensive Exploration of Acceptance and Resistance Towards Modeling and Its Pedagogy

Authors: Ozgul Kartal, Wade Tillett, Lyn D. English

Abstract:

Despite its global significance in curricula, mathematical modeling encounters persistent disparities in recognition and emphasis within regular mathematics classrooms and teacher education across countries with diverse educational and cultural traditions, including variations in the perceived role of mathematical modeling. Over the past two decades, increased attention has been given to the integration of mathematical modeling into national curriculum standards in the U.S. and other countries. Therefore, the mathematics education research community has dedicated significant efforts to investigate various aspects associated with the teaching and learning of mathematical modeling, primarily focusing on exploring the applicability of modeling in schools and assessing students', teachers', and preservice teachers' (PTs) competencies and engagement in modeling cycles and processes. However, limited attention has been directed toward examining potential resistance hindering teachers and PTs from effectively implementing mathematical modeling. This study focuses on how PTs, without prior modeling experience, resist and/or embrace mathematical modeling and its pedagogy as they learn about models and modeling perspectives, navigate the modeling process, design and implement their modeling activities and lesson plans, and experience the pedagogy enabling modeling. Model eliciting activities (MEAs) were employed due to their high potential to support the development of mathematical modeling pedagogy. The mathematical modeling module was integrated into a mathematics methods course to explore how PTs embraced or resisted mathematical modeling and its pedagogy. The module design included reading, reflecting, engaging in modeling, assessing models, creating a modeling task (MEA), and designing a modeling lesson employing an MEA. Twelve senior undergraduate students participated, and data collection involved video recordings, written prompts, lesson plans, and reflections. An open coding analysis revealed acceptance and resistance toward teaching mathematical modeling. The study identified four overarching themes, including both acceptance and resistance: pedagogy, affordance of modeling (tasks), modeling actions, and adjusting modeling. In the category of pedagogy, PTs displayed acceptance based on potential pedagogical benefits and resistance due to various concerns. The affordance of modeling (tasks) category emerged from instances when PTs showed acceptance or resistance while discussing the nature and quality of modeling tasks, often debating whether modeling is considered mathematics. PTs demonstrated both acceptance and resistance in their modeling actions, engaging in modeling cycles as students and designing/implementing MEAs as teachers. The adjusting modeling category captured instances where PTs accepted or resisted maintaining the qualities and nature of the modeling experience or converted modeling into a typical structured mathematics experience for students. While PTs displayed a mix of acceptance and resistance in their modeling actions, limitations were observed in embracing complexity and adhering to model principles. The study provides valuable insights into the challenges and opportunities of integrating mathematical modeling into teacher education, emphasizing the importance of addressing pedagogical concerns and providing support for effective implementation. In conclusion, this research offers a comprehensive understanding of PTs' engagement with modeling, advocating for a more focused discussion on the distinct nature and significance of mathematical modeling in the broader curriculum to establish a foundation for effective teacher education programs.

Keywords: mathematical modeling, model eliciting activities, modeling pedagogy, secondary teacher education

Procedia PDF Downloads 65
89 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts

Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig

Abstract:

This study focuses on the evaluation of snow avalanche simulations, based on a survey carried out among avalanche experts. In recent decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps. This has changed rather dramatically with the application of numerical models. For safety regulations such as road safety, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. That places new demands on model accuracy and requires the simulation of physical processes that previously could be ignored. These simulation tools are based on a deterministic description of the avalanche movement, allowing the prediction of certain quantities of the avalanche flow (e.g., pressure, velocities, flow heights, runout lengths). Because of the highly variable regimes of flowing snow, no uniform rheological law describing the motion of an avalanche is known; therefore, analogies are drawn to the fluid dynamical laws of other materials. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. Beyond these limitations, there are high uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow model equations in an algorithm executable by a computer. This implementation is constrained by the choice of adequate numerical methods and their computational feasibility; hence, model development is compelled to introduce further simplifications and the related uncertainties. In light of these issues, many questions arise about avalanche simulations: their assets and drawbacks, potential improvements, and their application in practice. To address these questions, a survey was conducted among experts in the field of avalanche science (e.g., researchers, practitioners, engineers) from various countries. In the questionnaire, special attention is paid to the experts' opinions regarding the influence of certain variables on the simulation result, their uncertainty, and the reliability of the results. Furthermore, it was tested to which degree a simulation result influences decision making for a hazard assessment. A discrepancy was found between the large uncertainty of the simulation input parameters and the relatively high reliability attributed to the results. This contradiction can be explained by taking into account how the experts employ the simulations. The credibility of the simulations is the result of a rather thorough simulation study, in which different assumptions are tested and the results of different flow models are compared, along with the use of supplemental data such as chronicles, field observations, and silent witnesses, among others, which are regarded as essential for the hazard assessment and for sanctioning simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on the modeling fashion could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations.
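
To make the deterministic description concrete: many operational tools descend from Voellmy-type friction laws, in which flow resistance combines a dry-friction term (mu) and a velocity-squared "turbulent" term (xi). The following is a minimal one-dimensional point-mass sketch of that model class, with illustrative parameters, not a reproduction of any specific tool covered by the survey:

```python
import math

# Minimal 1-D Voellmy-type point-mass model: an illustration of the
# deterministic model class described above, not any surveyed tool.
g = 9.81                   # gravity (m/s^2)
theta = math.radians(30)   # uniform slope angle - illustrative
mu = 0.2                   # dry (Coulomb) friction coefficient - illustrative
xi = 1500.0                # "turbulent" friction coefficient (m/s^2)
h = 1.5                    # assumed constant flow height (m)

def accel(u: float) -> float:
    # du/dt = g*(sin(theta) - mu*cos(theta)) - g*u**2 / (xi*h)
    return g * (math.sin(theta) - mu * math.cos(theta)) - g * u * u / (xi * h)

u, x, dt = 0.0, 0.0, 0.01
for _ in range(6000):              # integrate 60 s with explicit Euler
    u = max(0.0, u + accel(u) * dt)
    x += u * dt

# On a uniform slope the speed tends toward the terminal velocity
# sqrt(xi * h * (sin(theta) - mu*cos(theta))).
print(f"speed after 60 s: {u:.1f} m/s over {x:.0f} m")
```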

Keywords: expert interview, hazard management, modeling, simulation, snow avalanche

Procedia PDF Downloads 326
88 A Conceptual Model of Sex Trafficking Dynamics in the Context of Pandemics and Provisioning Systems

Authors: Brian J. Biroscak

Abstract:

In the United States (US), "sex trafficking" is defined at the federal level in the Trafficking Victims Protection Act of 2000 as encompassing a number of processes, such as recruitment, transportation, and provision of a person for the purpose of a commercial sex act, in which force, fraud, or coercion is used, or in which the person induced to perform such an act has not attained 18 years of age. Accumulating evidence suggests that sex trafficking is exacerbated by social and environmental stressors (e.g., pandemics). Given that "provision" is a key part of the definition, "provisioning systems" may offer a useful lens through which to study sex trafficking dynamics. Provisioning systems are the social systems connecting individuals, small groups, entities, and embedded communities as they seek to satisfy their needs and wants for goods, services, experiences, and ideas through value-based exchange in communities. This project presents a conceptual framework for understanding sex trafficking dynamics in the context of the COVID pandemic. The framework is developed as a system dynamics simulation model based on published evidence, social and behavioral science theory, and key informant interviews with stakeholders from the Protection, Prevention, Prosecution, and Partnership sectors in one US state. This "4 P Paradigm" has been described as fundamental to the US government's anti-trafficking strategy. The present research question is: "How do sex trafficking systems (e.g., supply, demand, and price) interact with other provisioning systems (e.g., networks of organizations that help sexually exploited persons) to influence trafficking over time vis-à-vis the COVID pandemic?" Semi-structured interviews with stakeholders (n = 19) were analyzed based on grounded theory and combined for computer simulation. The first step (Problem Definition) was completed by open coding of video-recorded interviews, supplemented by a literature review. The model depicts the provision of services for sex trafficking victims and survivors as declining in March 2020, coincident with COVID, but eventually rebounding. The second modeling step (Dynamic Hypothesis Formulation) was completed by open and axial coding of interview segments, as well as consulting the peer-reviewed literature. Part of the hypothesized explanation for changes over time is that the sex trafficking system behaves somewhat like a commodities market, with each of the other subsystems exhibiting delayed responses but collectively keeping trafficking levels below what they would be otherwise. The next steps (Model Building & Testing) led to a "proof of concept" model that can be used to conduct simulation experiments and test various action ideas, by taking model users outside the entire system and letting them see it whole. If sex trafficking dynamics unfold as hypothesized, e.g., oscillating post-COVID, then one potential leverage point is to address the lack of information feedback loops between the actual occurrence and consequences of sex trafficking and those who seek to prevent its occurrence, prosecute the traffickers, protect the victims and survivors, and partner with other anti-trafficking advocates. Implications for researchers, administrators, and other stakeholders are discussed.
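
System dynamics models of this kind are built from stocks, flows, and delayed feedback loops; a balancing loop with a shock is one standard way such a model produces the decline-and-rebound pattern described above. A minimal generic sketch, assuming a simple shock-and-recovery structure rather than the authors' actual model:

```python
# Generic stock-and-flow sketch with a balancing feedback loop. Illustrates
# the mechanism (shock -> dip -> rebound), not the authors' actual model
# structure or parameter values.
capacity = 100.0      # stock: provision of services for victims/survivors
target = 100.0        # demand-driven target level of provision
adjust_time = 6.0     # months for the balancing loop to close the gap
dt = 0.25             # time step (months)

history = []
for step in range(int(48 / dt)):          # simulate 4 years
    t = step * dt
    if abs(t - 12.0) < dt / 2:            # pandemic shock: services disrupted
        capacity *= 0.6
    capacity += (target - capacity) / adjust_time * dt   # balancing loop
    history.append((t, capacity))

low = min(c for _, c in history)
print(f"minimum provision: {low:.1f}, provision at month 48: {history[-1][1]:.1f}")
```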

Keywords: pandemics, provisioning systems, sex trafficking, system dynamics modeling

Procedia PDF Downloads 79
87 Pushover Analysis of a Typical Bridge Built in Central Zone of Mexico

Authors: Arturo Galvan, Jatziri Y. Moreno-Martinez, Daniel Arroyo-Montoya, Jose M. Gutierrez-Villalobos

Abstract:

Bridges are among the most seismically vulnerable structures in highway transportation systems. The general process for assessing the seismic vulnerability of a bridge involves the evaluation of its overall capacity and demand. One of the most common procedures for obtaining this capacity is pushover analysis of the structure. Typically, bridge capacity is assessed using nonlinear static methods or nonlinear dynamic analyses; the nonlinear dynamic approaches use step-by-step numerical solutions, with the inconvenience of long computing times. In this study, a nonlinear static analysis ('pushover analysis') was performed to predict the collapse mechanism of a typical bridge built in the central zone of Mexico (Celaya, Guanajuato). The bridge superstructure consists of three simply supported spans with a total length of 76 m: two end spans of 22 m and a central span of 32 m. The deck is 14 m wide, and the concrete slab is 18 cm deep. The bridge is supported on frames of five piers with hollow box-shaped sections; each pier is 7.05 m high and 1.20 m in diameter. The numerical model was created using commercial software, considering linear and nonlinear elements. In all cases, the piers were represented by frame-type elements with geometrical properties obtained from the structural project and construction drawings of the bridge. The deck was modeled with a mesh of rectangular thin shell (plate bending and stretching) finite elements. Moment-curvature analysis was performed for the pier sections, considering in each pier the effect of confined concrete and its reinforcing steel. In this way, plastic hinges were defined at the base of the piers to carry out the pushover analysis. In addition, time history analyses were performed using 19 accelerograms of real earthquakes registered in Guanajuato, and the displacements produced in the bridge were determined. Finally, pushover analysis was applied through displacement control of the piers to obtain the overall capacity of the bridge before failure occurs. It was concluded that the lateral deformation of the piers under a critical earthquake for this zone is almost imperceptible, owing to the geometry and reinforcement demanded by the current design standards; compared with the displacement capacity of the piers, these demands were excessive margins. According to the analysis, the frames built with five piers increase the rigidity in the transverse direction of the bridge. Hence, it is proposed to reduce these frames from five piers to three, maintaining the same geometrical characteristics and the same reinforcement in each pier, as well as the mechanical properties of the materials (concrete and reinforcing steel). A pushover analysis performed for this configuration indicated that the bridge would continue to show 'correct' seismic behavior, at least for the 19 accelerograms considered in this study. In this way, material, construction, time, and labor costs would be reduced for this case study.
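
For context on the displacement-capacity side of such an assessment: a common idealization integrates curvature over an equivalent plastic hinge at the pier base. A minimal sketch with hypothetical section values (the paper's actual moment-curvature results are not reproduced in the abstract):

```python
# Displacement capacity of a cantilever pier from an idealized
# moment-curvature result, using the standard equivalent plastic hinge model:
#   d_y = phi_y * L^2 / 3
#   d_u = d_y + (phi_u - phi_y) * Lp * (L - 0.5 * Lp)
# Section values below are hypothetical, for illustration only.
L = 7.05          # pier height (m), from the bridge described above
phi_y = 0.0025    # yield curvature (1/m) - hypothetical
phi_u = 0.020     # ultimate curvature (1/m) - hypothetical
Lp = 0.6          # equivalent plastic hinge length (m) - hypothetical

d_y = phi_y * L**2 / 3
d_u = d_y + (phi_u - phi_y) * Lp * (L - 0.5 * Lp)
print(f"yield displacement: {d_y*1000:.0f} mm, ultimate: {d_u*1000:.0f} mm, "
      f"ductility: {d_u/d_y:.1f}")
```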

Keywords: collapse mechanism, moment-curvature analysis, overall capacity, push-over analysis

Procedia PDF Downloads 151
86 Developing Primal Teachers beyond the Classroom: The Quadrant Intelligence (Q-I) Model

Authors: Alexander K. Edwards

Abstract:

Introduction: The moral dimension of teacher education globally has assumed a new paradigm of thinking based on public gain (return on investment), value creation (quality), professionalism (practice), and business strategies (innovation). Abundant literature reveals an interesting revolutionary trend in complementing the raising of teachers and academic performance. Because of global competition in knowledge creation and services, the C21st teacher at all levels is expected to be resourceful, a strategic thinker, socially intelligent, skilled in relationships, and entrepreneurially astute. This study is a significant contribution to practice and innovation in raising exemplary, or primal, teachers. The qualities needed were considered as a 'Quadrant Intelligence (Q-i)' model for primal teacher leadership beyond the classroom. The researcher started by examining the issue that the majority of teachers in the Ghana Education Service (GES) need this Q-i to be effective and efficient; the conceptual framing became the determinants of such Q-i. This is significant for global employability and versatility in teacher education, to create premium and primal teacher leadership, which is again gaining high attention in scholarship due to failing schools. The moral aspect of teachers failing learners is a highly important discussion: in GES, some schools score zero percent on the basic education certificate examination (BECE). The question is what will make a professional teacher highly productive, marketable, and entrepreneurial, and what will give teachers the moral consciousness to do their best to succeed. Method: This study set out to develop a model for primal teachers in GES as an innovative way to highlight premium development for C21st business-education acumen through desk reviews. The study is conceptually framed by examining skill sets such as strategic thinking, social intelligence, relational and emotional intelligence, and entrepreneurship to answer three main burning questions and other hypotheses. The study then applied a causal-comparative methodology with a purposive sampling technique (N=500) drawn from CoE, GES, NTVI, and other teacher associations. Participants responded to a 30-item, researcher-developed questionnaire. Data were analyzed on the quadrant constructs and reported as ex post facto analyses of variance and regressions. Multiple associations were established for statistical significance (p=0.05). Causes and effects are postulated for scientific discussion. Findings: These quadrants were found to be very significant in teacher development, with significant variations among demographic groups. However, most teachers lack considerable skills in entrepreneurship, leadership in teaching and learning, and business thinking strategies, which has a significant effect on practices and outcomes. Conclusion and Recommendations: It is quite conclusive, therefore, that GES teachers may need further instruction in innovation and creativity to transform knowledge creation into business ventures. In-service training (INSET) has to be comprehensive, and teacher education curricula at colleges may have to be revisited. Teachers have the potential to raise their social capital, to be entrepreneurs, and to exhibit professionalism beyond their community services. Their primal leadership focus will benefit many clienteles, including students and social circles. Recommendations examine the policy implications for curriculum design, practice, innovation, and educational leadership.

Keywords: emotional intelligence, entrepreneurship, leadership, quadrant intelligence (q-i), primal teacher leadership, strategic thinking, social intelligence

Procedia PDF Downloads 311
85 Developing a Deep Understanding of the Immune Response in Hepatitis B Virus Infected Patients Using a Knowledge Driven Approach

Authors: Hanan Begali, Shahi Dost, Annett Ziegler, Markus Cornberg, Maria-Esther Vidal, Anke R. M. Kraft

Abstract:

Chronic hepatitis B virus (HBV) infection can be treated with nucleot(s)ide analogs (NA), for example, which inhibit HBV replication. However, they have hardly any influence on the functional cure of HBV, which is defined by loss of the hepatitis B surface antigen (HBsAg). NA must be taken life-long, and it is not available to all patients worldwide. Additionally, NA-treated patients are still at risk of developing cirrhosis, liver failure, or hepatocellular carcinoma (HCC). Although each patient has the same components of the immune system, immune responses vary between patients. Therefore, a deeper understanding of the immune response against HBV in different patients is necessary to understand the parameters leading to HBV cure and to use this knowledge to optimize HBV therapies. This requires seamless integration of an enormous amount of diverse and fine-grained data on viral markers, e.g., hepatitis B core-related antigen (HBcrAg) and hepatitis B surface antigen (HBsAg). The data integration system relies on the assumption that profiling human immune systems requires the analysis of various variables (e.g., demographic data, treatments, pre-existing conditions, immune cell response, or HLA typing) rather than only one. However, the values of these variables are collected independently and are presented in a myriad of formats, e.g., Excel files, textual descriptions, lab book notes, and images of flow cytometry dot plots. Additionally, patients can be identified differently across these analyses. This heterogeneity complicates the integration of variables, as data management techniques are needed to create a unified view in which individual formats and identifiers are transparent when profiling human immune systems. The proposed study (HBsRE) aims at integrating heterogeneous data sets of 87 chronically HBV-infected patients, e.g., clinical data, immune cell response, and HLA typing, with knowledge encoded in biomedical ontologies and open-source databases into a knowledge-driven framework. This technique enables us to harmonize and standardize heterogeneous datasets within the defined modeling of the data integration system, which will be evaluated in a knowledge graph (KG). KGs are data structures that represent knowledge and data as factual statements using a graph data model. Finally, an analytic data model will be applied on top of the KG in order to develop a deeper understanding of the immune profiles of various patients and to evaluate factors playing a role in a holistic profile of patients with loss of HBsAg. Additionally, our objective is to utilize this unified approach to stratify patients for new effective treatments. This study is developed in the context of the project "Transforming big data into knowledge: for deep immune profiling in vaccination, infectious diseases, and transplantation (ImProVIT)", a multidisciplinary team composed of computer scientists, infection biologists, and immunologists.
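
To illustrate the kind of factual statements a KG holds: the sketch below builds a toy graph with rdflib and asks one SPARQL question. The namespace, property names, and values are invented for illustration and are not the project's actual schema.

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical namespace and schema, for illustration only.
EX = Namespace("http://example.org/hbv/")
g = Graph()

# Facts about one (fictional) patient as subject-predicate-object triples.
g.add((EX.patient42, RDF.type, EX.Patient))
g.add((EX.patient42, EX.treatment, EX.NucleosideAnalog))
g.add((EX.patient42, EX.hbsagLevel, Literal(120.5)))   # IU/mL
g.add((EX.patient42, EX.hlaType, Literal("HLA-A*02:01")))

# SPARQL query: which treated patients have a low HBsAg level?
q = """
PREFIX ex: <http://example.org/hbv/>
SELECT ?p ?level WHERE {
    ?p a ex:Patient ;
       ex:treatment ex:NucleosideAnalog ;
       ex:hbsagLevel ?level .
    FILTER (?level < 1000)
}
"""
for row in g.query(q):
    print(row.p, row.level)
```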

Keywords: chronic hepatitis B infection, immune response, knowledge graphs, ontology

Procedia PDF Downloads 108
84 A Mathematical Model for Studying Landing Dynamics of a Typical Lunar Soft Lander

Authors: Johns Paul, Santhosh J. Nalluveettil, P. Purushothaman, M. Premdas

Abstract:

Lunar landing is one of the most critical phases of a lunar mission. The lander is provided with a soft-landing system to prevent structural damage of the lunar module by absorbing the landing shock and to assure stability during landing. Presently available software is not capable of simulating rigid body dynamics coupled with contact simulation and elastic/plastic deformation analysis; hence, a separate mathematical model has been generated for studying the dynamics of a typical lunar soft lander. Parameters used in the analysis include lunar surface slope, coefficient of friction, initial touchdown velocity (vertical and horizontal), mass and moment of inertia of the lander, crushing force due to the energy-absorbing material in the legs, number of legs, and geometry of the lander. The mathematical model is capable of simulating plastic and elastic deformation of the honeycomb, frictional force between landing leg and lunar soil, surface contact, lunar gravitational force, rigid body dynamics, and linkage dynamics of the inverted tripod landing gear. The nonlinear differential equations generated for studying the dynamics of the lunar lander are solved by numerical methods, using a MATLAB program as the computational tool. The position of each kinematic joint is defined by mathematical equations for the generation of the equations of motion. All hinged locations are defined by position vectors with respect to body-fixed coordinates. The vehicle's rigid body rotations and motions about the body coordinates are due only to the external forces and moments arising from the footpad reaction force due to impact, footpad frictional force, and the weight of the vehicle. All these forces are mathematically simulated for the generation of the equations of motion. The validation of the mathematical model is done in two phases. The first phase is the validation of the plastic deformation of the crushable elements by employing the conservation of energy principle. The second phase is the validation of the rigid body dynamics of the model by simulating a lander model in ADAMS software after replacing the crushable elements with elastic spring elements. Simulation of plastic deformation along with rigid body dynamics and contact force cannot be modeled in ADAMS; hence, the plastic element of the primary strut is replaced with a spring element and the analysis is carried out in ADAMS. The same analysis is also carried out using the mathematical model, where the simulation of honeycomb crushing is replaced by elastic spring deformation, and the results are compared with the ADAMS analysis. In this way, the rotational motion of the linkages and the 6-degree-of-freedom motion of the lunar lander about its CG can be validated in ADAMS. The model is also validated against the drop test results of a 4-legged lunar lander. This paper presents the details of the mathematical model generated and its validation.
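
To illustrate the energy-absorption component of such a model: honeycomb crushing is commonly idealized as a constant crush force, so the vertical touchdown can be sketched as a one-dimensional integration until the shock is absorbed. A minimal sketch with hypothetical lander values (not the paper's parameters):

```python
# 1-D vertical touchdown with honeycomb idealized as a constant crush force.
# Values are hypothetical, for illustration; the paper's full model also
# covers slope, friction, linkage dynamics, and 6-DOF rigid body motion.
g_moon = 1.62      # lunar gravity (m/s^2)
m = 700.0          # lander mass (kg) - hypothetical
v = 2.0            # vertical touchdown velocity (m/s) - hypothetical
F_crush = 9000.0   # constant crushing force of the honeycomb (N) - hypothetical

dt, stroke = 1e-4, 0.0
while v > 0.0:
    # Deceleration = (crush force - lunar weight) / mass while crushing.
    a = (F_crush - m * g_moon) / m
    v -= a * dt
    stroke += max(v, 0.0) * dt

# Energy check (the validation idea of the paper's first phase): initial
# kinetic energy plus potential energy over the stroke ~ F_crush * stroke.
absorbed = F_crush * stroke
print(f"crush stroke: {stroke*100:.1f} cm, energy absorbed: {absorbed:.0f} J")
```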

Keywords: honeycomb, landing leg tripod, lunar lander, primary link, secondary link

Procedia PDF Downloads 351
83 Behavioral Patterns of Adopting Digitalized Services (E-Sport versus Sports Spectating) Using Agent-Based Modeling

Authors: Justyna P. Majewska, Szymon M. Truskolaski

Abstract:

The growing importance of digitalized services in the so-called new economy, including the e-sports industry, has been observed recently. Various demographic and technological changes lead consumers to modify their needs, not regarding the services themselves but the method of their application (attracting customers, forms of payment, new content, etc.). In the case of leisure related to competitive spectating activities, there is a growing need to participate in events whose content is not sports competition but computer game challenges: e-sport. The literature in this area has so far focused on determining the number of e-sport fans with elements of simple statistical description (mainly demographic characteristics such as age, gender, and place of residence). Meanwhile, the development of the industry is influenced by a combination of many different, intertwined demographic, personality, and psychosocial characteristics of customers, as well as the characteristics of their environment. Therefore, there is a need for deeper recognition of the determinants of the behavioral patterns behind customers' selection of digitalized services, which, in the absence of available large data sets, can be achieved using econometric simulations: multi-agent modeling. The cognitive aim of the study is to reveal internal and external determinants of behavioral patterns of customers, taking into account various variants of economic development (pace of digitization and technological development, socio-demographic changes, etc.). In the paper, an agent-based model with heterogeneous agents (characteristics of the customers themselves and of their environment) was developed, which allowed the identification of a three-stage development scenario: i) initial interest, ii) standardization, and iii) full professionalization. The probabilities governing the transition process were estimated using the Method of Simulated Moments. The estimation of the agent-based model parameters and a sensitivity analysis reveal crucial factors that have driven the rising trend in e-sport spectating and, in a wider perspective, the development of digitalized services. Among the psychosocial characteristics of customers are the level of familiarity with the rules of games and sports disciplines, active and passive participation history, and individual perception of challenging activities. Environmental factors include the general reception of games, the number and recognition of community builders, and the level of technological development of streaming and community-building platforms. However, the crucial factor underlying the good predictive power of the model is the level of professionalization. In the initial interest phase, entry barriers for new customers are high; they decrease during the standardization phase and increase again in the full professionalization phase, when new customers perceive the participation history as inaccessible. In this case, customers are prone to switch to new methods of service application, in the case of e-sport vs. sports, to new content and more modern methods of its delivery. In a wider context, the findings of the paper support the idea of a life cycle of services regarding the methods of their application, from "traditional" to digitalized.
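
As an illustration of the approach: an agent-based model assigns each simulated customer heterogeneous traits and lets adoption emerge from repeated individual decisions under phase-dependent entry barriers. The sketch below invents its functional form and parameters for illustration; the estimated model described above is far richer.

```python
import random

random.seed(1)

# Phase-dependent entry barriers, mirroring the three-stage scenario
# (values are illustrative, not the estimated ones).
BARRIER = {"initial_interest": 0.8, "standardization": 0.4,
           "full_professionalization": 0.7}

class Agent:
    def __init__(self):
        self.familiarity = random.random()   # knowledge of game rules
        self.history = random.random()       # active/passive participation
        self.adopted = False

    def step(self, phase: str):
        # An agent becomes a spectator once its traits (plus noise)
        # beat the phase's entry barrier.
        propensity = 0.6 * self.familiarity + 0.4 * self.history
        if not self.adopted and propensity + random.gauss(0, 0.05) > BARRIER[phase]:
            self.adopted = True

agents = [Agent() for _ in range(10000)]
for phase in ["initial_interest", "standardization", "full_professionalization"]:
    for _ in range(12):                      # 12 ticks per phase
        for a in agents:
            a.step(phase)
    share = sum(a.adopted for a in agents) / len(agents)
    print(f"{phase}: {share:.1%} spectators")
```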

Keywords: agent-based modeling, digitalized services, e-sport, spectators' motives

Procedia PDF Downloads 172
82 Prevalence and Factors Associated with Concurrent Use of Herbal Medicine and Anti-retroviral Therapy among HIV/AIDS Patients Attending Selected HIV Clinics in Wakiso District

Authors: Nanteza Rachel

Abstract:

Background: Worldwide, there were 36.7 million people living with the Human Immunodeficiency Virus (HIV) in 2015, up from 35 million at the end of 2013. Wakiso district is one of the hotspots of the HIV/Acquired Immune Deficiency Syndrome (AIDS) infection in Uganda, with a prevalence of 8.1%. Herbal medicine has gained popularity among HIV/AIDS patients as an adjuvant therapy to reduce the adverse effects of anti-retroviral therapy (ART). Regardless of the subsidized cost and physical availability of ART, the majority of Africans living with HIV/AIDS resort to adding traditional medicine to their ART. A pilot observation made by the PI indicated that 13 out of 30 people living with AIDS (PLWA) attending HIV clinics in Wakiso district reported using herbal preparations despite taking anti-retrovirals (ARVs), which prompted this study. Purpose of the study: To determine the prevalence and factors associated with concurrent use of herbal medicine and anti-retroviral therapy among HIV/AIDS patients attending selected HIV clinics in Wakiso district. Methodology: This was a cross-sectional study with both quantitative data collection (use of a questionnaire) and qualitative data collection (key informant interviews). A mixed method of sampling was used: purposive and random sampling. Purposive sampling was based on location in the district and was used to select 7 health facilities corresponding to the 7 health sub-districts of Wakiso; simple random sampling was then used to select one HIV clinic from each of the 7 health sub-districts. Study units were enrolled as they entered the HIV clinics, and 105 respondents were interviewed. Both manual methods and a computer package (SPSS) were used to analyze the data. Results: The prevalence of concurrent use of herbal medicine and ART was 38/105 (36.2%). The HIV symptoms most commonly treated with herbs were fever, 27 (71.1%), diarrhea, 3 (7.9%), and cough, 2 (5.3%). Commonly used herbs for fever were Omululuza (Vernonia amygdalina), Ekigagi (Aloe sp.), and Nalongo (Justicia betonica Linn), while for diarrhea it was Ntwatwa. Reported side effects included severe pain, itching, anaemia, feeling sick, loss or gain of appetite, joint pain, and bad dreams. Herbs were also used to soothe side effects; for anaemia, avocado leaves (Persea americana Mill.) were used. The significant factors associated with concurrent use of herbal medicine were familiarity with herbs and the expense of conventional medicine for managing HIV symptoms. Another significant factor was hostility toward patients exhibited by health personnel providing HIV care. Conclusion: Herbal medicine is widely used by clients in HIV/AIDS care. Patients' familiarity with herbs and the expense of conventional medicine were associated with concurrent use of herbal medicine and ART. Hostility toward HIV/AIDS patients by health care providers was also associated with concurrent use of herbal medicine and ART among HIV/AIDS patients.
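
For the headline figure: prevalence is the proportion of respondents reporting concurrent use, usually accompanied by a 95% confidence interval. A quick check of the abstract's numbers, using the Wald interval as one common choice (the study's own SPSS analysis may have used another method):

```python
import math

# Prevalence of concurrent herbal medicine and ART use, from the abstract.
users, n = 38, 105
p = users / n

# 95% Wald confidence interval for a proportion (one common choice;
# not necessarily the method used in the study's analysis).
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"prevalence = {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```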

Keywords: HIV patients, herbal medicine, antiretroviral therapy, factors associated

Procedia PDF Downloads 97
81 A Computational Investigation of Potential Drugs for Cholesterol Regulation to Treat Alzheimer’s Disease

Authors: Marina Passero, Tianhua Zhai, Zuyi (Jacky) Huang

Abstract:

Alzheimer’s disease has become a major public health issue, as indicated by the growing number of Americans living with the disease. After decades of extensive research, only seven drugs have been approved by the Food and Drug Administration (FDA) to treat Alzheimer’s disease. Five of these drugs were designed to treat the dementia symptoms, and only two (i.e., Aducanumab and Lecanemab) target the progression of Alzheimer’s disease, especially the accumulation of amyloid-β plaques. However, the accelerated approvals of both Aducanumab and Lecanemab drew controversy, especially over the safety and side effects of these two drugs. There is still an urgent need for further drug discovery targeting the biological processes involved in the progression of Alzheimer’s disease. Excessive cholesterol has been found to accumulate in the brains of those with Alzheimer’s disease. Cholesterol can be synthesized in both the blood and the brain, but the majority of biosynthesis in the adult brain takes place in astrocytes; the cholesterol is then transported to the neurons via ApoE. The blood-brain barrier separates cholesterol metabolism in the brain from that in the rest of the body. Various proteins contribute to the metabolism of cholesterol in the brain and offer potential targets for Alzheimer’s treatment. In the astrocytes, SREBP cleavage-activating protein (SCAP) binds to Sterol Regulatory Element-binding Protein 2 (SREBP2) in order to transport the complex from the endoplasmic reticulum to the Golgi apparatus. Cholesterol is secreted out of the astrocytes by the ATP-Binding Cassette A1 (ABCA1) transporter. Lipoprotein receptors such as triggering receptor expressed on myeloid cells 2 (TREM2) internalize cholesterol into the microglia, while lipoprotein receptors such as low-density lipoprotein receptor-related protein 1 (LRP1) internalize cholesterol into the neuron. Cytochrome P450 Family 46 Subfamily A Member 1 (CYP46A1) converts excess cholesterol to 24S-hydroxycholesterol (24S-OHC). Cholesterol has been shown to have a direct effect on the production of amyloid-beta and tau proteins. The addition of cholesterol to the brain promotes the activity of beta-site amyloid precursor protein cleaving enzyme 1 (BACE1), secretase, and amyloid precursor protein (APP), which all aid in amyloid-beta production. The reduction of cholesterol esters in the brain has been found to reduce phosphorylated tau levels in mice. In this work, a computational pipeline was developed to identify the protein targets involved in cholesterol regulation in the brain and then to identify chemical compounds as inhibitors of a selected protein target. Since extensive evidence shows a strong correlation between brain cholesterol regulation and Alzheimer’s disease, a detailed literature review on genes and pathways related to brain cholesterol synthesis and regulation was first conducted. An interaction network was then built for those genes so that the top gene targets could be identified. The involvement of these genes in Alzheimer’s disease progression was discussed, followed by an investigation of existing clinical trials for those targets. A ligand-protein docking program was then developed to screen 1.5 million chemical compounds against the selected protein target. Finally, a machine learning program was developed to evaluate and predict the binding interaction between chemical compounds and the protein target.
The results from this work pave the way for further drug discovery to regulate brain cholesterol to combat Alzheimer’s disease.
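The abstract does not name the docking engine, molecular descriptors, or learning model used, so the sketch below is only an assumption-laden illustration of the general screen-and-rescore pattern: a regressor trained on a labelled subset (synthetic stand-in data here) is used to rank a large screening library by predicted affinity.

```python
# Hypothetical screen-then-rescore pipeline; the actual docking program,
# descriptors, and model used in the study are not specified in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in per-compound descriptors (e.g., fingerprint bits) and pretend
# binding-affinity labels for a small labelled subset -- synthetic data only.
X = rng.integers(0, 2, size=(500, 128)).astype(float)
y = rng.normal(loc=-7.0, scale=1.5, size=500)     # pKd-like labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Rank a (synthetic) screening library by predicted affinity and keep top hits.
library = rng.integers(0, 2, size=(10_000, 128)).astype(float)
pred = model.predict(library)
top_hits = np.argsort(pred)[:50]   # most negative = strongest predicted binding
print("top candidate indices:", top_hits[:10])
```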

Keywords: Alzheimer’s disease, drug discovery, ligand-protein docking, gene-network analysis, cholesterol regulation

Procedia PDF Downloads 74
80 Assessing Sustainability of Bike Sharing Projects Using Envision™ Rating System

Authors: Tamar Trop

Abstract:

Bike-sharing systems can be important elements of smart cities as they have the potential for impact on multiple levels. These systems can add a significant alternative to other modes of mass transit in cities that are continuously looking for measures to become more livable and maintain their attractiveness for citizens, businesses, and tourism. Bike-sharing began in Europe in 1965, and a viable format emerged in the mid-2000s thanks to the introduction of information technology. The rate of growth in bike-sharing schemes and fleets has been very rapid since 2008 and has probably outstripped growth in every other form of urban transport. Today, public bike-sharing systems are available on five continents, in over 700 cities, operating more than 800,000 bicycles at approximately 40,000 docking stations. Since modern bike-sharing systems have become prevalent only in the last decade, the existing literature analyzing these systems and their sustainability is relatively new. The purpose of the presented study is to assess the sustainability of these newly emerging transportation systems, using the Envision™ rating system as a methodological framework and the Israeli 'Tel-O-Fun' bike-sharing project as a case study. The assessment was conducted by project team members. Envision™ is a new guidance and rating system used to assess and improve the sustainability of all types and sizes of infrastructure projects. This tool provides a holistic framework for evaluating and rating the community, environmental, and economic benefits of infrastructure projects over the course of their life cycle. This evaluation method has 60 sustainability criteria divided into five categories: quality of life, leadership, resource allocation, natural world, and climate and risk. The 'Tel-O-Fun' project was launched in Tel Aviv-Yafo in 2011 and today provides about 1,800 bikes for rent at 180 rental stations across the city. The system is based on a computer terminal located at each docking station. The highest-rated sustainable features of the project include: (a) improving quality of life by offering a low-cost and efficient form of public transit, improving community mobility and access, enabling flexible travel within a multimodal transportation system, saving commuters time and money, enhancing public health, and reducing air and noise pollution; (b) improving resource allocation by offering inexpensive and flexible last-mile connectivity, reducing space, materials, and energy consumption, reducing wear and tear on public roads, and maximizing the utility of existing infrastructure; and (c) reducing greenhouse gas emissions from transportation. Overall, the 'Tel-O-Fun' project scored highly as an environmentally sustainable and socially equitable infrastructure. The use of this practical framework for evaluation also yielded various interesting insights into the shortcomings of the system and the characteristics of good solutions. This can contribute to the improvement of the project and may assist planners and operators of bike-sharing systems in developing a sustainable, efficient, and reliable transportation infrastructure within smart cities.
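Envision™'s official criteria weights and levels of achievement are not reproduced in the abstract; purely to illustrate the shape of a criteria-to-category roll-up, the sketch below totals invented criterion scores by category:

```python
# Generic roll-up of criterion scores into category totals, with invented
# criteria and point values -- NOT the official Envision(TM) scoring tables.
scores = {
    "Quality of Life":     {"mobility_access": 14, "public_health": 12},
    "Resource Allocation": {"materials": 9, "energy": 11},
    "Climate and Risk":    {"emissions_reduction": 16},
}

def category_totals(scores: dict) -> dict:
    """Sum each category's criterion scores into one total."""
    return {cat: sum(crit.values()) for cat, crit in scores.items()}

totals = category_totals(scores)
overall = sum(totals.values())
for cat, pts in totals.items():
    print(f"{cat}: {pts} pts ({pts / overall:.0%} of earned total)")
```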

Keywords: bike sharing, Envision™, sustainability rating system, sustainable infrastructure

Procedia PDF Downloads 340
79 Rotational and Linear Accelerations of an Anthropometric Test Dummy Head from Taekwondo Kicks among Amateur Practitioners

Authors: Gabriel P. Fife, Saeyong Lee, David M. O'Sullivan

Abstract:

Introduction: Although investigations into injury characteristics are well represented in the literature, few have investigated the biomechanical characteristics associated with head impacts in Taekwondo. Therefore, the purpose of this study was to identify the kinematic characteristics of head impacts due to taekwondo kicks among non-elite practitioners. Participants: Male participants (n = 11; 175 ± 5.3 cm; 71 ± 8.3 kg) with 7.5 ± 3.6 years of taekwondo training volunteered for this study. Methods: Participants were asked to perform five repetitions of each technique (i.e., turning kick, spinning hook kick, spinning back kick, front axe kick, and clench axe kick) aimed at the Hybrid III head with their dominant kicking leg. All participants wore a protective foot pad (thickness = 12 mm) of the kind commonly used in competition and training. To simulate head impact in taekwondo, the target consisted of a Hybrid III 50th Percentile Crash Test Dummy (Hybrid III) head (mass = 5.1 kg) and neck (fitted with taekwondo headgear) secured to an aluminum support frame and positioned at each athlete’s standing height. The Hybrid III head form was instrumented with a 500 g tri-axial accelerometer (PCB Piezotronics) mounted at the head center of gravity to obtain resultant linear accelerations (RLA). Rotational accelerations (ROTA) were collected using three angular rate sensors mounted orthogonally to each other (Diversified Technical Systems ARS-12K Angular Rate Sensor). The accelerometers were interfaced via a 3-channel, battery-powered integrated circuit piezoelectric sensor signal conditioner (PCB Piezotronics) and connected to a desktop computer for analysis. Acceleration data were captured using LabVIEW Signal Express and processed in accordance with SAE J211-1 channel frequency class 1000. Head injury criterion (HIC) values were calculated using the VSR software. A one-way analysis of variance was used to determine differences between kicks, while the Tukey HSD test was employed for pairwise comparisons. The threshold for a meaningful difference was set at an effect size of 0.20. All statistical analyses were done using R 3.1.0. Results: A statistically significant difference was observed in RLA (p = 0.00075); however, these differences were not clinically meaningful (η² = 0.04, 95% CI: -0.94 to 1.03). No differences were identified in ROTA (p = 0.734, η² = 0.0004, 95% CI: -0.98 to 0.98). A statistically significant difference (p < 0.001) between kicks in HIC was observed, with a medium effect (η² = 0.08, 95% CI: -0.98 to 1.07). However, the confidence interval of this difference indicates uncertainty. The Tukey HSD test identified differences (p < 0.001) between kicking techniques in RLA and HIC. Conclusion: This study observed head impact levels comparable to previous studies of similar objectives and methodology. These data are important as the impact measures from this study may be more representative of impact levels experienced by non-elite competitors. Although the clench axe kick elicited a lower RLA, the ROTA of this technique was higher than the levels from other techniques (although the differences were not large in terms of effect sizes). As the axe kick has been reported to cause severe head injury, future studies may consider further study of this kick important.
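For readers unfamiliar with the head injury criterion, the sketch below computes a HIC value from a resultant linear acceleration trace using the standard definition (maximize (t₂ − t₁)·(mean acceleration)^2.5 over windows of up to 15 ms); the pulse is synthetic and the implementation is illustrative, not the VSR software used in the study.

```python
# Minimal HIC15 computation from a resultant linear acceleration trace,
# assuming acceleration in g and time in seconds. Follows the standard HIC
# definition, not the specific software implementation used in the study.
import numpy as np

def hic(t: np.ndarray, a: np.ndarray, max_window: float = 0.015) -> float:
    """HIC = max over [t1, t2] of (t2 - t1) * (mean acceleration)**2.5."""
    # Cumulative trapezoidal integral of a(t), so any window mean is cheap.
    ca = np.concatenate([[0.0], np.cumsum(0.5 * (a[1:] + a[:-1]) * np.diff(t))])
    best = 0.0
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            dt = t[j] - t[i]
            if dt > max_window:
                break
            avg = (ca[j] - ca[i]) / dt          # mean acceleration over window
            best = max(best, dt * avg ** 2.5)
    return best

# Synthetic half-sine impact pulse: 60 g peak over 10 ms, sampled at 10 kHz.
t = np.arange(0.0, 0.02, 1e-4)
a = 60.0 * np.sin(np.pi * t / 0.01) * (t < 0.01)
print(f"HIC15 ~ {hic(t, a):.0f}")
```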

Keywords: Taekwondo, head injury, biomechanics, kicking

Procedia PDF Downloads 26
78 Fort Conger: A Virtual Museum and Virtual Interactive World for Exploring Science in the 19th Century

Authors: Richard Levy, Peter Dawson

Abstract:

Ft. Conger, located in the Canadian Arctic, was one of the most remote 19th-century scientific stations. Established in 1881 on Ellesmere Island, the wood-framed structure provided a permanent base from which to conduct scientific research. Under the charge of Lt. Greely, Ft. Conger hosted one of 14 expeditions conducted during the First International Polar Year (FIPY). Our research project, “From Science to Survival: Using Virtual Exhibits to Communicate the Significance of Polar Heritage Sites in the Canadian Arctic,” focused on the creation of a virtual museum website dedicated to one of the most important polar heritage sites in the Canadian Arctic. This website was developed under a grant from the Virtual Museum of Canada and enables visitors to explore the fort’s site from 1875 to the present: http://fortconger.org. Heritage sites are often viewed as static places. A goal of this project was to present the change that occurred over time as each new group of explorers adapted the site to their needs. The site was first visited by British explorer George Nares in 1875–76. Only later did the United States government select this site for the Lady Franklin Bay Expedition (1881–84), with research to be conducted under the FIPY (1882–83). Still later, Robert Peary and Matthew Henson attempted to reach the North Pole from Ft. Conger in 1899, 1905, and 1908. A central focus of this research is the virtual reconstruction of Ft. Conger. In the summer of 2010, a Zoller+Fröhlich Imager 5006i and a Minolta Vivid 910 laser scanner were used to scan terrain and artifacts. Once the scanning was completed, the point clouds were registered and edited to form the basis of a virtual reconstruction. A goal of this project has been to allow visitors to step back in time and explore the interior of these buildings with all of their artifacts. Links to text, historic documents, animations, panorama images, computer games, and virtual labs provide explanations of how science was conducted during the 19th century. A major feature of this virtual world is the timeline. Visitors to the website can begin exploring the site as it was when George Nares, in his ship HMS Discovery, appeared in the harbor in 1875. With the arrival of Lt. Greely’s expedition in 1881, visitors can track the progress made in establishing a scientific outpost. Still later, in 1901, with Peary’s presence, the site is transformed again, the huts having been built from materials salvaged from Greely’s main building. Finally, in 2010, visitors can see the site in its present state of deterioration and learn about the laser scanning technology used to document it. The Science and Survival at Fort Conger project represents one of the first attempts to use virtual worlds to communicate the historical and scientific significance of polar heritage sites where opportunities for first-hand visitor experiences are not possible because of remote location.
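The abstract does not say how the point clouds were registered; as a toy illustration of the core alignment step only, the sketch below fits a rigid transform between two synthetic point sets with known correspondences (the Kabsch algorithm). Production scan registration also requires correspondence search and robust outlier handling.

```python
# Toy rigid registration of two point clouds with known correspondences
# (Kabsch algorithm) -- a vastly simplified stand-in for real scan
# registration of terrestrial laser scanner data.
import numpy as np

def kabsch(P: np.ndarray, Q: np.ndarray):
    """Return rotation R and translation t minimising ||R @ p + t - q||."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cq - R @ cp

rng = np.random.default_rng(1)
P = rng.normal(size=(100, 3))                    # "scan A" points
theta = np.pi / 6                                # known test rotation
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([0.5, -0.2, 1.0])    # "scan B" points

R, t = kabsch(P, Q)
print("alignment error:", np.abs(P @ R.T + t - Q).max())   # ~0 for clean data
```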

Keywords: 3D imaging, multimedia, virtual reality, arctic

Procedia PDF Downloads 419
77 Deciphering Information Quality: Unraveling the Impact of Information Distortion in the UK Aerospace Supply Chains

Authors: Jing Jin

Abstract:

The incorporation of artificial intelligence (AI) and machine learning (ML) in aircraft manufacturing and aerospace supply chains leads to the generation of a substantial amount of data among various tiers of suppliers and OEMs. Identifying high-quality information within these data challenges decision-makers. The application of AI/ML models necessitates access to 'high-quality' information to yield the desired outputs. However, the process of information sharing introduces complexities, including distortion through various communication channels and biases introduced by both human and AI entities. This phenomenon significantly influences the quality of information, affecting decision-makers engaged in configuring supply chain systems. Traditionally, distorted information is categorized as 'low-quality'; however, this study challenges that perception, positing that distorted information that contributes to stakeholder goals can be deemed high-quality within supply chains. The main aim of this study is to identify and evaluate the dimensions of information quality crucial to the UK aerospace supply chain. Guided by a central research question, "What information quality dimensions are considered when defining information quality in the UK aerospace supply chain?", the study delves into the intricate dynamics of information quality in the aerospace industry. Additionally, the research explores the nuanced impact of information distortion on stakeholders' decision-making processes, addressing the question, "How does the information distortion phenomenon influence stakeholders’ decisions regarding information quality in the UK aerospace supply chain system?" This study employs deductive methodologies rooted in positivism, utilizing a cross-sectional approach and a single quantitative method, a questionnaire survey. Data are systematically collected from diverse tiers of supply chain stakeholders, encompassing end-customers, OEMs, Tier 0.5, Tier 1, and Tier 2 suppliers. Employing robust statistical data analysis methods, including mean values, mode values, standard deviation, one-way analysis of variance (ANOVA), and Pearson’s correlation analysis, the study interprets and extracts meaningful insights from the gathered data. Initial analyses challenge conventional notions, revealing that information distortion positively influences the definition of information quality, disrupting the established perception of distorted information as inherently low-quality. Further exploration through correlation analysis unveils the varied perspectives of different stakeholder tiers on the impact of information distortion on specific information quality dimensions. For instance, Tier 2 suppliers demonstrate strong positive correlations between information distortion and dimensions like access security, accuracy, interpretability, and timeliness. Conversely, Tier 1 suppliers emphasise strong negative influences on the security of accessing information and negligible impact on information timeliness. Tier 0.5 suppliers showcase very strong positive correlations with dimensions like conciseness and completeness, while OEMs exhibit limited interest in considering information distortion within the supply chain. Introducing social network analysis (SNA) provides a structural understanding of the relationships between information distortion and quality dimensions. The moderately high density of the 'information distortion-by-information quality' network underscores the interconnected nature of these factors.
In conclusion, this study offers a nuanced exploration of information quality dimensions in the UK aerospace supply chain, highlighting the significance of individual perspectives across different tiers. The positive influence of information distortion challenges prevailing assumptions, fostering a more nuanced understanding of information's role in the Industry 4.0 landscape.
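As a purely illustrative sketch of the per-tier correlation step (the tier labels are taken from the abstract, but the survey items and data below are invented stand-ins for the study's instrument):

```python
# Illustrative per-tier Pearson correlations between a perceived-distortion
# score and ratings of information quality dimensions; all data are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 120
df = pd.DataFrame({
    "tier": rng.choice(["OEM", "Tier 0.5", "Tier 1", "Tier 2"], size=n),
    "distortion": rng.normal(size=n),
})

# Fake quality-dimension ratings, loosely coupled to the distortion score.
dims = ["accuracy", "timeliness", "conciseness", "access_security"]
for dim in dims:
    df[dim] = 0.4 * df["distortion"] + rng.normal(scale=1.0, size=n)

# Pearson correlation of each dimension with distortion, computed per tier.
for tier, grp in df.groupby("tier"):
    corrs = grp[dims].corrwith(grp["distortion"])
    print(tier, corrs.round(2).to_dict())
```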

Keywords: information distortion, information quality, supply chain configuration, UK aerospace industry

Procedia PDF Downloads 64
76 Effects of School Culture and Curriculum on Gifted Adolescent Moral, Social, and Emotional Development: A Longitudinal Study of Urban Charter Gifted and Talented Programs

Authors: Rebekah Granger Ellis, Pat J. Austin, Marc P. Bonis, Richard B. Speaker, Jr.

Abstract:

Using two psychometric instruments, this study examined the social and emotional intelligence and moral judgment levels of more than 300 gifted and talented high school students enrolled in arts-integrated, academic acceleration, and creative arts charter schools in an ethnically diverse large city in the southeastern United States. Gifted and talented individuals possess distinguishable characteristics; these frequently appear as strengths, but serious problems often accompany them. Although many gifted adolescents thrive in their environments, some struggle in their school and community due to emotional intensity, motivation and achievement issues, lack of peers and isolation, identification problems, sensitivity to expectations and feelings, perfectionism, and other difficulties. These gifted students endure and survive in school rather than flourish. Gifted adolescents face special intrapersonal, interpersonal, and environmental problems. Furthermore, they experience greater levels of stress, disaffection, and isolation than non-gifted individuals due to their advanced cognitive abilities. Therefore, it is important to examine the long-term effects of participation in various gifted and talented programs on the socio-affective development of these adolescents. Numerous studies have researched moral, social, and emotional development in the areas of cognitive-developmental, psychoanalytic, and behavioral learning; however, in almost all cases, these three facets have been studied separately, leading to many divergent theories. Additionally, various frameworks and models purporting to encourage the different socio-affective branches of development have been debated in curriculum theory, yet research is inconclusive on the effectiveness of these programs. Most often studied is the socio-affective domain, which includes development and regulation of emotions; empathy development; interpersonal relations and social behaviors; personal and gender identity construction; and moral development, thinking, and judgment. Examining development in these domains can provide insight into why some gifted and talented adolescents are not always successful in adulthood despite advanced IQ scores, particularly whether the emotional, social, and moral capabilities of gifted and talented individuals are as advanced as their intellectual abilities and how these are related to each other. This mixed methods longitudinal study examined students in urban gifted and talented charter schools for (1) socio-affective development levels and (2) whether a particular environment encourages developmental growth. Research questions guiding the study: (1) How do academically and artistically gifted 10th and 11th grade students perform on psychological scales of social and emotional intelligence and moral judgment? Do they differ from the normative sample? Do gender differences exist among gifted students? (2) Do adolescents who attend distinctive gifted charter schools differ in developmental profiles? Students’ performances on psychometric instruments were compared over time and by program type. Assessing moral judgment (DIT-2) and socio-emotional intelligence (BarOn EQ-i:YV), participants took pre-, mid-, and post-tests during one academic school year. Quantitative differences in growth on these psychological scales (individual and school-wide) were examined. If a school showed change, qualitative artifacts (culture, curricula, instructional methodology, stakeholder interviews) provided insight into environmental correlates.
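As a minimal illustration of the pre-/post-growth comparison described above, assuming a DIT-2-like score scale and synthetic data (not study results):

```python
# Illustrative pre-/post-test growth comparison on a DIT-2-like score;
# the numbers are synthetic placeholders, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
pre = rng.normal(loc=30.0, scale=8.0, size=300)        # start-of-year scores
post = pre + rng.normal(loc=2.0, scale=5.0, size=300)  # end-of-year scores

t, p = stats.ttest_rel(post, pre)                      # paired comparison
d = (post - pre).mean() / (post - pre).std(ddof=1)     # within-subject effect size
print(f"mean growth = {(post - pre).mean():.2f}, t = {t:.2f}, p = {p:.2e}, d = {d:.2f}")
```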

Keywords: gifted and talented programs, moral judgment, social and emotional intelligence, socio-affective education

Procedia PDF Downloads 192
75 Glocalization of Journalism and Mass Communication Education: Best Practices from an International Collaboration on Curriculum Development

Authors: Bellarmine Ezumah, Michael Mawa

Abstract:

Glocalization is often defined as the practice of conducting business according to both local and global considerations; this epitomizes the curriculum co-development collaboration between a journalism and mass communications professor from a university in the United States and Uganda Martyrs University in Uganda, where a brand-new journalism and mass communications program was recently co-developed. This paper presents the experiences and research results of this initiative, which was funded through the Institute of International Education (IIE) under the umbrella of the Carnegie African Diaspora Fellowship Program (CADFP). Vital international and national concerns were addressed. On a global level, scholars have questioned and criticized the Western model ingrained in journalism and mass communication curricula and proposed a decolonization of journalism curricula. Another major criticism is the practice of Western-based educators transplanting their curriculum verbatim to other regions of the world without paying greater attention to local needs. To address these two global concerns, an extensive assessment of local needs was conducted prior to the conceptualization of the new program. The assessment of needs adopted a participatory action model and captured the knowledge and narratives of both internal and external stakeholders. This involved a review of pertinent documents, including the nation’s constitution and governmental briefs and promulgations, interviews with governmental officials, media and journalism educators, media practitioners, and students, and benchmarking the curriculum against those of other tertiary institutions in the nation. Information gathered through this process served as the blueprint and frame of reference for all design decisions. In the area of local needs, key factors were addressed. First, most media personnel in Uganda are both academically and professionally unqualified. Second, the practitioners with academic training were found to be lacking in experience. Third, the current curricula offered at several tertiary institutions are not comprehensive and lack local relevance. The project addressed these problems thus: first, the program was designed to cater to both traditional and non-traditional students, offering opportunities for unqualified media practitioners to get formal training through evening and weekend programs. Second, the challenge of inexperienced graduates was mitigated by designing the program to adopt an experiential learning approach, which many refer to as the 'Teaching Hospital Model'. This entails integrating practice with theory, similar to the way medical students engage in hands-on practice under the supervision of a mentor. The university drew up a Memorandum of Understanding (MoU) with reputable media houses for students and faculty to use their studios for hands-on experience and for seasoned media practitioners to guest-teach some courses. With the convergence of functions in today’s media industry, graduates should be trained to have adequate knowledge of other disciplines; therefore, the curriculum integrated cognate courses that render graduates versatile. Ultimately, this research serves as a template for African colleges and universities to follow in their quest to glocalize their curricula.
While the general concept of journalism may remain Western, journalism curriculum developers in Africa, through extensive assessment of needs and a focus on those needs and other societal particularities, can adjust the Western model to fit their local needs.

Keywords: curriculum co-development, glocalization of journalism education, international journalism, needs assessment

Procedia PDF Downloads 129
74 Investigation of Software Integration for Simulations of Buoyancy-Driven Heat Transfer in a Vehicle Underhood during Thermal Soak

Authors: R. Yuan, S. Sivasankaran, N. Dutta, K. Ebrahimi

Abstract:

This paper investigates the software capability and computer-aided engineering (CAE) method for modelling the transient heat transfer processes that occur in the vehicle underhood region during the vehicle thermal soak phase. Heat retained from the soak period benefits the subsequent cold start through reduced friction loss over the second 14°C worldwide harmonized light-duty vehicle test procedure (WLTP) cycle, and therefore provides benefits for both CO₂ emission reduction and fuel economy. When the vehicle undergoes the soak stage, the airflow and the associated convective heat transfer around and inside the engine bay are driven by the buoyancy effect. This effect, along with thermal radiation and conduction, is a key factor in the thermal simulation of the engine bay, needed to obtain accurate fluid and metal temperature cool-down trajectories and to predict the temperatures at the end of the soak period. Method development was investigated in this study on a light-duty passenger vehicle using a coupled aerodynamic-heat transfer transient modelling method for the full vehicle under 9 hours of thermal soak. The 3D underhood flow dynamics were solved inherently transiently by the Lattice-Boltzmann Method (LBM) using the PowerFLOW software. This was further coupled with heat transfer modelling using the PowerTHERM software provided by Exa Corporation. The particle-based LBM method is capable of accurately handling extremely complicated transient flow behavior on complex surface geometries. The detailed thermal modelling, including heat conduction, radiation, and buoyancy-driven heat convection, was solved in an integrated manner by PowerTHERM. The 9-hour cool-down period was simulated and compared with vehicle testing data for the key fluid (coolant, oil) and metal temperatures. The developed CAE method was able to predict the cool-down behaviour of the key fluids and components in agreement with the experimental data and also visualised the air leakage paths and thermal retention around the engine bay. The cool-down trajectories of the key components obtained for the 9-hour thermal soak period provide vital information and a basis for the further development of reduced-order modelling studies in future work. This allows a fast-running model to be developed and further embedded within a holistic study of vehicle energy modelling and thermal management. It was also found that the buoyancy effect plays an important part in the first stage of the 9-hour soak and that the flow development during this stage is vital to accurately predicting the heat transfer coefficients for the heat retention modelling. The developed method has demonstrated the software integration for simulating buoyancy-driven heat transfer in a vehicle underhood region during thermal soak with satisfying accuracy and efficient computing time. The CAE method developed will allow the design of engine encapsulations for improving fuel consumption and reducing CO₂ emissions to be integrated in a timely and robust manner, aiding the development of low-carbon transport technologies.
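To give a flavour of the reduced-order direction mentioned above, here is a minimal lumped-capacitance (Newton cooling) sketch of a single-component cool-down; all parameter values are invented placeholders rather than the vehicle's calibrated data:

```python
# Minimal lumped-capacitance cool-down model of the kind a reduced-order
# study might start from: T(t) = T_amb + (T0 - T_amb) * exp(-h*A/(m*c) * t).
# All parameter values below are invented placeholders.
import numpy as np

T_amb = 14.0          # soak ambient temperature, deg C (WLTP ATCT condition)
T0 = 90.0             # coolant temperature at key-off, deg C
h_A = 6.0             # effective heat transfer coefficient * area, W/K
m_c = 40_000.0        # effective thermal mass, J/K

t = np.linspace(0.0, 9 * 3600.0, 10)             # 9-hour soak period
T = T_amb + (T0 - T_amb) * np.exp(-h_A / m_c * t)
for ti, Ti in zip(t / 3600.0, T):
    print(f"t = {ti:4.1f} h   T = {Ti:5.1f} degC")
```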

Keywords: ATCT/WLTC driving cycle, buoyancy-driven heat transfer, CAE method, heat retention, underhood modeling, vehicle thermal soak

Procedia PDF Downloads 153
73 Neologisms and Word-Formation Processes in Board Game Rulebook Corpus: Preliminary Results

Authors: Athanasios Karasimos, Vasiliki Makri

Abstract:

This research focuses on the design and development of the first text corpus based on board game rulebooks (BGRC), with direct application to the morphological analysis of neologisms and tendencies in word-formation processes. Corpus linguistics is a dynamic field that examines language through the lens of vast collections of texts. These corpora consist of diverse written and spoken materials, ranging from literature and newspapers to transcripts of everyday conversations. By morphologically analyzing these extensive datasets, morphologists can gain valuable insights into how language functions and evolves, as such datasets reflect the byproducts of inflection, derivation, blending, clipping, compounding, and neology. This entails scrutinizing how words are created, modified, and combined to convey meaning in a corpus of challenging, creative, and straightforward texts that include rules, examples, tutorials, and tips. Board games teach players how to strategize, consider alternatives, and think flexibly, which are critical elements in language learning. Their rulebooks reflect not only their weight (complexity) but also the language properties of each genre and subgenre of these games. Board games are a captivating realm where strategy, competition, and creativity converge. Beyond the excitement of gameplay, board games also spark the art of word creation. Word games, like Scrabble, Codenames, Bananagrams, Wordcraft, Alice in the Wordland, and Once Upon a Time, challenge players to construct words from a pool of letters, thus encouraging linguistic ingenuity and vocabulary expansion. These games foster a love for language, motivating players to unearth obscure words and devise clever combinations. The designers and creators, in turn, produce rulebooks in which they express the joy of discovering the hidden potential of language, igniting the imagination and playing with the beauty of words, making these games a delightful fusion of linguistic exploration and leisurely amusement. In this research, more than 150 rulebooks in English from all types of modern board games, either language-independent or language-dependent, are used to create the BGRC. A representative sample of each genre (family, party, worker placement, deckbuilding, dice and chance games, strategy, eurogames, thematic, and role-playing, among others) was selected based on the score from BoardGameGeek, the size of the texts, and the level of complexity (weight) of the game. A morphological model with morphological networks, multi-word expressions, and word-creation mechanics based on the complexity of the textual structure, difficulty, and board game category will be presented. By enabling the identification of patterns, trends, and variations in word formation and other morphological processes, this research aspires to make use of this creative yet strict text genre so as to (a) give invaluable insight into the morphological creativity and innovation that (re)shape the lexicon of the English language and (b) test morphological theories. Overall, it is shown that corpus linguistics empowers us to explore the intricate tapestry of language, and morphology in particular, revealing its richness, flexibility, and adaptability in the ever-evolving landscape of human expression.
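As a toy illustration of how neologism candidates might be surfaced from rulebook text, assuming a simple lexicon-lookup heuristic (the excerpt, mini-lexicon, and method below are invented, not the BGRC pipeline):

```python
# Toy neologism-candidate extraction: tokens in a rulebook excerpt that are
# absent from a reference lexicon. The excerpt and mini-lexicon are invented;
# a real pipeline would use a full dictionary plus lemmatization.
import re
from collections import Counter

lexicon = {"each", "player", "draws", "two", "cards", "then", "places",
           "a", "worker", "on", "an", "action", "space", "and", "gains",
           "one", "token", "per", "built"}

rulebook_text = """Each player draws two cards, then places a worker on an
action space and gains one terraforming token per geodome built."""

tokens = re.findall(r"[a-z]+", rulebook_text.lower())
candidates = Counter(tok for tok in tokens if tok not in lexicon)
print(candidates.most_common())   # 'terraforming' and 'geodome' surface here
```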

Keywords: board game rulebooks, corpus design, morphological innovations, neologisms, word-formation processes

Procedia PDF Downloads 97
72 Development and Clinical Application of a Cochlear Implant Mapping Assistance System

Authors: Hong Mengdi, Li Jianan, Ji Fei, Chen Aiting, Wang Qian

Abstract:

Objective: To overcome the communication barriers that audiologists encounter during cochlear implant mapping, particularly the challenge of eliciting subjective feedback from recipients regarding electrical stimulation, and to enhance the capabilities of existing technologies, we teamed up with software engineers to design an interactive approach for patient-audiologist communication. This approach employs a tablet (PAD) as the interface for a communication and feedback system between patients and audiologists during the mapping process, known as the Cochlear Implant Mapping Assistance System. Methods: Capitalizing on the touchscreen functionality of the PAD, the recipients' subjective feedback during cochlear implant mapping is instantly transmitted to the audiologist's mapping computer. The system acts as a platform for auditory assessment instruments, facilitating immediate evaluation of recipients' post-mapping hearing and speech discrimination capabilities. Furthermore, the system is designed to augment the visual reinforcement audiometry (VRA) process. The system consists of six modules, including three testing projects: loudness testing, hearing threshold testing, and loudness balance testing; two assessment projects: warble tone testing and digit speech testing; and one VRA animation project. It also incorporates speech-to-text and text input display functions tailored to accommodate speech communication difficulties in hearing-impaired individuals, with pre-installed common exchange content between audiologists and recipients. Audiologists can input sentences by selecting options. The system supports switching between Chinese and English versions, suitable for audiologists and recipients who use English, facilitating international application of the system. Results: The Cochlear Implant Mapping Assistance System has been in use for over a year in the Auditory Implant Center of the Department of Otology and Neurotology, Medical Center of Otology and Head & Neck Surgery, Chinese PLA General Hospital, with more than 300 recipients using this mapping system. Currently, the system operates stably, with both audiologists and recipients providing positive feedback, indicating a significant improvement over previous methods. It is particularly well-received by pediatric recipients, significantly enhancing the work efficiency of audiologists and improving the feedback efficiency and accuracy of recipients. The system enhances the comprehensibility for cochlear implant recipients, improves wearing comfort and user experience, facilitates cochlear implant auditory mapping, and increases the collection of previously challenging-to-obtain data during the existing assisted mapping process, such as loudness testing data, electrical stimulation testing data, warble tone testing data, loudness balance testing data, digit speech testing data, and visual reinforcement audiometry testing data. Real-time data recording improves the accuracy of assisted mapping. The interface design is meticulously crafted to accommodate patients of varying ages and cognitive abilities, featuring an intuitive design that allows for effortless, guidance-free use by patients.

Keywords: audiologist, subjective feedback, mapping, cochlear implant

Procedia PDF Downloads 20
71 Understanding the Impact of Spatial Light Distribution on Object Identification in Low Vision: A Pilot Psychophysical Study

Authors: Alexandre Faure, Yoko Mizokami, Éric Dinet

Abstract:

In recent years, the potential of light to assist visually impaired people in their indoor mobility has been demonstrated by several studies. Implementing smart lighting systems for selective visual enhancement, designed especially for low-vision people, is an approach that breaks with existing visual aids. The appearance of the surface of an object is significantly influenced by the lighting conditions and the constituent materials of the object, so objects may appear different from expectation. Lighting conditions therefore play an important part in accurate material recognition. The main objective of this work was to investigate the effect of the spatial distribution of light on object identification in the context of low vision. The purpose was to determine whether, and which, specific lighting approaches should be preferred for visually impaired people. A psychophysical experiment was designed to study the ability of individuals to identify the smaller cube of a pair under different lighting diffusion conditions. Participants were divided into two distinct groups: a reference group of observers with normal or corrected-to-normal visual acuity, and a test group, in which observers were required to wear visual impairment simulation glasses. All participants were presented with pairs of cubes in a "miniature room" and were instructed to estimate the relative size of the two cubes. The miniature room replicates real-life settings, adorned with decorations and separated from external light sources by black curtains. The correlated color temperature was set to 6000 K, and the horizontal illuminance at the object level to approximately 240 lux. The objects presented for comparison consisted of 11 white cubes and 11 black cubes of different sizes manufactured with a 3D printer. Participants were seated 60 cm away from the objects. Two different levels of light diffuseness were implemented. After receiving instructions, participants were asked to judge whether the two presented cubes were the same size or whether one was smaller. They provided one of five possible answers: "Left one is smaller," "Left one is smaller but unsure," "Same size," "Right one is smaller," or "Right one is smaller but unsure." The method of constant stimuli was used, presenting stimulus pairs in random order to prevent learning and expectation biases. Each pair consisted of a comparison stimulus and a reference cube. A psychometric function was constructed to link stimulus value with the frequency of correct detection, aiming to determine the 50% correct detection threshold. Collected data were analyzed through graphs illustrating participants' responses to stimuli, with accuracy increasing as the size difference between cubes grew. Statistical analyses, including two-way ANOVA tests, showed that light diffuseness had no significant impact on the difference threshold, whereas object color had a significant influence in low-vision scenarios. The first results and trends derived from this pilot experiment strongly suggest that future investigations could explore extreme diffusion conditions to comprehensively assess the impact of diffusion on object identification. For example, the first findings related to light diffuseness may be attributable to the range of manipulation, emphasizing the need to explore how other lighting-related factors interact with diffuseness.
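The threshold-estimation step can be made concrete with a small sketch: a logistic psychometric function is fitted to constant-stimuli proportions and the 50% point read off. The response data below are invented, not the experiment's:

```python
# Fit a logistic psychometric function to constant-stimuli data and read off
# the 50% correct-detection threshold; the proportions below are invented.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, alpha, beta):
    """Logistic P(correct) as a function of size difference x (mm)."""
    return 1.0 / (1.0 + np.exp(-(x - alpha) / beta))

size_diff = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])     # mm between cubes
p_correct = np.array([0.20, 0.35, 0.55, 0.70, 0.85, 0.95])

(alpha, beta), _ = curve_fit(psychometric, size_diff, p_correct, p0=[3.0, 1.0])
print(f"50% detection threshold ~ {alpha:.2f} mm (slope parameter {beta:.2f})")
```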

Keywords: lighting, low vision, visual aid, object identification, psychophysical experiment

Procedia PDF Downloads 64
70 Shifting Contexts and Shifting Identities: Campus Race-related Experiences, Racial Identity, and Achievement Motivation among Black College Students during the Transition to College

Authors: Tabbye Chavous, Felecia Webb, Bridget Richardson, Gloryvee Fonseca-Bolorin, Seanna Leath, Robert Sellers

Abstract:

There has been recent renewed attention to Black students’ experiences at predominantly White U.S. universities (PWIs), e.g., the #BBUM (“Being Black at the University of Michigan”) and “I too am Harvard” social media campaigns and subsequent student protest activities nationwide. These campaigns illuminate how many minority students encounter challenges to their racial/ethnic identities as they enter PWI contexts. Students routinely report experiences such as being ignored or treated as a token in classes, receiving messages of low academic expectations from faculty and peers, being questioned about their academic qualifications or belonging, being excluded from academic and social activities, and being racially profiled and harassed in the broader campus community. Researchers have linked such racial marginalization and stigma experiences to student motivation and achievement. One potential mechanism is through the impact of college experiences on students’ identities, given the relevance of the college context for students’ personal identity development, including personal belief systems around social identities salient in this context. However, little research examines the impact of the college context on Black students’ racial identities. This study examined change in Black college students’ (N = 329) racial identity beliefs over the freshman year at three predominantly White U.S. universities. Using cluster analyses, we identified profile groups reflecting different patterns of stability and change in students’ racial centrality (importance of race to overall self-concept), private regard (personal group affect/group pride), and public regard (perceptions of societal views of Blacks) from beginning of year (Time 1) to end of year (Time 2). Multinomial logit regression analyses indicated that the racial identity change clusters were predicted by pre-college background (racial composition of high school and neighborhood) as well as college-based experiences (racial discrimination, interracial friendships, and perceived campus racial climate). In particular, experiencing campus racial discrimination related to high, stable centrality and to decreases in private regard and public regard. Perceiving campus racial climate norms of institutional support for intergroup interactions related to maintaining low, and decreasing, private and public regard. Multivariate analysis of variance results showed change-cluster effects on achievement motivation outcomes at the end of the academic year. Having high, stable centrality and high private regard related to more positive outcomes overall (academic competence, positive academic affect, academic curiosity, and persistence). Students decreasing in private regard and public regard were particularly vulnerable to negative motivation outcomes. Findings support scholarship indicating both stability in racial identity beliefs and the importance of critical context transitions in racial identity development and adjustment outcomes among emerging adults. Findings are also consistent with research suggesting promotive effects of a strong, positive racial identity on student motivation, as well as research linking awareness of racial stigma to decreased academic engagement.
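The abstract names multinomial logit regression as the predictor of cluster membership; the sketch below shows that step in miniature with synthetic stand-in data (the predictors and labels are placeholders, not study data):

```python
# Illustrative multinomial logit predicting identity-change cluster membership
# from pre-college and campus predictors; all data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 329                                   # sample size matching the abstract
X = np.column_stack([
    rng.normal(size=n),                   # high school racial composition (standardized)
    rng.normal(size=n),                   # campus discrimination experiences
    rng.normal(size=n),                   # interracial friendships
])
cluster = rng.integers(0, 4, size=n)      # stand-in change-cluster labels

# scikit-learn uses a multinomial formulation for multiclass labels here.
model = LogisticRegression(max_iter=1000).fit(X, cluster)
print("per-cluster coefficients:\n", model.coef_.round(2))
print("predicted cluster probabilities for one student:",
      model.predict_proba(X[:1]).round(2))
```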

Keywords: diversity, motivation, learning, ethnic minority achievement, higher education

Procedia PDF Downloads 517
69 Critiquing Israel as Child Abuse: How Colonial White Feminism Disrupts Critical Pedagogies of Culturally Responsive and Relevant Practices and Inclusion through Ongoing and Historical Maternalism and Neoliberal Settler Colonialism

Authors: Wafaa Hasan

Abstract:

In May of 2022, Palestinian parents in Toronto, Canada, became aware that educators and staff in the Toronto District School Board (TDSB), the largest school board in Canada, were attempting to include the International Holocaust Remembrance Alliance (IHRA) definition of antisemitism in the board’s Child Abuse and Neglect Policy. The idea was that if students were to express any form of antisemitism, as defined by the IHRA, then an investigation could follow with Child Protective Services (CPS). That is, a student’s parents could be reported to the state and investigated with respect to custodial rights over their children. The TDSB has set apparent goals for “Decolonizing Pedagogy” (“TDSB Equity Leadership Competencies”), Culturally Responsive and Relevant Practices (CRRP), and inclusive education. These goals promote the centering of colonized, racialized, and marginalized voices. CRRP cannot be effective without the application of anti-racist and settler colonial analyses. For CRRP to be effective, school boards need a comprehensive understanding of the ways in which the vilification of Palestinians operates through anti-Indigenous and white supremacist systems and logics. Otherwise, their inclusion will always be in tension with the inclusion of settler colonial agendas and worldviews. Feminist maternalism frames racial mothering as degenerate (viewing the contributions of racialized students and their parents as products of primitive and violent cultures) and also indirectly inhibits the actualization of the tenets of CRRP and inclusive education through its extensions into the welfare state and public education. The contradiction between the tenets of CRRP and settler colonial systems of erasure and repression is resolved by the continuation of tactics to (1) force assimilation, (2) punish those who push back against that assimilation, and (3) literally fragment the familial and community structures of racialized students, educators, and parents. This paper draws on interdisciplinary (history, philosophy, anthropology) critiques of white feminist “maternalism” from the 19th century onwards in North America and Europe (Jacobs, Weber), as well as “anti-racist education” theory (Dei) and, more specifically, “culturally responsive learning” (Muhammad) and “bandwidth” pedagogy theory (Verschelden), to make its claims. This research contributes to vibrant debates about anti-racist and decolonial pedagogies in public education systems globally. This paper also documents first-hand interviews with and experiences of diasporic Palestinian mothers and motherhoods and situates their experiences within longstanding histories of white feminist maternalist (and eugenicist) politics. This informal qualitative data from “participatory conversations” (Swain) is situated within a set of formal interview data collected with Palestinian women in the West Bank (approved by the McMaster University Humanities Research Ethics Board) relating to white feminist maternalism in the peace and dialogue industry.

Keywords: decolonial feminism, maternal feminism, anti-racist pedagogies, settler colonial studies, motherhood studies, pedagogy theory, cultural theory

Procedia PDF Downloads 73
68 Improved Elastoplastic Bounding Surface Model for the Mathematical Modeling of Geomaterials

Authors: Andres Nieto-Leal, Victor N. Kaliakin, Tania P. Molina

Abstract:

The nature of most engineering materials is quite complex. It is, therefore, difficult to devise a general mathematical model that will cover all possible ranges and types of excitation and behavior of a given material. As a result, the development of mathematical models is based upon simplifying assumptions regarding material behavior. Such simplifications result in some material idealization; for example, one of the simplest material idealizations is to assume that the material behavior is elastic. However, soils are nonhomogeneous, anisotropic, path-dependent materials that exhibit nonlinear stress-strain relationships, changes in volume under shear, dilatancy, as well as time-, rate-, and temperature-dependent behavior. Over the years, many constitutive models, possessing different levels of sophistication, have been developed to simulate the behavior of geomaterials, particularly cohesive soils. Early in the development of constitutive models, it became evident that elastic or standard elastoplastic formulations, employing purely isotropic hardening and predicated on the existence of a yield surface surrounding a purely elastic domain, were incapable of realistically simulating the behavior of geomaterials. Accordingly, more sophisticated constitutive models have been developed; one example is bounding surface elastoplasticity. The essence of the bounding surface concept is the hypothesis that plastic deformations can occur for stress states either within or on the bounding surface. Thus, unlike classical yield surface elastoplasticity, plastic states are not restricted to those lying on a surface. Elastoplastic bounding surface models have been improved over time; however, there is still a need to improve their capability to simulate the response of anisotropically consolidated cohesive soils, especially the response in extension tests. Thus, in this work an improved constitutive model was developed that can more accurately predict the diverse stress-strain phenomena exhibited by cohesive soils; in particular, it incorporates an improved rotational hardening rule that better simulates the response of cohesive soils in extension. The generalized definition of the bounding surface model provides a convenient and elegant framework for unifying various previous versions of the model for anisotropically consolidated cohesive soils. The Generalized Bounding Surface Model for cohesive soils is a fully three-dimensional, time-dependent model that accounts for both inherent and stress-induced anisotropy, employing a non-associative flow rule. The numerical implementation of the model in a computer code followed an adaptive multistep integration scheme in conjunction with local iteration and radial return. The one-step trapezoidal rule was used to obtain the stiffness matrix that defines the relationship between the stress increment and the strain increment. Extensive comparisons of model simulations to experimental data show that the model gives quite good simulations. The new model successfully simulates the response of different cohesive soils, for example, Cardiff Kaolin, Spestone Kaolin, and Lower Cromer Till. The simulated undrained stress paths, stress-strain responses, and excess pore pressures are in very good agreement with the experimental values, especially in extension.
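To make the radial-return step concrete, here is a deliberately simplified sketch of the elastic-predictor/plastic-corrector pattern for one-dimensional linear-hardening von Mises plasticity; it stands in for, and is far simpler than, the adaptive multistep bounding surface integration described above, and all material constants are invented:

```python
# Toy radial-return stress update for 1D linear-hardening von Mises
# plasticity -- a vastly simplified stand-in for the bounding surface
# integration scheme described in the abstract. Constants are invented.
E, H, sigma_y = 200e3, 10e3, 250.0      # MPa: modulus, hardening, yield stress

def radial_return(sigma, alpha, d_eps):
    """Return updated (stress, accumulated plastic strain) for a strain step."""
    sigma_trial = sigma + E * d_eps                     # elastic predictor
    f_trial = abs(sigma_trial) - (sigma_y + H * alpha)  # trial yield function
    if f_trial <= 0.0:                                  # purely elastic step
        return sigma_trial, alpha
    d_gamma = f_trial / (E + H)                         # plastic corrector
    sign = 1.0 if sigma_trial > 0 else -1.0
    return sigma_trial - E * d_gamma * sign, alpha + d_gamma

sigma, alpha = 0.0, 0.0
for _ in range(100):                                    # monotonic loading
    sigma, alpha = radial_return(sigma, alpha, 5e-5)
print(f"stress: {sigma:.1f} MPa, accumulated plastic strain: {alpha:.2e}")
```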

Keywords: bounding surface elastoplasticity, cohesive soils, constitutive model, modeling of geomaterials

Procedia PDF Downloads 315