Search results for: sensor monitoring
445 Lessons Learned from Push-Plus Implementation in Northern Nigeria
Authors: Aisha Giwa, Mohammed-Faosy Adeniran, Olufunke Femi-Ojo
Abstract:
Four decades ago, the World Health Organization (WHO) launched the Expanded Programme on Immunization (EPI). The EPI blueprint laid out the technical and managerial functions necessary to routinely vaccinate children with a limited number of vaccines, providing protection against diphtheria, tetanus, whooping cough, measles, polio, and tuberculosis, and to prevent maternal and neonatal tetanus by vaccinating women of childbearing age with tetanus toxoid. Despite global efforts, Routine Immunization (RI) coverage in two WHO regions, the African Region and the South-East Asia Region, still falls short of its targets. As a result, the WHO Regional Director for Africa declared 2012 the year for intensifying RI in these regions, which coincided with the declaration of polio as a programmatic emergency by the WHO Executive Board. To intensify routine immunization, the National Routine Immunization Strategic Plan (2013-2015) stated that its core priority is to ensure 100% adequacy and availability of vaccines for safe immunization. To achieve 100% availability, the “PUSH System” and then “Push-Plus” were adopted for vaccine distribution, replacing the inefficient “PULL” method. The National Primary Health Care Development Agency (NPHCDA) plays the key role in coordinating activities in the areas of advocacy, capacity building, engagement of third-party logistics (3PL) providers for the states, and monitoring and evaluation of the vaccine delivery process. eHealth Africa (eHA) is a 3PL service provider engaged by State Primary Health Care Development Boards (SPHCDB) to ensure vaccine availability through the Vaccine Direct Delivery (VDD) project, which is essential to successful routine immunization services. The VDD project ensures the availability and adequate supply of high-quality vaccines and immunization-related materials to last-mile facilities. eHA’s commitment to the VDD project prompted an assessment of overall project performance, an evaluation of the process to suggest necessary improvements, and a review of general impact across Kano State (where eHA had transitioned delivery to the state), Bauchi State (where eHA currently manages delivery to all LGAs except three managed by the state), Sokoto State (where eHA currently covers all LGAs), and Zamfara State (currently in-sourced and managed solely by the state).
Keywords: cold chain logistics, health supply chain system strengthening, logistics management information system, vaccine delivery traceability and accountability
444 A Geographical Spatial Analysis on the Benefits of Using Wind Energy in Kuwait
Authors: Obaid AlOtaibi, Salman Hussain
Abstract:
Wind energy is associated with many geographical factors, including wind speed, climate change, surface topography, and environmental impacts, as well as several economic factors, most notably the advancement of wind technology and energy prices. It is the fastest-growing and least expensive method of generating electricity. Wind energy generation is directly related to the characteristics of spatial wind. Therefore, the feasibility study for a wind energy conversion system is based on the value of the energy obtained relative to the initial investment and the cost of operation and maintenance. In Kuwait, wind energy is an appropriate choice as a source of energy generation. It can be used for groundwater extraction in agricultural areas such as Al-Abdali in the north and Al-Wafra in the south, in fresh and brackish groundwater fields, or in remote and isolated locations such as border areas and projects far from conventional electricity services, to take advantage of alternative energy, reduce pollutants, and reduce energy production costs. The study covers the State of Kuwait with the exception of the metropolitan area. Climatic data were obtained from the readings of eight distributed monitoring stations affiliated with the Kuwait Institute for Scientific Research (KISR). The data were used to assess the daily, monthly, quarterly, and annual wind energy available for utilization. The researchers applied a suitability model to carry out the analysis using the ArcGIS software. It is a spatial analysis model that compares more than one location based on grading weights to choose the most suitable one. The study criteria are the average annual wind speed, land use, land topography, distance from the main road networks, and urban areas. According to these criteria, four proposed locations for establishing wind farm projects were selected based on the weights of the degree of suitability (excellent, good, average, and poor). The areas representing the most suitable locations, with an excellent rank (4), make up 8% of Kuwait’s area, distributed as follows: Al-Shqaya, Al-Dabdeba, and Al-Salmi (5.22%), Al-Abdali (1.22%), Umm al-Hayman (0.70%), and North Wafra and Al-Shaqeeq (0.86%). The study recommends that decision-makers consider the proposed location No. 1 (Al-Shqaya, Al-Dabdaba, and Al-Salmi) as the most suitable location for future development of wind farms in Kuwait, as this location is also economically feasible.
Keywords: Kuwait, renewable energy, spatial analysis, wind energy
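As a rough illustration of the weighted-overlay suitability scoring described in this abstract, the following Python sketch combines criterion layers that have already been reclassified to the 1-4 ranks (poor to excellent). The example rasters and criterion weights are hypothetical stand-ins, not values reported by the study, which was carried out in ArcGIS.

```python
import numpy as np

# Hypothetical criterion rasters reclassified to suitability ranks 1 (poor) to 4 (excellent).
wind_speed = np.array([[4, 3], [2, 4]])
land_use   = np.array([[3, 4], [1, 4]])
slope      = np.array([[4, 2], [3, 4]])
road_dist  = np.array([[3, 3], [2, 4]])

# Assumed criterion weights (must sum to 1); the study does not report the actual weights.
weights = {"wind": 0.4, "land": 0.2, "slope": 0.2, "road": 0.2}

suitability = (weights["wind"] * wind_speed +
               weights["land"] * land_use +
               weights["slope"] * slope +
               weights["road"] * road_dist)

# Round the weighted score back to the 1-4 suitability classes used in the study.
classes = np.clip(np.rint(suitability), 1, 4).astype(int)
print(classes)
```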
443 A Semi-Automated GIS-Based Implementation of Slope Angle Design Reconciliation Process at Debswana Jwaneng Mine, Botswana
Authors: K. Mokatse, O. M. Barei, K. Gabanakgosi, P. Matlhabaphiri
Abstract:
The mining of pit slopes is often associated with some level of deviation from design recommendations, and this may translate to associated changes in the stability of the excavated pit slopes. Therefore, slope angle design reconciliations are essential for assessing and monitoring compliance of excavated pit slopes with accepted slope designs. These associated changes in slope stability may be reflected by changes in the calculated factors of safety and/or probabilities of failure. Reconciliations of as-mined and design slope profiles are conducted periodically to assess the implications of these deviations on pit slope stability. Currently, the slope design reconciliation process implemented at Jwaneng Mine involves the measurement of as-mined and design slope angles along vertical sections cut along the established geotechnical design section lines in the GEOVIA GEMS™ software. Bench retention is calculated as the percentage of the available catchment area, less over-mined and under-mined areas, relative to the designed catchment area. This process has proven tedious and requires considerable manual effort and time to execute. Consequently, a new semi-automated mine-to-design reconciliation approach that utilizes laser scanning and GIS-based tools is being proposed at Jwaneng Mine. This method involves high-resolution scanning of targeted bench walls, subsequent creation of 3D surfaces from point cloud data, and the derivation of slope toe lines and crest lines in the Maptek I-Site Studio software. The toe lines and crest lines are then exported to the ArcGIS software, where distance offsets between the design and actual bench toe lines and crest lines are calculated. Retained bench catchment capacity is measured as distances between the toe lines and crest lines at the same bench elevations. The assessment of the performance of the inter-ramp and overall slopes entails the measurement of excavated and design slope angles along vertical sections in the ArcGIS software. Excavated and design toe-to-toe or crest-to-crest slope angles are measured for inter-ramp stack slope reconciliations. Crest-to-toe slope angles are also measured for overall slope angle design reconciliations. The proposed approach allows for a more automated, accurate, quicker, and easier workflow for carrying out slope angle design reconciliations. This process has proved highly effective and timeous in the assessment of slope performance at Jwaneng Mine. This paper presents a newly proposed process for assessing compliance with slope angle designs for Jwaneng Mine.
Keywords: slope angle designs, slope design recommendations, slope performance, slope stability
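A minimal sketch of the geometric core of such a reconciliation is given below: computing design and as-mined slope angles from toe and crest coordinates and the horizontal offset of the excavated toe line. The coordinates are hypothetical; the actual workflow operates on Maptek I-Site Studio and ArcGIS surfaces.

```python
import math

def slope_angle(p_lower, p_upper):
    """Toe-to-crest (or toe-to-toe) slope angle in degrees from two 3D points (x, y, z)."""
    dx = p_upper[0] - p_lower[0]
    dy = p_upper[1] - p_lower[1]
    dz = p_upper[2] - p_lower[2]
    horizontal = math.hypot(dx, dy)
    return math.degrees(math.atan2(dz, horizontal))

# Hypothetical surveyed toe and crest points (easting, northing, elevation in metres).
design_toe, design_crest = (100.0, 200.0, 950.0), (112.0, 200.0, 965.0)
as_mined_toe, as_mined_crest = (101.5, 200.0, 950.0), (114.5, 200.0, 965.0)

design_angle = slope_angle(design_toe, design_crest)
as_mined_angle = slope_angle(as_mined_toe, as_mined_crest)
toe_offset = as_mined_toe[0] - design_toe[0]  # horizontal deviation of the excavated toe line

print(f"design {design_angle:.1f} deg, as-mined {as_mined_angle:.1f} deg, toe offset {toe_offset:.1f} m")
```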
442 Power Recovery from Waste Air of Mine Ventilation Fans Using Wind Turbines
Authors: Soumyadip Banerjee, Tanmoy Maity
Abstract:
The recovery of power from waste air generated by mine ventilation fans presents a promising avenue for enhancing energy efficiency in mining operations. This abstract explores the feasibility and benefits of utilizing turbine generators to capture the kinetic energy present in waste air and convert it into electrical power. By integrating turbine generator systems into mine ventilation infrastructures, the potential to harness and utilize the previously untapped energy within the waste air stream is realized. This study examines the principles underlying turbine generator technology and its application within the context of mine ventilation systems. The process involves directing waste air from ventilation fans through specially designed turbines, where the kinetic energy of the moving air is converted into rotational motion. This mechanical energy is then transferred to connected generators, which convert it into electrical power. The recovered electricity can be employed for various on-site applications, including powering mining equipment, lighting, and control systems. The benefits of power recovery from waste air using turbine generators are manifold. Improved energy efficiency within the mining environment results in reduced dependence on external power sources and associated cost savings. Additionally, this approach contributes to environmental sustainability by utilizing a previously wasted resource for power generation. Resource conservation is further enhanced, aligning with modern principles of sustainable mining practices. However, successful implementation requires careful consideration of factors such as waste air characteristics, turbine design, generator efficiency, and integration into existing mine infrastructure. Maintenance and monitoring protocols are necessary to ensure consistent performance and longevity of the turbine generator systems. While there is an initial investment associated with equipment procurement, installation, and integration, the long-term benefits of reduced energy costs and environmental impact make this approach economically viable. In conclusion, the recovery of power from waste air from mine ventilation fans using turbine generators offers a tangible solution to enhance energy efficiency and sustainability within mining operations. By capturing and converting the kinetic energy of waste air into usable electrical power, mines can optimize resource utilization, reduce operational costs, and contribute to a greener future for the mining industry.
Keywords: waste to energy, wind power generation, exhaust air, power recovery
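The recoverable power scales with the cube of the exhaust air velocity, P = 0.5 * rho * A * v^3 * Cp. A short sketch of that estimate follows; the air density, duct diameter, exhaust velocity, and power coefficient are illustrative assumptions, not measurements from the study.

```python
import math

def recoverable_power(air_density, duct_diameter, air_velocity, power_coefficient):
    """Electrical power (W) recoverable from an exhaust air stream, P = 0.5 * rho * A * v^3 * Cp."""
    swept_area = math.pi * (duct_diameter / 2) ** 2
    return 0.5 * air_density * swept_area * air_velocity ** 3 * power_coefficient

# Hypothetical figures: 1.2 kg/m^3 air, 3 m fan outlet, 15 m/s exhaust velocity, Cp = 0.3.
p_watts = recoverable_power(1.2, 3.0, 15.0, 0.3)
print(f"{p_watts / 1000:.1f} kW")  # roughly 4.3 kW for these assumed values
```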
441 Cultural and Natural Heritage Conservation by GIS Tourism Inventory System Project
Authors: Gamze Safak, Umut Arslanoglu
Abstract:
Cultural and tourism conservation and development zones and tourism centers are the boundaries declared for the purpose of protecting, using, and evaluating sectoral and planned development in areas where historical and cultural values are heavily concentrated and/or where tourism potential is high. The most rapidly changing regions in Turkey are tourism areas, especially the coastal areas. Planning these regions is not only about economic gain but also about the natural and physical environment, and it is a complex process. If the tourism sector is not well controlled, excessive use of natural resources and wrong location choices may cause damage to natural areas, historical values, and the socio-cultural structure. Since the strategic decisions taken in the environmental order and zoning plans of the Ministry of Culture and Tourism, which has the authority to plan in tourism centers and whose plans guide the physical environment, are transformed into plan decisions with spatial expression, a comprehensive evaluation of all kinds of data is required, one that follows the historical development and is based on correct and current data. In addition, the Ministry has a number of competences in tourism promotion as well as planning authority, which requires it to take part in applications involving complex analysis, such as the management and integration of the country's economic, political, social, and cultural resources. For this purpose, the Tourism Inventory System (TES) project, which consists of a series of subsystems, has been developed in order to solve complex planning and method problems in the management of site-related information. The scope of the project is based on the integration of numerical and verbal data in the regions within the jurisdiction of the Ministry and the monitoring of the historical development of urban planning studies, making the spatial data of the institution easily accessible, shareable, queryable, and traceable according to international standards. A dynamic and continuous system design has been put into practice by taking advantage of Geographical Information Systems in the planning process, so that they play a role in making the right decisions and in revealing the tools of social, economic, and cultural development and of the preservation of natural and cultural values. This paper, prepared by the TES (Tourism Inventory System) project team members, presents a study regarding the applicability of GIS to cultural and natural heritage conservation.
Keywords: cultural conservation, GIS, geographic information system, tourism inventory system, urban planning
440 The Good Form of a Sustainable Creative Learning City Based on “The Theory of a Good City Form” by Kevin Lynch
Authors: Fatemeh Moosavi, Tumelo Franck Nkoshwane
Abstract:
Peter Drucker, the renowned management guru, once said, “The best way to predict the future is to create it.” Mr. Drucker is also the man who placed human capital as the most vital resource of any institution. As such, any institution bent on creating a better future requires competent human capital, one that is able to execute with efficiency and effectiveness the objectives a society aspires to. Technology today is accelerating the rate at which many societies transition to knowledge-based societies. In this accelerated paradigm, it is imperative that those in leadership establish a platform capable of sustaining the planned future: intellectual capital. The capitalist economy going into the future will not just be sustained by dollars and cents, but by individuals who possess the creativity to enterprise, innovate, and create wealth from ideas. This calls for cities of the future to have this premise at the heart of their plans if the objective of designing sustainable and liveable future cities is to be realised. The knowledge economy, now transitioning to the creative economy, requires cities of the future to be ‘gardens’ of inspiration, to be places where knowledge, creativity, and innovation can thrive, as these instruments are becoming critical assets for creating wealth in the new economic system. Developing nations must accept that learning is a lifelong process that requires keeping abreast of change and should invest in teaching people how to keep learning. The need to continuously update one’s knowledge turns these cities into vibrant societies, where new ideas create knowledge and, in turn, enrich the quality of life of residents. Cities of the future must have as one of their objectives the ability to motivate their citizens to learn, share knowledge, evaluate the knowledge, and use it to create wealth for a just society. The five functional factors suggested by Kevin Lynch (vitality, meaning/sense, adaptability, access, and control and monitoring) should form the basis on which policymakers and urban designers build their plans for future cities. The authors of this paper believe that developing nations need “creative economy clusters”, cities where creative industries drive the need for constant new knowledge, creating sustainable learning creative cities. Obviously, the form, shape, and size of these districts should be cognisant of the environmental, cultural, and economic characteristics of each locale. Gaborone City in the Republic of Botswana is presented as the case study for this paper.
Keywords: learning city, sustainable creative city, creative industry, good city form
439 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets
Authors: Ece Cigdem Mutlu, Burak Alakent
Abstract:
Maintaining the quality of manufactured products at a desired level depends on the stability of process dispersion and location parameters and on the detection of perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring to monitor the quality of products and control process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of process dispersion and location parameters, respectively, under the assumption of independent and normally distributed data. On the other hand, there is no guarantee that real-world data will be normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g., occasional outliers in the rational subgroups of the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in the detection of inferior products in Phase II. For a more efficient application of control charts, it is necessary to use estimators that are robust against contaminations, which may exist in Phase I. In the current study, we present a simple approach to construct robust Xbar control charts using the average distance to the median, the Qn estimator of scale, and the M-estimator of scale with logistic psi-function in the estimation of the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator, and M-estimators of location with Huber and logistic psi-functions in the estimation of the process location parameter. The Phase I efficiency of the proposed estimators and the Phase II performance of Xbar charts constructed from these estimators are compared with the conventional mean and standard deviation statistics both under normality and against diffuse-localized and symmetric-asymmetric contaminations using 50,000 Monte Carlo simulations in MATLAB. Consequently, it is found that robust estimators yield parameter estimates with higher efficiency against all types of contaminations, and Xbar charts constructed using robust estimators have higher power in detecting disturbances, compared to conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups and employing different combinations of dispersion and location estimators on subgroups and individual observations are found to improve the performance of Xbar charts.
Keywords: average run length, M-estimators, quality control, robust estimators
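A minimal sketch of the idea behind a robust Xbar chart follows. For brevity it uses the Hodges-Lehmann estimator for location and the normal-consistent MAD as a simplified stand-in for the Qn and M-scale estimators discussed above; the Phase I data are simulated, and the original study was implemented in MATLAB.

```python
import numpy as np
from itertools import combinations
from scipy.stats import median_abs_deviation

def hodges_lehmann(x):
    """Hodges-Lehmann location estimate: median of the data and all pairwise Walsh averages."""
    x = np.asarray(x, dtype=float)
    walsh = [(a + b) / 2 for a, b in combinations(x, 2)]
    return np.median(np.concatenate([x, walsh]))

# Phase I data: hypothetical rational subgroups of size 5 (rows), with one contaminated subgroup.
rng = np.random.default_rng(1)
phase1 = rng.normal(10.0, 1.0, size=(25, 5))
phase1[7] += 6.0  # an outlying subgroup that would distort the sample mean and standard deviation

subgroup_means = phase1.mean(axis=1)
center = hodges_lehmann(subgroup_means)                            # robust location estimate
sigma_hat = median_abs_deviation(phase1.ravel(), scale="normal")   # pooled robust scale (stand-in for Qn)
n = phase1.shape[1]

ucl = center + 3 * sigma_hat / np.sqrt(n)
lcl = center - 3 * sigma_hat / np.sqrt(n)
print(f"CL={center:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}")
```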
438 Automatic Moderation of Toxic Comments in the Face of Local Language Complexity in Senegal
Authors: Edouard Ngor Sarr, Abel Diatta, Serigne Mor Toure, Ousmane Sall, Lamine Faty
Abstract:
Thanks to Web 2.0, we are witnessing a form of democratization of the spoken word and an exponential increase in the number of users on the web, but also, and above all, the accumulation of a daily flow of content that is becoming, at times, uncontrollable. Added to this is the rise of a violent social fabric characterised by hateful and racist comments, insults, and other content that contravenes social rules and the platforms' terms of use. Consequently, managing and regulating this mass of new content is proving increasingly difficult, requiring substantial human, technical, and technological resources. Without regulation and with the complicity of anonymity, this toxic content can pollute discussions and make these online spaces highly conducive to abuse, which very often has serious consequences for certain internet users, ranging from anxiety, depression, or withdrawal to suicide. The toxicity of a comment is defined as anything that is rude, disrespectful, or likely to cause someone to leave a discussion or to take violent action against a person or a community. Two levels of measures are needed to deal with this deleterious situation. The first measures are being taken by governments through draft laws with a dual objective: (i) to punish the perpetrators of these abuses and (ii) to make online platforms accountable for the mistakes made by their users. The second measure comes from the platforms themselves. By assessing the content left by users, they can set up filters to block and/or delete content or decide to suspend the user in question for good. However, the speed of discussions and the volume of data involved mean that platforms are unable to properly monitor the moderation of content produced by internet users. That is why they use human moderators, either through recruitment or outsourcing. Moderating comments on the web means assessing and monitoring users' comments on online platforms in order to strike the right balance between protection against abuse and users' freedom of expression. It makes it possible to determine which publications and users are allowed to remain online and which are deleted or suspended, how authorised publications are displayed, and what actions accompany content deletions. In this study, we look at the problem of automatic moderation of toxic comments in the face of local African languages, focusing more specifically on social network comments in Senegal. We review the state of the art, highlighting the different approaches, algorithms, and tools for moderating comments. We also study the issues and challenges of moderation in the face of web ecosystems with lesser-known languages, such as local languages.
Keywords: moderation, local languages, Senegal, toxic comments
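As a hedged illustration of what an automatic moderation filter can look like, the sketch below trains a toy character n-gram classifier and flags comments whose predicted toxicity exceeds a threshold. The example comments, labels, and threshold are hypothetical; a real system for Senegalese local languages would require an appropriately labelled corpus in those languages.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled comments (1 = toxic, 0 = acceptable); a real corpus in Wolof, Serer,
# or other local languages would be needed for the setting discussed in this abstract.
comments = ["you are an idiot", "thanks for sharing this",
            "go away, nobody wants you here", "great point, I agree"]
labels = [1, 0, 1, 0]

# Character n-grams are a common fallback when tokenisers for the local language are missing.
model = make_pipeline(TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
                      LogisticRegression(max_iter=1000))
model.fit(comments, labels)

for text in ["nobody wants you", "thanks, great point"]:
    toxic_prob = model.predict_proba([text])[0, 1]
    action = "flag for human moderator" if toxic_prob > 0.5 else "publish"
    print(f"{text!r}: p(toxic)={toxic_prob:.2f} -> {action}")
```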
437 Technological Affordances of a Mobile Fitness Application: A Role of Escapism and Social Outcome Expectation
Authors: Inje Cho
Abstract:
The leading health risks threatening the world today are associated with a modern lifestyle characterized by sedentary behavior, stress, anxiety, and an obesogenic food environment. To counter this alarming trend, the Centers for Disease Control and Prevention has put forward Physical Activity guidelines to bolster physical engagement. Concurrently, the burgeoning of smartphones and mobile applications has been accompanied by a proliferation of fitness applications aimed at invigorating exercise adherence and enabling real-time activity monitoring. Grounded in the uses and gratification theory, this study delves into the technological affordances of mobile fitness applications, discerning the mediating influences of escapism and social outcome expectations on attitudes and exercise intention. The theory explains how individuals employ distinct communication mediums to satiate their exigencies and desires. Technological affordances manifest as attributes of emerging technologies that galvanize personal engagement in physical activities. Features of mobile fitness applications include affordances for goal setting, virtual rewards, peer support, and exercise information. Escapism, denoting the inclination to disengage from normal routines, has emerged as a salient motivator for the consumption of new media. This study postulates that individuals' perceptions of technological affordances within mobile fitness applications can affect escapism and social outcome expectations, potentially influencing attitude and behavior formation. Thus, an integrated model has been developed to empirically examine the interrelationships between technological affordances, escapism, social outcome expectations, and exercise intention. Structural Equation Modelling serves as the methodological tool, and a cohort of 400 Fitbit users shall be enlisted from Prolific, a data collection platform. A sequence of multivariate data analyses will scrutinize both the measurement and hypothesized structural models. By delving into the effects of mobile fitness applications, this study contributes to the growing body of new media studies in sport management. Moreover, the novel integration of the uses and gratification theory and technological affordances, viewed through the prism of escapism, illustrates the dynamics that underlie mobile fitness users' attitudes and behavioral intentions. Therefore, the findings from this study contribute to theoretical understanding and provide pragmatic insights to developers and practitioners in optimizing the impact of mobile fitness applications.
Keywords: technological affordances, uses and gratification, mobile fitness apps, escapism, physical activity
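A simplified, regression-based sketch of the mediation logic in the proposed model is shown below (affordances -> escapism -> exercise intention). It uses simulated observed scores rather than the latent-variable SEM planned in the study, and the coefficients are illustrative assumptions only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated survey scores standing in for the constructs in the hypothesised model;
# the actual study plans latent-variable SEM on 400 Fitbit users, not observed-score regression.
rng = np.random.default_rng(0)
n = 400
affordances = rng.normal(0, 1, n)                                     # perceived technological affordances
escapism = 0.5 * affordances + rng.normal(0, 1, n)                    # mediator
intention = 0.3 * affordances + 0.4 * escapism + rng.normal(0, 1, n)  # exercise intention
df = pd.DataFrame({"affordances": affordances, "escapism": escapism, "intention": intention})

# Simple regression-based mediation check (a-path, b-path, and direct effect).
a_path = smf.ols("escapism ~ affordances", data=df).fit()
b_path = smf.ols("intention ~ escapism + affordances", data=df).fit()
indirect = a_path.params["affordances"] * b_path.params["escapism"]
print(f"a={a_path.params['affordances']:.2f}, b={b_path.params['escapism']:.2f}, "
      f"indirect={indirect:.2f}, direct={b_path.params['affordances']:.2f}")
```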
436 Dry Modifications of PCL/Chitosan/PCL Tissue Scaffolds
Authors: Ozan Ozkan, Hilal Turkoglu Sasmazel
Abstract:
Natural polymers are widely used in tissue engineering applications because of their biocompatibility, biodegradability, and solubility in the physiological medium. On the other hand, synthetic polymers are also widely utilized in tissue engineering applications because they carry no risk of infectious diseases and do not cause an immune system reaction. However, the disadvantages of both polymer types prevent their efficient individual use as tissue scaffolds. Therefore, the idea of using natural and synthetic polymers together as a single 3D hybrid scaffold, which has the advantages of both and the disadvantages of neither, has entered the literature. On the other hand, even though these hybrid structures support cell adhesion and/or proliferation, various surface modification techniques are applied to their surfaces to create topographical changes and to obtain the reactive functional groups required for the immobilization of biomolecules, especially on the surfaces of synthetic polymers, in order to improve cell adhesion and proliferation. The study presented here aimed to improve the surface functionality and topography of layer-by-layer electrospun 3D poly-epsilon-caprolactone/chitosan/poly-epsilon-caprolactone hybrid tissue scaffolds by using atmospheric pressure plasma methods, and thus to improve the cell adhesion and proliferation of these tissue scaffolds. The formation/creation of functional hydroxyl and amine groups and topographical changes on the surfaces of the scaffolds were realized by using two different atmospheric pressure plasma systems (nozzle type and dielectric barrier discharge (DBD) type) operated under different gas media (air, Ar+O2, Ar+N2). The plasma modification time and distance for the nozzle type plasma system, as well as the plasma modification time and the gas flow rate for the DBD type plasma system, were optimized by monitoring the changes in surface hydrophilicity using contact angle measurements. The topographical and chemical characterizations of these modified biomaterials' surfaces were carried out with SEM and ESCA, respectively. The results showed that the atmospheric pressure plasma modifications carried out with both nozzle type plasma and DBD plasma caused topographical and functionality changes on the surfaces of the layer-by-layer electrospun tissue scaffolds. However, the shelf life studies indicated that the hydrophilicity introduced to the surfaces was mainly due to the functionality changes. Therefore, according to the optimized results, samples treated with nozzle type air plasma modification applied for 9 minutes from a distance of 17 cm and Ar+O2 DBD plasma modification applied for 1 minute under a 70 cm3/min O2 flow rate were found to have the highest hydrophilicity compared to pristine samples.
Keywords: biomaterial, chitosan, hybrid, plasma
435 Quantifying Impairments in Whiplash-Associated Disorders and Association with Patient-Reported Outcomes
Authors: Harpa Ragnarsdóttir, Magnús Kjartan Gíslason, Kristín Briem, Guðný Lilja Oddsdóttir
Abstract:
Introduction: Whiplash-Associated Disorder (WAD) is a health problem characterized by motor, neurological, and psychosocial symptoms, stressing the need for a multimodal treatment approach. To achieve an individualized multimodal approach, prognostic factors need to be identified early using validated patient-reported and objective outcome measures. The aim of this study is to demonstrate the degree of association between patient-reported and clinical outcome measures of WAD patients in the subacute phase. Methods: Individuals (n=41) with subacute (≥1, ≤3 months) WAD (I-II), medium to high-risk symptoms, or neck pain rating ≥ 4/10 on the Visual Analog Scale (VAS) were examined. Outcome measures included measurements of movement control (Butterfly test) and cervical active range of motion (cAROM) using the NeckSmart system, which uses an inertial measurement unit (IMU) that connects to a computer. The IMU sensor is placed on the participant's head, and the participant receives visual feedback about the movement of the head. Patient-reported neck disability, pain intensity, general health, self-perceived handicap, central sensitization, and difficulties due to dizziness were measured using questionnaires. Excel and R statistical software were used for statistical analyses. Results: Forty-one participants, 15 males (37%) and 26 females (63%), mean (SD) age 36.8 (±12.7), underwent data collection. Mean amplitude accuracy (AA) (SD) in the Butterfly test for easy, medium, and difficult paths was 2.4 mm (0.9), 4.4 mm (1.8), and 6.8 mm (2.7), respectively. Mean cAROM (SD) for flexion, extension, left, and right rotation was 46.3° (18.5), 48.8° (17.8), 58.2° (14.3), and 58.9° (15.0), respectively. Mean scores on the Neck Disability Index (NDI), VAS, Dizziness Handicap Inventory (DHI), Central Sensitization Inventory (CSI), and 36-Item Short Form Survey RAND version (RAND) were 43% (17.4), 7 (1.7), 37 (25.4), 51 (17.5), and 39.2 (17.7), respectively. Females showed significantly greater deviation in AA compared to males for easy and medium Butterfly paths (p<0.05). A statistically significant moderate to strong positive correlation was found between the DHI and the easy (r=0.6, p=0.05), medium (r=0.5, p=0.05), and difficult (r=0.5, p<0.05) Butterfly paths, between the total RAND score and all cAROMs (r between 0.4-0.7, p≤0.05) except flexion (r=0.4, p=0.7), and between the NDI score and the CSI (r=0.7, p<0.01), VAS (r=0.5, p<0.01), and DHI (r=0.7, p<0.01) scores, respectively. Discussion: All patient-reported and objective measures were found to be outside the reference range. Results suggest females have worse movement control in the neck in the subacute WAD phase. However, no statistical difference based on gender was found in patient-reported measures, suggesting females might have worse movement control than males in general in this phase. The correlation found between the DHI and the Butterfly test can be explained by the fact that the DHI measures proprioceptive symptoms like dizziness and eye movement disorders that can affect the outcome of movement control tests. A correlation was found between the total RAND score and cAROM, suggesting that a reduced range of motion affects quality of life. Significance: The NeckSmart system can detect abnormalities in cAROM, fine movement control, and kinesthesia of the neck. Results suggest females have worse movement control than males.
Results show a moderate to high correlation between several patient-reported and objective measurements.
Keywords: whiplash associated disorders, car-collision, neck, trauma, subacute
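For readers unfamiliar with the statistic reported above, a minimal sketch of the Pearson correlation between a patient-reported score (e.g., DHI) and an objective measure (e.g., Butterfly amplitude accuracy) follows; the paired values are hypothetical placeholders, not study data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired observations: DHI score and Butterfly amplitude accuracy (mm) per participant.
dhi = np.array([12, 30, 44, 58, 20, 66, 38, 50])
butterfly_aa = np.array([2.2, 1.8, 3.4, 2.9, 2.6, 3.6, 2.1, 3.0])

r, p = pearsonr(dhi, butterfly_aa)
print(f"r = {r:.2f}, p = {p:.3f}")  # prints the correlation coefficient and its p-value
```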
434 The Effects of Geographical and Functional Diversity of Collaborators on Quality of Knowledge Generated
Authors: Ajay Das, Sandip Basu
Abstract:
Introduction: There is increasing recognition that diverse streams of knowledge can often be recombined in novel ways to generate new knowledge. However, knowledge recombination theory has not been applied to examine the effects of collaborator diversity on the quality of knowledge such collaborators produce. This is surprising because one would expect that a collaborative team with certain aspects of diversity should be able to recombine process elements related to knowledge development, which are relatively tacit but also complementary because of the collaborators' varying backgrounds. Theory and Hypotheses: We propose to examine two aspects of diversity in the environments of collaborative teams to try to capture such potential recombinations of relatively tacit process knowledge. The first aspect of diversity in team members' environments is geographical. Collaborators with more geographical distance between them (perhaps working in different countries) often have more autonomy in the processes they adopt for knowledge development. In the absence of overt monitoring, such collaborators are likely to adopt differing approaches to knowledge development. The sharing of such varying approaches among collaborators is likely to result in a higher quality of the common collaborative pursuit. The second aspect is diversity in the work backgrounds of team members. Such diversity can also increase the potential for knowledge recombination. For example, if one or more members are from a manufacturing center (versus all of them being from a purely R&D center), such members will provide unique perspectives on the implementation of innovative ideas. Again, knowledge that has been evaluated from these diverse perspectives is likely to be of a higher quality. In addition to the above aspects of environmental diversity among team members, we also plan to examine the extent to which individual collaborators are in different environments from the primary innovation center of their employing firms. Proposed Methods: We will test our model on a sample of firms in the semiconductor industry. Our level of analysis will be individual patents generated by these firms and the teams involved in generating them. Information on the manufacturing activities of our sample firms will be obtained from SEMI, a proprietary database of the semiconductor industry, as well as company 10-K reports. Conclusion: We believe that our results will represent a preliminary attempt to understand how various forms of diversity in collaborative teams impact the knowledge development process. Our dependent variable of knowledge quality is important to study since higher values of this variable can drive not only firm performance but also the broader development of regions and societies through spillover impacts on future innovation. The results of this study will, therefore, inform future research and practice in innovation, geographical location, and vertical integration.
Keywords: innovation, manufacturing strategy, knowledge, diversity
433 Building the Professional Readiness of Graduates from Day One: An Empirical Approach to Curriculum Continuous Improvement
Authors: Fiona Wahr, Sitalakshmi Venkatraman
Abstract:
Industry employers require new graduates to bring with them a range of knowledge, skills, and abilities that mean these new employees can immediately make valuable work contributions. These will be a combination of discipline and professional knowledge, skills, and abilities which give graduates the technical capabilities to solve practical problems whilst interacting with a range of stakeholders. Underpinning the development of these discipline and professional knowledge, skills, and abilities are "enabling" knowledge, skills, and abilities which assist students to engage in learning. These are academic and learning skills which are essential as common starting points for the learning process of students entering the course, as well as forming the foundation for the fully developed graduate knowledge, skills, and abilities. This paper reports on a project created to introduce and strengthen these enabling skills in the first semester of a Bachelor of Information Technology degree in an Australian polytechnic. The project uses an action research approach in the context of ongoing continuous improvement for the course to enhance the overall learning experience, learning sequencing, graduate outcomes, and, most importantly, in the first semester, student engagement and retention. The focus of this is implementing the new curriculum in first-semester subjects of the course with the aim of developing the "enabling" learning skills, such as literacy-, research-, and numeracy-based knowledge, skills, and abilities (KSAs). The approach used for the introduction and embedding of these KSAs (as both enablers of learning and to underpin graduate attribute development) is presented. Building on previous publications which reported different aspects of this longitudinal study, this paper recaps the rationale for the curriculum redevelopment and then presents the quantitative findings on entering students' reading literacy and numeracy knowledge and skill levels, as well as their perceived research ability. The paper presents the methodology and findings for this stage of the research. Overall, the cohort exhibits mixed KSA levels in these areas, with a relatively low aggregated score. In addition, the paper describes the considerations for adjusting the design and delivery of the new subjects with a targeted learning experience, in response to the feedback gained through continuous monitoring. Such a strategy is aimed at accommodating the changing learning needs of the students and serves to support them towards achieving the enabling learning goals starting from day one of their higher education studies.
Keywords: enabling skills, student retention, embedded learning support, continuous improvement
432 Coupling of Microfluidic Droplet Systems with ESI-MS Detection for Reaction Optimization
Authors: Julia R. Beulig, Stefan Ohla, Detlev Belder
Abstract:
In contrast to off-line analytical methods, lab-on-a-chip technology delivers direct information about the observed reaction. Therefore, microfluidic devices make an important scientific contribution, e.g., in the field of synthetic chemistry. Herein, the rapid generation of analytical data can be applied to the optimization of chemical reactions. These microfluidic devices enable a fast change of reaction conditions as well as a resource-saving method of operation. In the presented work, we focus on the investigation of multiphase regimes, more specifically on biphasic microfluidic droplet systems. Here, every single droplet is a reaction container with customized conditions. The biggest challenge is the rapid qualitative and quantitative readout of information, as most detection techniques for droplet systems are non-specific, time-consuming, or too slow. An exception is electrospray ionization mass spectrometry (ESI-MS). The combination of a reaction screening platform with a rapid and specific detection method is an important step in droplet-based microfluidics. In this work, we present a novel approach for synthesis optimization on the nanoliter scale with direct ESI-MS detection. The development of a droplet-based microfluidic device, which enables the modification of different parameters while simultaneously monitoring the effect on the reaction within a single run, is shown. Using common soft- and photolithographic techniques, a polydimethylsiloxane (PDMS) microfluidic chip with different functionalities is developed. As an interface for the MS detection, we use a steel capillary for ESI and improve the spray stability with a Teflon siphon tubing, which is inserted underneath the steel capillary. By optimizing the flow rates, it is possible to screen parameters of various reactions; this is exemplarily shown for a domino Knoevenagel hetero-Diels-Alder reaction. Different starting materials, catalyst concentrations, and solvent compositions are investigated. Due to the high repetition rate of the droplet production, each set of reaction conditions is examined hundreds of times. As a result of the investigation, we obtain suitable reagents, the ideal water-methanol ratio of the solvent, and the most effective catalyst concentration. The developed system can help to determine important information about the optimal parameters of a reaction within a short time. With this novel tool, we make an important step in the field of combining droplet-based microfluidics with organic reaction screening.
Keywords: droplet, mass spectrometry, microfluidics, organic reaction, screening
431 Linking Soil Spectral Behavior and Moisture Content for Soil Moisture Content Retrieval at Field Scale
Authors: Yonwaba Atyosi, Moses Cho, Abel Ramoelo, Nobuhle Majozi, Cecilia Masemola, Yoliswa Mkhize
Abstract:
Spectroscopy has been widely used to understand the hyperspectral remote sensing of soils. Accurate and efficient measurement of soil moisture is essential for precision agriculture. The aim of this study was to understand the spectral behavior of soil at different soil water content levels and identify the significant spectral bands for soil moisture content retrieval at field scale. The study consisted of 60 soil samples from a maize farm, divided into four different treatments representing different moisture levels. Spectral signatures were measured for each sample in the laboratory under artificial light using an Analytical Spectral Devices (ASD) spectrometer, covering a wavelength range from 350 nm to 2500 nm with a spectral resolution of 1 nm. The results showed that the absorption features at 1450 nm, 1900 nm, and 2200 nm were particularly sensitive to soil moisture content and exhibited strong correlations with the water content levels. A continuum removal routine was developed in the R programming language to enhance the absorption features of soil moisture and to precisely understand its spectral behavior at different water content levels. Statistical analyses using partial least squares regression (PLSR) models were performed to quantify the correlation between the spectral bands and soil moisture content. This study provides insights into the spectral behavior of soil at different water content levels and identifies the significant spectral bands for soil moisture content retrieval. The findings highlight the potential of spectroscopy for non-destructive and rapid soil moisture measurement, which can be applied to various fields such as precision agriculture, hydrology, and environmental monitoring. However, it is important to note that the spectral behavior of soil can be influenced by various factors such as soil type, texture, and organic matter content, and caution should be taken when applying the results to other soil systems. The results of this study showed good agreement between measured and predicted values of soil moisture content, with high R2 and low root mean square error (RMSE) values. Model validation using independent data was satisfactory for all the studied soil samples. The results have significant implications for developing high-resolution and precise field-scale soil moisture retrieval models. These models can be used to understand the spatial and temporal variation of soil moisture content in agricultural fields, which is essential for managing irrigation and optimizing crop yield.
Keywords: soil moisture content retrieval, precision agriculture, continuum removal, remote sensing, machine learning, spectroscopy
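A minimal sketch of the PLSR step described above is given below, fitted on synthetic spectra rather than the study's ASD measurements; the number of latent components and the simulated 1900 nm absorption feature are assumptions for illustration only (the study's continuum removal was implemented in R).

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# Synthetic stand-in data: 60 soil spectra (350-2500 nm at 1 nm) and their moisture contents.
rng = np.random.default_rng(42)
wavelengths = np.arange(350, 2501)
moisture = rng.uniform(5, 35, size=60)  # percent water content
# Crude simulation of a deeper 1900 nm water absorption feature with increasing moisture, plus noise.
spectra = 0.5 - 0.004 * moisture[:, None] * np.exp(-((wavelengths - 1900) / 120.0) ** 2)
spectra += rng.normal(0, 0.002, spectra.shape)

X_train, X_test, y_train, y_test = train_test_split(spectra, moisture, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)
pred = pls.predict(X_test).ravel()

rmse = np.sqrt(mean_squared_error(y_test, pred))
print(f"R2 = {r2_score(y_test, pred):.2f}, RMSE = {rmse:.2f} %")
```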
430 Hyperspectral Imagery for Tree Speciation and Carbon Mass Estimates
Authors: Jennifer Buz, Alvin Spivey
Abstract:
The most common greenhouse gas emitted through human activities, carbon dioxide (CO2), is naturally consumed by plants during photosynthesis. This process is actively being monetized by companies wishing to offset their carbon dioxide emissions. For example, companies are now able to purchase protections for vegetated land due to be clear-cut or purchase barren land for reforestation. Therefore, by actively preventing the destruction/decay of plant matter or by introducing more plant matter (reforestation), a company can theoretically offset some of its emissions. One of the biggest issues in the carbon credit market is validating and verifying carbon offsets. There is a need for a system that can accurately and frequently ensure that the areas sold for carbon credits have the vegetation mass (and therefore the carbon offset capability) they claim. Traditional techniques for measuring vegetation mass and determining health are costly and require many person-hours. Orbital Sidekick offers an alternative approach that accurately quantifies carbon mass and assesses vegetation health through satellite hyperspectral imagery, a technique which enables us to remotely identify material composition (including plant species) and condition (e.g., health and growth stage). How much carbon a plant is capable of storing is ultimately tied to many factors, including material density (primarily species-dependent), plant size, and health (trees that are actively decaying are not effectively storing carbon). All of these factors are capable of being observed through satellite hyperspectral imagery. This abstract focuses on speciation. To build a species classification model, we matched pixels in our remote sensing imagery to plants on the ground for which we know the species. To accomplish this, we collaborated with the researchers at the Teakettle Experimental Forest. Our remote sensing data come from our airborne "Kato" sensor, which flew over the study area and acquired hyperspectral imagery (400-2500 nm, 472 bands) at ~0.5 m/pixel resolution. Coverage of the entire Teakettle Experimental Forest required capturing dozens of individual hyperspectral images. In order to combine these images into a mosaic, we accounted for potential variations in atmospheric conditions throughout the data collection. To do this, we ran an open-source atmospheric correction routine called ISOFIT (Imaging Spectrometer Optimal FITting), which converted all of our remote sensing data from radiance to reflectance. A database of reflectance spectra for each of the tree species within the study area was acquired using the Teakettle stem map and the geo-referenced hyperspectral images. We found that a wide variety of machine learning classifiers were able to identify the species within our images with high (>95%) accuracy. For the most robust quantification of carbon mass and the best assessment of the health of a vegetated area, speciation is critical. Through the use of high-resolution hyperspectral data, ground-truth databases, and complex analytical techniques, we are able to determine the species present within a pixel to a high degree of accuracy. These species identifications will feed directly into our carbon mass model.
Keywords: hyperspectral, satellite, carbon, imagery, python, machine learning, speciation
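A hedged sketch of the pixel-level species classification step follows: training a random forest on labelled reflectance spectra and reporting overall accuracy. The spectra, class labels, and resulting accuracy are synthetic placeholders, not Teakettle data or the specific classifiers benchmarked in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in: 472-band reflectance spectra for pixels matched to stem-mapped trees.
rng = np.random.default_rng(7)
species = ["white fir", "sugar pine", "incense cedar"]  # hypothetical class labels
n_per_class, n_bands = 200, 472
X = np.vstack([rng.normal(loc=0.2 + 0.05 * i, scale=0.02, size=(n_per_class, n_bands))
               for i in range(len(species))])
y = np.repeat(species, n_per_class)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
print(f"overall accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
```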
429 Discourse Analysis: Where Cognition Meets Communication
Authors: Iryna Biskub
Abstract:
The interdisciplinary approach to modern linguistic studies is exemplified by the merging of various research methods, which sometimes causes complications related to the verification of the research results. This methodological confusion can be resolved by creating new techniques of linguistic analysis that combine several scientific paradigms. Modern linguistics has developed truly productive and efficient methods for the investigation of cognitive and communicative phenomena, of which language is the central issue. In the field of discourse studies, one of the best examples of research methods is the method of Critical Discourse Analysis (CDA). CDA can be viewed both as a method of investigation and as a critical multidisciplinary perspective. In CDA, the position of the scholar is crucial, as it exemplifies his or her social and political convictions. The generally accepted approach to obtaining scientifically reliable results is to use a special, well-defined scientific method for researching specific types of language phenomena: cognitive methods are applied to the exploration of cognitive aspects of language, whereas communicative methods are thought to be relevant only for the investigation of the communicative nature of language. In recent decades, discourse as a sociocultural phenomenon has been the focus of careful linguistic research. The very concept of discourse represents an integral unity of the cognitive and communicative aspects of human verbal activity. Since a human being is never able to discriminate between the cognitive and communicative planes of discourse communication, it does not make much sense to apply cognitive and communicative methods of research taken in isolation. It is possible to modify the classical CDA procedure by mapping human cognitive procedures onto the strategic communicative planning of discourse communication. The analysis of the electronic petition 'Block Donald J Trump from UK entry. The signatories believe Donald J Trump should be banned from UK entry' (584,459 signatures) and the parliamentary debates on it has demonstrated the ability to map cognitive and communicative levels in the following way: the strategy of discourse modeling (communicative level) overlaps with the extraction of semantic macrostructures (cognitive level); the strategy of discourse management overlaps with the analysis of local meanings in discourse communication; the strategy of cognitive monitoring of the discourse overlaps with the formation of attitudes and ideologies at the cognitive level. Thus, the experimental data have shown that it is possible to develop a new complex methodology of discourse analysis, where cognition would meet communication, both metaphorically and literally. The same approach may prove productive for the creation of computational models of human-computer interaction, where the automatic generation of a particular type of discourse could be based on the rules of strategic planning involving cognitive models of CDA.
Keywords: cognition, communication, discourse, strategy
428 Spatial and Temporal Analysis of Forest Cover Change with Special Reference to Anthropogenic Activities in Kullu Valley, North-Western Indian Himalayan Region
Authors: Krisala Joshi, Sayanta Ghosh, Renu Lata, Jagdish C. Kuniyal
Abstract:
Throughout the world, monitoring and estimating the changing pattern of forests across diverse landscapes through remote sensing is instrumental in understanding the interactions of human activities and the ecological environment with the changing climate. Forest change detection using satellite imagery has emerged as an important means to gather information on a regional scale. Kullu valley in Himachal Pradesh, India, is situated in a transitional zone between the Lesser and the Greater Himalayas. Thus, it presents a typically rugged mountainous terrain with moderate to high altitude, varying from 1200 meters to over 6000 meters. Due to changes in agricultural cropping patterns, urbanization, industrialization, hydropower generation, climate change, tourism, and anthropogenic forest fires, it has undergone a tremendous transformation in forest cover in the past three decades. The loss and degradation of forest cover result in soil erosion, loss of biodiversity including damage to wildlife habitats, degradation of watershed areas, and deterioration of the overall quality of nature and life. Supervised classification of LANDSAT satellite data was performed to assess the changes in forest cover in Kullu valley over the years 2000 to 2020. The Normalized Burn Ratio (NBR) was calculated to discriminate between burned and unburned areas of the forest. Our study reveals that in Kullu valley, forest fire incidents, specifically those due to anthropogenic activities, have been on the rise each subsequent year. The main objective of the present study is, therefore, to estimate the change in the forest cover of Kullu valley, to address the various social aspects responsible for the anthropogenic forest fires, and to assess their impact on the significant changes in regional climatic factors, specifically temperature, humidity, and precipitation, over three decades, with the help of satellite imagery and ground data. The main outcome of the paper, we believe, will help the administration make a quantitative assessment of forest cover changes due to anthropogenic activities and devise long-term measures for creating awareness among the local people of the area.
Keywords: Anthropogenic Activities, Forest Change Detection, Normalized Burn Ratio (NBR), Supervised Classification
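A minimal sketch of the NBR calculation and a simple pre/post-fire differencing (dNBR) is given below; the reflectance values and the burn threshold are illustrative assumptions, and the band assignments refer to Landsat 8 (band 5 = NIR, band 7 = SWIR2).

```python
import numpy as np

def normalized_burn_ratio(nir, swir):
    """NBR = (NIR - SWIR) / (NIR + SWIR), computed per pixel on reflectance arrays."""
    nir = nir.astype(float)
    swir = swir.astype(float)
    return (nir - swir) / (nir + swir + 1e-10)  # small constant avoids division by zero

# Hypothetical pre- and post-fire Landsat reflectance tiles.
nir_pre,  swir_pre  = np.array([[0.35, 0.40], [0.38, 0.36]]), np.array([[0.12, 0.15], [0.14, 0.13]])
nir_post, swir_post = np.array([[0.18, 0.39], [0.20, 0.35]]), np.array([[0.25, 0.16], [0.24, 0.14]])

dnbr = normalized_burn_ratio(nir_pre, swir_pre) - normalized_burn_ratio(nir_post, swir_post)
burned = dnbr > 0.27  # a commonly cited moderate-severity threshold; site-specific calibration is advised
print(dnbr.round(2), burned, sep="\n")
```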
427 Pill-Box Dispenser as a Strategy for Therapeutic Management: A Qualitative Evaluation
Authors: Bruno R. Mendes, Francisco J. Caldeira, Rita S. Luís
Abstract:
Population ageing is directly correlated with an increase in medicine consumption. Beyond the latter and the polymedicated profile of the elderly, it is possible to see a need for pharmacotherapeutic monitoring due to cognitive and physical impairment. In this sense, the tracking, organization, and administration of medicines become a daily challenge, and the pill-box dispenser system a solution. The pill-box dispenser (system) consists of a small compartmentalized container for unit-dose organization, that is, a container able to correlate the patient's prescribed dose regimen with the time schedule of intake. In many European countries, this system is part of the pharmacist's role in clinical pharmacy. Despite this simple solution, therapy compliance is only possible if the patient adheres to the system, so it is important to establish a qualitative and quantitative analysis of the patient's perception of the benefits and risks of the pill-box dispenser, as well as the identification of the ideal system. The analysis was conducted through an observational study, based on the application of a standardized questionnaire structured with a 5-level Likert scale and previously validated on the population. The study was performed during a limited period of time on a randomized sample of 188 participants. The questionnaire consisted of 22 questions: 6 background measures and 16 specific measures. The standards for the final comparative analysis were obtained from the state of the art on the subject. The study carried out using the Likert scale afforded a degree of agreement and discordance between measures (Sample vs. Standard) of 56.25% and 43.75%, respectively. It was concluded that the pill-box dispenser has greater acceptance among a younger population, which was not the initial target of the system. However, this points to high adherence in the future. Additionally, it was noted that the cost associated with this service is not a limiting factor for its use. The pill-box dispenser system, as currently implemented, demonstrates an important weakness regarding the quality and effectiveness of the medicines, which is not understood by the patient, revealing a significant lack of literacy where medicines are concerned. The characteristics of an ideal system remain unchanged, which means that the size, appearance, and availability of information in the pill-box continue to be indispensable elements for compliance with the system. The pill-box dispenser remains unsuitable regarding container size and the type of treatment to which it applies. Despite that, it might be a future standard for clinical pharmacy, allowing a differentiation of the pharmacist's role, as well as a wider range of applications to other age groups and treatments.
Keywords: clinical pharmacy, medicines, patient safety, pill-box dispenser
426 Coil-Over Shock Absorbers Compared to Inherent Material Damping
Authors: Carina Emminger, Umut D. Cakmak, Evrim Burkut, Rene Preuer, Ingrid Graz, Zoltan Major
Abstract:
Damping accompanies us in everyday life and is used to protect (e.g., in shoes) and to make our life more comfortable (damping of unwanted motion) and calm (noise reduction). In general, damping is the absorption of energy, which is either stored in the material (vibration isolation systems) or changed into heat (vibration absorbers). In the case of the latter, the damping mechanism can be split into active, passive, and semi-active (a combination of active and passive). Active damping is required to enable almost perfect damping over the whole application range and is used, for instance, in sports cars. In contrast, passive damping is a response of the material to external loading. Consequently, the material composition has a huge influence on the damping behavior. For elastomers, the material behavior is inherently viscoelastic and both temperature- and frequency-dependent. However, passive damping is not adjustable during application. Therefore, it is important to understand the fundamental viscoelastic behavior and the dissipation capability under external loading. The objective of this work is to assess the limitations and applicability of viscoelastic material damping for applications in which coil-over shock absorbers are currently utilized. Coil-over shock absorbers are usually made of various mechanical parts and incorporate fluids within the damper. These shock absorbers are well known and studied in the industry, and, when needed, they can be easily adjusted during their product lifetime. In contrast, dampers made of, ideally, a single material are more resource-efficient, are easier to service, and are easier to manufacture. However, they lack adaptability and adjustability in service. Therefore, a case study with a remote-controlled sports car was conducted. The original shock absorbers were redesigned, and the spring-dashpot system was replaced by an elastomer and a thermoplastic elastomer, respectively. Here, five different formulations of elastomers were used, including a pure and an iron-particle-filled thermoplastic poly(urethane) (TPU) and blends of two different poly(dimethyl siloxane)s (PDMS). In addition, the TPUs were investigated as full and hollow dampers to examine the difference between solid and structured material. To obtain comparative results, each material formulation was comprehensively characterized by monotonic uniaxial compression tests, dynamic thermomechanical analysis (DTMA), and rebound resilience tests. Moreover, the new material-based shock absorbers were compared with spring-dashpot shock absorbers. The shock absorbers were analyzed under monotonic and cyclic loading. In addition, an impact loading was applied to the remote-controlled car to measure the damping properties in operation. A servo-hydraulic high-speed linear actuator was utilized to apply the loads. The acceleration of the car and the displacement of specific measurement points were recorded during testing by a sensor and a high-speed camera, respectively. The results show that elastomers are suitable for damping applications, but they are temperature- and frequency-dependent. This limits the applicability of viscous material dampers. Feasible fields of application may lie in micromobility, e.g., bicycles, e-scooters, and e-skateboards. Furthermore, viscous material damping could be used to increase the inherent damping of a whole structure, e.g., in bicycle frames.
Keywords: damper structures, material damping, PDMS, TPU
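As a small worked example of how inherent material damping is quantified from DTMA data, the sketch below computes the loss factor tan(delta) = E''/E' over a frequency sweep; the modulus values are hypothetical, not measurements of the TPU or PDMS formulations studied.

```python
import numpy as np

# Hypothetical DTMA data for one elastomer formulation: storage (E') and loss (E'') moduli in MPa
# over a frequency sweep; real values depend strongly on the TPU/PDMS grade and on temperature.
frequency_hz = np.array([0.1, 1.0, 10.0, 100.0])
storage_modulus = np.array([18.0, 22.0, 27.0, 35.0])
loss_modulus = np.array([2.2, 3.5, 5.4, 9.1])

tan_delta = loss_modulus / storage_modulus  # material loss factor, the measure of inherent damping
for f, td in zip(frequency_hz, tan_delta):
    print(f"{f:>6.1f} Hz: tan(delta) = {td:.3f}")
```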
Procedia PDF Downloads 115
425 A Comprehensive Study on Freshwater Aquatic Life Health Quality Assessment Using Physicochemical Parameters and Planktons as Bio Indicator in a Selected Region of Mahaweli River in Kandy District, Sri Lanka
Authors: S. M. D. Y. S. A. Wijayarathna, A. C. A. Jayasundera
Abstract:
The Mahaweli River is the longest and largest river in Sri Lanka, and it is the major drinking water source for a large portion of the 2.5 million inhabitants of the Central Province. The aim of this study was to determine the water quality and aquatic life health quality in a selected region of the Mahaweli River. Six sampling locations (Site 1: 7° 16' 50" N, 80° 40' 00" E; Site 2: 7° 16' 34" N, 80° 40' 27" E; Site 3: 7° 16' 15" N, 80° 41' 28" E; Site 4: 7° 14' 06" N, 80° 44' 36" E; Site 5: 7° 14' 18" N, 80° 44' 39" E; Site 6: 7° 13' 32" N, 80° 46' 11" E) subject to various anthropogenic activities on the river bank were selected and sampled over a period of three months between Tennekumbura Bridge and the Victoria Reservoir. Temperature, pH, Electrical Conductivity (EC), Total Dissolved Solids (TDS), Dissolved Oxygen (DO), 5-day Biological Oxygen Demand (BOD5), Total Suspended Solids (TSS), hardness, anion concentrations, and metal concentrations were measured as physicochemical parameters according to standard methods. Plankton were considered as the biological parameter. Using a plankton net (20 µm mesh size), surface water samples were collected into acid-washed, dried vials and stored in an ice box during transportation. Plankton diversity and abundance were determined under the light microscope within 4 days of sample collection using standard plankton identification manuals. Almost all the measured physicochemical parameters were within the CEA standard limits for aquatic life, the Sri Lanka Standards (SLS), or the World Health Organization's guidelines for drinking water. The orthophosphate concentration ranged from 0.232 to 0.708 mg L-1 and exceeded the CEA standard limit for aquatic life (0.400 mg L-1) at Site 1 and Site 2, where disturbance from cultivation and nearby households is high. Pearson correlation analysis (significant at p < 0.05) showed that some physicochemical parameters (temperature, DO, TDS, TSS, phosphate, sulphate, chloride, fluoride, and sodium) were significantly correlated with the distribution of plankton taxa such as Aulacoseira, Navicula, Synedra, Pediastrum, Fragilaria, Selenastrum, Oscillatoria, Tribonema, and Microcystis. Furthermore, taxa that indicate blooms (Aulacoseira), organic pollution (Navicula), and phosphate-rich eutrophic water (Microcystis) were found, indicating deteriorating water quality in the Mahaweli River due to agricultural activities, solid waste disposal, and the release of domestic effluents. Therefore, it is necessary to improve environmental monitoring and management to control the further deterioration of the river's water quality.
Keywords: bio indicator, environmental variables, planktons, physicochemical parameters, water quality
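As an illustration of the correlation screening described above, the short sketch below tests hypothetical site-level measurements against the abundance of one indicator taxon using SciPy's Pearson correlation at alpha = 0.05. The variable names and values are invented for the example and are not the study's data.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical records for six sampling sites (illustrative values only).
df = pd.DataFrame({
    "phosphate_mg_L":    [0.23, 0.71, 0.45, 0.30, 0.52, 0.61],
    "DO_mg_L":           [7.8, 6.1, 6.9, 7.2, 6.4, 6.0],
    "TDS_mg_L":          [48, 95, 70, 55, 82, 90],
    "Microcystis_count": [120, 880, 410, 190, 600, 770],
})

# Test each physicochemical parameter against the indicator taxon at alpha = 0.05.
for col in ["phosphate_mg_L", "DO_mg_L", "TDS_mg_L"]:
    r, p = pearsonr(df[col], df["Microcystis_count"])
    flag = "significant" if p < 0.05 else "not significant"
    print(f"{col:16s} r = {r:+.2f}, p = {p:.3f} ({flag})")
```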
Procedia PDF Downloads 106
424 Safety Tolerance Zone for Driver-Vehicle-Environment Interactions under Challenging Conditions
Authors: Matjaž Šraml, Marko Renčelj, Tomaž Tollazzi, Chiara Gruden
Abstract:
Road safety is a worldwide issue with numerous and heterogeneous factors influencing it. On one side, the driver state, comprising distraction/inattention, fatigue, drowsiness, extreme emotions, and socio-cultural factors, highly affects road safety. On the other side, the vehicle state plays an important role in mitigating (or not) road risk. Finally, the road environment is still one of the main determinants of road safety, defining driving task complexity. At the same time, thanks to technological development, a lot of detailed data is easily available, creating opportunities for the detection of driver state, vehicle characteristics, and road conditions and, consequently, for the design of ad hoc interventions aimed at improving driver performance, increasing awareness, and mitigating road risks. This is the challenge faced by the i-DREAMS project. i-DREAMS, which stands for a smart Driver and Road Environment Assessment and Monitoring System, is a 3-year project funded by the European Union's Horizon 2020 research and innovation programme. It aims to set up a platform to define, develop, test, and validate a 'Safety Tolerance Zone' to prevent drivers from getting too close to the boundaries of unsafe operation by mitigating risks in real time and after the trip. After the Safety Tolerance Zone concept was defined, developed, and implemented in an advanced driver-assistance system (ADAS) platform, the system was first tested for 2 months in a driving simulator environment in 5 different countries. After that, naturalistic driving studies started for a 10-month period (comprising a 1-month pilot study, a 3-month baseline study, and a 6-month intervention study). Currently, the project team has approved a common evaluation approach and is assessing the usage and outcomes of the i-DREAMS system, which is yielding positive insights. The i-DREAMS consortium consists of 13 partners: 7 engineering universities and research groups, 4 industry partners, and 2 organizations closely linked to transport safety stakeholders (the European Transport Safety Council, ETSC, and POLIS, the network of cities and regions for transport innovation), covering 8 different countries altogether.
Keywords: advanced driver assistant systems, driving simulator, safety tolerance zone, traffic safety
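To illustrate how real-time indicators might be fused into a tolerance-zone decision, the sketch below scores one trip sample and maps it to one of three intervention levels. The indicator set, thresholds, and phase names are illustrative assumptions of this sketch, not the calibrated logic of the i-DREAMS platform.

```python
from dataclasses import dataclass

@dataclass
class TripSample:
    time_headway_s: float        # gap to the lead vehicle, in seconds
    fatigue_score: float         # 0 (alert) .. 1 (drowsy), e.g. from wearable/camera
    speed_over_limit_kmh: float  # speeding relative to the posted limit

def safety_tolerance_zone(s: TripSample) -> str:
    """Illustrative three-level zone classification; thresholds are assumptions."""
    risk = 0
    if s.time_headway_s < 1.0:
        risk += 2
    elif s.time_headway_s < 2.0:
        risk += 1
    if s.fatigue_score > 0.7:
        risk += 2
    elif s.fatigue_score > 0.4:
        risk += 1
    if s.speed_over_limit_kmh > 10:
        risk += 1
    if risk >= 4:
        return "avoidable-accident phase: immediate in-vehicle warning"
    if risk >= 2:
        return "danger phase: gentle real-time nudge"
    return "normal driving: log for post-trip feedback"

print(safety_tolerance_zone(TripSample(0.8, 0.75, 12)))
```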
Procedia PDF Downloads 68
423 Remote Sensing-Based Prediction of Asymptomatic Rice Blast Disease Using Hyperspectral Spectroradiometry and Spectral Sensitivity Analysis
Authors: Selvaprakash Ramalingam, Rabi N. Sahoo, Dharmendra Saraswat, A. Kumar, Rajeev Ranjan, Joydeep Mukerjee, Viswanathan Chinnasamy, K. K. Chaturvedi, Sanjeev Kumar
Abstract:
Rice is one of the most important staple food crops in the world. Among the various diseases that affect rice crops, rice blast is particularly significant, causing yield and economic losses. While the plant has defense mechanisms in place, such as chemical indicators (proteins, salicylic acid, jasmonic acid, ethylene, and azelaic acid) and resistance genes in certain varieties that can protect against disease, susceptible varieties remain vulnerable to this fungal disease. Early prediction of rice blast (RB) disease is crucial, but conventional techniques for early prediction are time-consuming and labor-intensive. Hyperspectral remote sensing techniques hold the potential to predict RB disease at its asymptomatic stage. In this study, we aimed to demonstrate the prediction of RB disease at the asymptomatic stage using a non-imaging hyperspectral ASD spectroradiometer under controlled laboratory conditions. We applied statistical spectral discrimination theory to identify unknown spectra of M. oryzae, the fungus responsible for rice blast disease. The infrared (IR) region was found to be significantly affected by RB disease, as infection alters the absorption, reflection, or emission of infrared radiation by the affected plant tissues. Our research revealed that the protein spectrum in the IR region is impacted by RB disease. We identified strong correlations (amide group I) around 1064 nm (X) and 1300 nm (Y) using the lambda-by-lambda derived-spectra method for protein detection. During the stages when the disease is developing, typically from day 3 to day 5, the plant's defense mechanisms are not as effective. This is especially true for the PB-1 variety of rice, which is highly susceptible to rice blast disease. Consequently, the proteins in the plant are adversely affected during this critical period. The spectral contour plot reveals the highly correlated spectral regions around 1064 nm and 1300 nm associated with RB infection. Based on these spectral sensitivities, we developed new spectral disease indices for predicting different stages of disease emergence. The goal of this research is to lay the foundation for future UAV and satellite-based studies aimed at long-term monitoring of RB disease.
Keywords: rice blast, asymptomatic stage, spectral sensitivity, IR
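A minimal sketch of how a two-band spectral disease index could be computed from spectroradiometer reflectance is given below. The 1064 nm / 1300 nm pairing follows the correlation regions reported above, but the normalized-difference form of the index and the synthetic healthy and infected spectra are assumptions made only for illustration.

```python
import numpy as np

def reflectance_at(wavelengths_nm, reflectance, target_nm):
    """Nearest-band reflectance from a 350-2500 nm spectroradiometer sweep."""
    return reflectance[np.argmin(np.abs(wavelengths_nm - target_nm))]

def normalized_difference_index(wl, refl, band_a=1064, band_b=1300):
    """Generic two-band normalized difference; the band pairing follows the
    reported correlation regions, the index form itself is an assumption."""
    ra = reflectance_at(wl, refl, band_a)
    rb = reflectance_at(wl, refl, band_b)
    return (rb - ra) / (rb + ra)

# Hypothetical canopy spectra at 1 nm steps: healthy vs. day-3 asymptomatic plant.
wl = np.arange(350, 2501)
healthy = 0.3 + 0.1 * np.exp(-((wl - 1100) / 400.0) ** 2)
infected = healthy * (1 - 0.05 * np.exp(-((wl - 1200) / 150.0) ** 2))

print("healthy :", round(normalized_difference_index(wl, healthy), 4))
print("day 3   :", round(normalized_difference_index(wl, infected), 4))
```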
Procedia PDF Downloads 87
422 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine
Authors: D. Madhushanka, Y. Liu, H. C. Fernando
Abstract:
Fires are one of the foremost factors of land surface disturbance in diverse ecosystems, causing soil erosion, land-cover changes, and atmospheric effects that affect people's lives and property. Generally, fire severity is assessed using the Normalized Burn Ratio (NBR) index. This is usually performed manually by comparing pre-fire and post-fire images: the dNBR is calculated as the bitemporal difference of the NBR values of the preprocessed satellite images. The area is then classified as either unburnt (dNBR < 0.1) or burnt (dNBR >= 0.1). Furthermore, Wildfire Severity Assessment (WSA) classifies burnt and unburnt areas using classification levels proposed by the USGS and comprises seven classes. This procedure generates a burn severity report for an area chosen manually by the user. This study was carried out with the objective of producing an automated tool for the above-mentioned process, namely the World Wildfire Severity Assessment Tool (WWSAT). It is implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI was chosen to support regular burn area severity mapping with a medium-spatial-resolution sensor (15 m). The tool uses machine learning classification techniques to identify burnt areas using NBR and to classify their severity over the user-selected extent and period automatically. Cloud coverage is one of the biggest concerns when fire severity mapping is performed. In the GEE-based WWSAT, we present a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire image compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool. The tool includes a Graphical User Interface (GUI) to make it user-friendly. Its advantage is the ability to obtain burn area severity over large extents and extended temporal periods. Two case studies were carried out to demonstrate the performance of the tool. The Blue Mountains National Park forest, affected by the 2019-2020 Australian fire season, is used to describe the workflow of the WWSAT. At this site, more than 7,809 km2 of burnt area was detected using Sentinel-2 data, giving an error below 6.5% when compared with the area detected in the field. Furthermore, 86.77% of the detected area was recognized as fully burnt, of which 17.29% was high severity, 19.63% moderate-high severity, 22.35% moderate-low severity, and 27.51% low severity. The Arapaho and Roosevelt National Forest Park, California, USA, which was affected by the Cameron Peak fire in 2020, was chosen for the second case study. It was found that around 983 km2 had burnt, of which 2.73% was high severity, 1.57% moderate-high severity, 1.18% moderate-low severity, and 5.45% low severity. These areas can also be detected through visual inspection of the cloud-free images generated by WWSAT. The tool is cost-effective in calculating the burnt area since satellite images are free and the cost of field surveys is avoided.
Keywords: burnt area, burnt severity, fires, google earth engine (GEE), sentinel-2
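The core of this workflow can be sketched with the GEE Python API as below: build low-cloud median composites before and after the fire, compute NBR from the Sentinel-2 NIR and SWIR bands (B8, B12), difference them, and threshold the dNBR. The dataset identifier, area of interest, dates, and severity breaks are illustrative assumptions, not WWSAT's actual configuration.

```python
import ee

ee.Initialize()

def cloud_free_composite(aoi, start, end, max_cloud=20):
    """Median composite of low-cloud Sentinel-2 surface-reflectance scenes."""
    return (ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED')  # dataset id assumed
            .filterBounds(aoi)
            .filterDate(start, end)
            .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', max_cloud))
            .median())

def nbr(img):
    # NBR = (NIR - SWIR2) / (NIR + SWIR2), i.e. bands B8 and B12 for Sentinel-2 MSI.
    return img.normalizedDifference(['B8', 'B12']).rename('NBR')

aoi = ee.Geometry.Rectangle([150.2, -33.9, 150.7, -33.4])  # illustrative extent only
pre_fire = nbr(cloud_free_composite(aoi, '2019-09-01', '2019-10-15'))
post_fire = nbr(cloud_free_composite(aoi, '2020-02-01', '2020-03-15'))

dnbr = pre_fire.subtract(post_fire).rename('dNBR')
burnt_mask = dnbr.gte(0.1)  # burnt / unburnt split used in the abstract

# Coarse severity classes from dNBR thresholds (illustrative USGS-style breaks).
severity = (ee.Image.constant(0)
            .where(dnbr.gte(0.10), 1)   # low severity
            .where(dnbr.gte(0.27), 2)   # moderate-low
            .where(dnbr.gte(0.44), 3)   # moderate-high
            .where(dnbr.gte(0.66), 4)   # high
            .updateMask(burnt_mask)
            .rename('severity'))
```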
Procedia PDF Downloads 238
421 Nutrition Transition in Bangladesh: Multisectoral Responsiveness of Health Systems and Innovative Measures to Mobilize Resources Are Required for Preventing This Epidemic in Making
Authors: Shusmita Khan, Shams El Arifeen, Kanta Jamil
Abstract:
Background: The nutrition transition in Bangladesh has progressed across various relevant socio-demographic contexts. For a developing country like Bangladesh, it is generally believed that overnutrition is less prevalent than undernutrition. However, recent evidence suggests that a rapid shift is taking place in which overweight is overtaking underweight. With this rapid increase, it will be challenging for Bangladesh to achieve the global agenda of halting the rise in overweight and obesity. Methods: A secondary analysis of six successive national demographic and health surveys was performed to obtain trends in undernutrition and overnutrition among women of reproductive age. In addition, relevant national policy papers were reviewed to determine the country's readiness for a whole-of-systems approach to tackling this epidemic. Results: Over the last decade, the proportion of women with a low body mass index (BMI < 18.5), an indicator of undernutrition, has decreased markedly from 34% to 19%. However, the proportion of overweight women (BMI ≥ 25) increased alarmingly from 9% to 24% over the same period. If the WHO cutoff for public health action (BMI ≥ 23) is used, the proportion of overweight women has increased from 17% in 2004 to 39% in 2014. The increasing rate of obesity among women is a major challenge for obstetric practice, affecting both women and fetuses. In the long term, overweight women are also at risk of future obesity, diabetes, hyperlipidemia, hypertension, and heart disease. These diseases have a serious impact on health care systems. The costs associated with overweight and obesity involve direct and indirect costs. Direct costs include preventive, diagnostic, and treatment services related to obesity. Indirect costs relate to morbidity and mortality, including productivity losses. The Bangladesh Health Facility Survey shows that the country is not prepared to provide nutrition-related health services with respect to prevention, screening, management, and treatment. Therefore, if this nutrition transition is not addressed properly, Bangladesh will not be able to achieve the targets of the WHO's global NCD monitoring framework. Conclusion: Addressing this nutrition transition requires confronting 'malnutrition in all its forms' with integrated approaches. Whole-of-systems action is required at all levels, from improving multi-sectoral coordination to scaling up mainstreamed nutrition-specific and nutrition-sensitive interventions, keeping the health system in mind.
Keywords: nutrition transition, Bangladesh, health system, undernutrition, overnutrition, obesity
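The BMI cutoffs used above translate directly into a small classification helper. The sketch below applies both the standard overweight threshold (BMI ≥ 25) and the WHO public-health-action cutoff (BMI ≥ 23) cited in the abstract; the example weight and height are hypothetical.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight divided by height squared."""
    return weight_kg / height_m ** 2

def bmi_category(b):
    """Categories used in the abstract, including the WHO action cutoff of 23."""
    if b < 18.5:
        return "underweight (BMI < 18.5)"
    if b >= 25.0:
        return "overweight/obese (BMI >= 25)"
    if b >= 23.0:
        return "above WHO public-health action cutoff (BMI >= 23)"
    return "normal"

# Hypothetical example: 62 kg at 1.55 m gives BMI ~25.8 -> overweight.
print(bmi_category(bmi(62, 1.55)))
```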
Procedia PDF Downloads 288
420 Promoting 'One Health' Surveillance and Response Approach Implementation Capabilities against Emerging Threats and Epidemics Crisis Impact in African Countries
Authors: Ernest Tambo, Ghislaine Madjou, Jeanne Y. Ngogang, Shenglan Tang, Zhou XiaoNong
Abstract:
Implementing a national-to-community 'One Health' surveillance approach for mitigating human, animal, and environmental consequences offers great opportunities and added value for sustainable development and wellbeing. Global partnerships, policy commitment, and financial investment in the 'One Health' surveillance approach are much needed to address evolving threats and mitigate epidemic crises in African countries. The paper provides insights into how China-Africa health development cooperation can promote the 'One Health' surveillance approach in response advocacy and mitigation. China-Africa health development initiatives provide new prospects for guiding and advancing appropriate, evidence-based advocacy and mitigation strategies towards attaining Universal Health Coverage (UHC) and the Sustainable Development Goals (SDGs). Early, continuous, and timely collection of quality surveillance data and coordinated information-sharing practices for malaria and other diseases are demonstrated in Comoros, Zanzibar, Ghana, and Cameroon. Improved access to a variety of contextual sources and networks of data-sharing platforms is needed to guide evidence-based, tailored detection of and response to unusual hazardous events. Moreover, understanding threat and disease trends and delivering frontline or point-of-care responses are crucial to promoting the integrated and sustainable implementation of targeted local and national 'One Health' surveillance and response approaches. Importantly, operational guidelines are vital for increasing coherent financing and national workforce capacity-development mechanisms, and for strengthening participatory partnerships, collaboration, and monitoring strategies to achieve an effective global health agenda in Africa. At the same time, the reporting and dissemination of surveillance data streams should be enhanced so that they usefully inform policy decisions, health systems programming, and financial mobilization and prioritized allocation before, during, and after threat and epidemic crises, building on programme strengths and weaknesses. Thus, capitalizing on 'One Health' surveillance and response advocacy and mitigation implementation is timely for consolidating the African Union Agenda 2063 and Africa's renaissance capabilities and expectations.
Keywords: Africa, one health approach, surveillance, response
Procedia PDF Downloads 422
419 Dietary Anion-Cation Balance of Grass and Net Acid-Base Excretion in Urine of Suckler Cows
Authors: H. Scholz, P. Kuehne, G. Heckenberger
Abstract:
The Dietary Anion-Cation Balance (DCAB) of grass in grazing systems under German conditions tends to decrease from May until September, and DCAB values lower than 100 meq per kg dry matter are often measured. A lower DCAB in grass-based feeding systems can change the metabolic status of suckler cows and often results in an acidotic metabolism. Measurement of acid-base excretion in urine has proved to be a method for evaluating the acid-base status of dairy cows. The hypothesis was that metabolic imbalances could be identified by urine measurements in suckler cows. The farm study was conducted during the grazing seasons of 2017 and 2018 and involved 7 suckler cow farms in Germany. Suckler cows grazed during the whole period of the investigation and had no access to other feed components. Cows had free access to water, a salt block, and loose minerals. The dry matter of the grass was determined at 60 °C, and the samples were then analysed for energy and nutrient content and for the DCAB. Urine was collected in 50 ml glasses and analysed in the laboratory for net acid-base excretion (NSBA) and for creatinine and urea concentrations. Statistical analysis was performed by ANOVA with the fixed effects of farm (1-7), month (May until September), and number of lactations (1, 2, and ≥ 3) using SPSS Version 25.0 for Windows. An alpha of 0.05 was used for all statistical tests. During the grazing periods of 2017 and 2018, an average DCAB of 167 meq per kg DM was observed in the grass, with a very wide spread from -42 meq/kg to +439 meq/kg. Reference values for the DCAB have been described as between 150 meq and 400 meq per kg DM. A high chlorine content combined with a reduced potassium level led to the reduction in DCAB at the end of the grazing period. Between the DCAB of the grass and the NSBA in the urine of suckler cows, a Pearson correlation of r = 0.478 (p ≤ 0.001) and a Spearman correlation of r = 0.601 (p ≤ 0.001) were observed. For the monitoring of urine values of grazing suckler cows, the wide spread of the values poses a challenge for interpretation, especially when the DCAB is unknown. The influence of several feed components such as chlorine, sulfur, potassium, and sodium (the ions of the DCAB), as well as dry matter intake during the grazing period of suckler cows, should be taken into account in further research. The results obtained show that a decrease in the DCAB is related to a decrease in the NSBA in the urine of suckler cows. Monitoring of metabolic disturbances should include analysis of urine, blood, milk, and ruminal fluid.
Keywords: dietary anion-cation balance, DCAB, net acid-base excretion, NSBA, suckler cow, grazing period
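For reference, a minimal sketch of how a DCAB value can be computed from forage mineral analysis is shown below, assuming the commonly used form (Na+ + K+) - (Cl- + S2-) expressed in meq per kg dry matter. The milliequivalent weights and the example grass compositions are assumptions of this sketch, not data from the study.

```python
# Assumed milliequivalent weights in g/meq (atomic weight divided by valence).
MEQ_WEIGHT = {"Na": 0.0230, "K": 0.0391, "Cl": 0.0355, "S": 0.0160}

def dcab_meq_per_kg(na_g, k_g, cl_g, s_g):
    """DCAB = (Na + K) - (Cl + S), inputs in g per kg dry matter, result in meq/kg DM."""
    return (na_g / MEQ_WEIGHT["Na"] + k_g / MEQ_WEIGHT["K"]
            - cl_g / MEQ_WEIGHT["Cl"] - s_g / MEQ_WEIGHT["S"])

# Hypothetical grass compositions: potassium falls and chlorine rises late in the
# grazing season, which pushes the DCAB well below 100 meq/kg DM.
print(round(dcab_meq_per_kg(na_g=0.8, k_g=22.0, cl_g=9.0, s_g=2.5)))   # about 188
print(round(dcab_meq_per_kg(na_g=0.8, k_g=17.0, cl_g=10.0, s_g=2.5)))  # about 32
```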
Procedia PDF Downloads 151
418 The Healing 'Touch' of Music: A Neuro-Acoustics Approach to Understand Its Therapeutic Effect
Authors: Jagmeet S. Kanwal, Julia F. Langley
Abstract:
Music can heal the body, but a mechanistic understanding of this phenomenon is lacking. This study explores the effects of music presentation on neurologic and physiologic responses leading to metabolic changes in the human body. The mind and body co-exist in a corporeal entity, and within this framework, sickness ensues when the mind-body balance goes awry. It is further hypothesized that music has the capacity to directly reset this balance. Two lines of inquiry, taken together, can provide a mechanistic understanding of this phenomenon: 1) empirical evidence for a sound-sensitive pressure sensor system in the body, and 2) the notion of a "healing center" within the brain that is activated by specific patterns of sounds. From an acoustics perspective, music is spatially distributed as pressure waves ranging from a few cm to several meters in wavelength. These waves interact and propagate in three dimensions in unique ways, depending on the wavelength. Furthermore, music creates dynamically changing wave-fronts. Frequencies between 200 Hz and 1 kHz generate wavelengths that range from 5'6" to 1 foot. These dimensions are in the range of the body size of most people, making it plausible that these pressure waves can geometrically interact with the body surface and create distinct patterns of pressure stimulation across the skin surface. For humans, short-wavelength, high-frequency (> 200 Hz) sounds are best received via cochlear receptors. For low-frequency (< 200 Hz), long-wavelength sound vibrations, however, the whole body may act as an ideal receiver. A vast array of highly sensitive pressure receptors (Pacinian corpuscles) is present just beneath the skin surface, as well as in the tendons, bones, several organs in the abdomen, and the sexual organs. Per the available empirical evidence, these receptors contribute to music perception by allowing the whole body to function as a sound receiver, and knowledge of how they function is essential to fully understanding the therapeutic effect of music. Neuroscientific studies have established that music stimulates the limbic system, which can trigger states of anxiety, arousal, fear, and other emotions. These emotional states of brain activity play a crucial role in filtering top-down feedback from thoughts and bottom-up sensory inputs to the autonomic system, which automatically regulates bodily functions. Music likely exerts its pleasurable and healing effects by enhancing functional and effective connectivity and feedback mechanisms between brain regions that mediate reward, autonomic, and cognitive processing. Stimulation of pressure receptors under the skin by low-frequency music-induced sensations can activate multiple centers in the brain, including the amygdala, the cingulate cortex, and the nucleus accumbens. Melodies in the low (< 600 Hz) frequency range may augment auditory inputs after convergence of the pressure-sensitive inputs from the vagus nerve onto emotive processing regions within the limbic system. The integration of music-generated auditory and somato-visceral inputs may lead to a synergistic input to the brain that promotes healing. Thus, music can literally heal humans through "touch" as it energizes the brain's autonomic system to restore homeostasis.
Keywords: acoustics, brain, music healing, pressure receptors
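The wavelength figures quoted above follow from the relation lambda = c / f with the speed of sound in air taken as roughly 343 m/s at room temperature. The short sketch below roughly reproduces them for a few frequencies; the frequency list is arbitrary.

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at about 20 degC

def wavelength_m(frequency_hz):
    """Acoustic wavelength lambda = c / f."""
    return SPEED_OF_SOUND_M_S / frequency_hz

for f in (60, 200, 600, 1000):
    lam = wavelength_m(f)
    print(f"{f:>5} Hz -> {lam:5.2f} m ({lam * 3.281:4.1f} ft)")
```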
Procedia PDF Downloads 167
417 Psychophysiological Adaptive Automation Based on Fuzzy Controller
Authors: Liliana Villavicencio, Yohn Garcia, Pallavi Singh, Luis Fernando Cruz, Wilfrido Moreno
Abstract:
Psychophysiological adaptive automation is a concept that combines human physiological data and computer algorithms to create personalized interfaces and experiences for users. This approach aims to enhance human learning by adapting to individual needs and preferences and by optimizing the interaction between humans and machines. According to neuroscience, working memory demand during the learning process changes when the student is learning a new subject or topic or managing and/or fulfilling a specific task goal. A sudden increase in working memory demand modifies the student's level of attention, engagement, and cognitive load. The proposed psychophysiological adaptive automation system adapts the task requirements to optimize cognitive load, the process output variable, by monitoring the student's brain activity. Cognitive load changes according to the student's previous knowledge, the type of task, the difficulty level of the task, and the overall psychophysiological state of the student. Scaling the measured cognitive load as low, medium, or high, the system assigns a difficulty level to the next task according to the ratio between the previous task's difficulty level and the student's stress. For instance, if a student becomes stressed or overwhelmed during a particular task, the system detects this through signal measurements such as brain waves, heart rate variability, or other psychophysiological variables and adjusts the task difficulty level accordingly. Engagement and stress are treated as internal variables of the hypermedia system, which selects among three different types of instructional material. This work assesses the feasibility of a fuzzy controller that tracks a student's physiological responses and adjusts the learning content and pace accordingly. Using an industrial automation approach, the proposed fuzzy logic controller is based on linguistic rules that complement the instrumentation of the system to monitor and control the delivery of instructional material to the students. The test results show that the implemented fuzzy controller can satisfactorily regulate the delivery of academic content based on working memory demand without compromising students' health. This work has potential applications in the instructional design of virtual reality environments for training and education.
Keywords: fuzzy logic controller, hypermedia control system, personalized education, psychophysiological adaptive automation
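A minimal, dependency-free sketch of the kind of rule-based fuzzy inference described above is given below: triangular membership functions fuzzify normalized cognitive-load and stress readings, a small rule base fires, and a weighted average of singleton consequents yields a difficulty adjustment for the next task. The membership breakpoints, rules, and output steps are assumptions for illustration, not the controller actually implemented in this work.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify_load(load):      # cognitive load scaled to 0..1
    return {"low": tri(load, -0.5, 0.0, 0.5),
            "medium": tri(load, 0.2, 0.5, 0.8),
            "high": tri(load, 0.5, 1.0, 1.5)}

def fuzzify_stress(stress):  # stress scaled to 0..1
    return {"low": tri(stress, -0.5, 0.0, 0.5),
            "high": tri(stress, 0.5, 1.0, 1.5)}

# Singleton consequents for the next task's difficulty step (assumed values).
STEP = {"decrease": -1.0, "hold": 0.0, "increase": +1.0}

def next_difficulty_step(load, stress):
    l, s = fuzzify_load(load), fuzzify_stress(stress)
    rules = [                                      # (firing strength, consequent)
        (min(l["high"], s["high"]), "decrease"),   # overloaded and stressed
        (min(l["medium"], s["low"]), "hold"),      # in the productive zone
        (min(l["low"], s["low"]), "increase"),     # under-challenged
        (min(l["low"], s["high"]), "hold"),        # stressed for other reasons
    ]
    num = sum(w * STEP[c] for w, c in rules)
    den = sum(w for w, _ in rules) or 1.0
    return num / den                               # weighted-average defuzzification

print(next_difficulty_step(load=0.85, stress=0.7))  # negative -> ease the next task
```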
Procedia PDF Downloads 82
416 Bank Internal Controls and Credit Risk in Europe: A Quantitative Measurement Approach
Authors: Ellis Kofi Akwaa-Sekyi, Jordi Moreno Gené
Abstract:
Managerial actions that negatively profile banks and impair corporate reputation are addressed through effective internal control systems. Disregard for acceptable standards and procedures for granting credit has affected bank loan portfolios and could be cited as a factor in the crises in some European countries. The study intends to determine the effectiveness of internal control systems, investigate whether perceived agency problems exist on the part of board members, and establish the relationship between internal controls and credit risk among listed banks in the European Union. Drawing theoretical support from behavioural compliance and agency theories, about seventeen internal control variables (drawn from the revised COSO framework) as well as bank-specific, country, stock market, and macro-economic variables will be involved in the study. A purely quantitative approach will be employed to model internal control variables covering the control environment, risk management, control activities, information and communication, and monitoring. Panel data from 2005-2014 on listed banks from 28 European Union countries will be used for the study. Hypotheses will be tested, and Generalized Least Squares (GLS) regression will be run to establish the relationship between the dependent and independent variables. The Hausman test will be used to select between the random and fixed effects models. It is expected that listed banks will have sound internal control systems, but their effectiveness cannot be confirmed. A perceived agency problem on the part of the board of directors is expected to be confirmed. The study expects a significant effect of internal controls on credit risk. The study will uncover another perspective on internal controls: not only as an operational risk issue but as a credit risk issue too. Banks should be mindful that observing effective internal control systems is an ethical and socially responsible act, since the collapse (crisis) of financial institutions as a result of excessive default is a major source of contagion. This study deviates from the usual primary-data approach to measuring internal control variables and instead models internal control variables quantitatively for the panel data. Thus, a grey area in approaching the revised COSO framework for internal controls is opened for further research. Most bank failures and crises could be averted if effective internal control systems were religiously adhered to.
Keywords: agency theory, credit risk, internal controls, revised COSO framework
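The estimation strategy described above (panel regression with a Hausman test to choose between random and fixed effects) can be sketched as follows. The example uses the linearmodels package as an assumed tooling choice and a simulated 28-bank, 2005-2014 panel with invented variable names; it illustrates the workflow under those assumptions and is not the authors' model or data.

```python
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS, RandomEffects
from scipy import stats

# Hypothetical balanced panel: 28 banks x 10 years; variable names are illustrative.
rng = np.random.default_rng(0)
idx = pd.MultiIndex.from_product([range(28), range(2005, 2015)], names=["bank", "year"])
df = pd.DataFrame({
    "control_env": rng.normal(size=len(idx)),   # internal-control score (COSO component)
    "monitoring":  rng.normal(size=len(idx)),   # internal-control score (COSO component)
    "bank_size":   rng.normal(size=len(idx)),   # bank-specific control variable
}, index=idx)
df["credit_risk"] = (-0.4 * df["control_env"] - 0.2 * df["monitoring"]
                     + 0.3 * df["bank_size"] + rng.normal(size=len(idx)))

exog = df[["control_env", "monitoring", "bank_size"]]
fe = PanelOLS(df["credit_risk"], exog, entity_effects=True).fit()  # fixed effects
re = RandomEffects(df["credit_risk"], exog).fit()                  # random effects (GLS)

# Hausman statistic: H = (b_fe - b_re)' [V_fe - V_re]^-1 (b_fe - b_re) ~ chi2(k).
# In small samples V_fe - V_re may not be positive definite; a real analysis
# should check this or use a robust variant of the test.
b = (fe.params - re.params).to_numpy()
v = (fe.cov - re.cov).to_numpy()
H = float(b @ np.linalg.inv(v) @ b)
p_value = stats.chi2.sf(H, df=len(b))
print(f"Hausman H = {H:.2f}, p = {p_value:.3f} -> "
      f"{'fixed' if p_value < 0.05 else 'random'} effects preferred")
```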
Procedia PDF Downloads 320