Search results for: National Health Information System
1746 A Novel Harmonic Compensation Algorithm for High Speed Drives
Authors: Lakdar Sadi-Haddad
Abstract:
The past few years have seen a resurgence of interest in the study of very high-speed electrical drives; an inventory of the scientific papers and patents dealing with the subject confirms its relevance. In fact, the democratization of magnetic bearing technology is at the origin of recent developments in high-speed applications. The main advantage of these machines is a much higher power density than the state of the art. Nevertheless, particular attention should be paid to the design of the inverter as well as to its control and command. The surface-mounted permanent magnet synchronous machine is the most appropriate technology for addressing high-speed issues. However, it has the drawback of requiring a carbon sleeve to retain the magnets, which could otherwise tear off under the centrifugal forces generated at the rotor periphery. Carbon fiber is well known for its mechanical properties, but it is a poor heat conductor, which results in very poor evacuation of the eddy current losses induced in the magnets by the time and space stator harmonics. The three-phase inverter is the main harmonic source causing eddy currents in the magnets. In high-speed applications such harmonics are harmful because, on the one hand, the characteristic impedance is very low and, on the other hand, the ratio between the switching frequency and that of the fundamental is much lower than in the state of the art. To minimize the impact of these harmonics, a first lever is to use a modulation strategy producing low harmonic distortion, while a second is to introduce a sine filter between the inverter and the machine to smooth the voltage and current waveforms applied to the machine. Nevertheless, in very high-speed machines the interaction of the processes mentioned above may introduce particular harmonics that can irreversibly damage the system: harmonics at the resonant frequency, harmonics at the shaft mode frequency, subharmonics, etc.
Some studies address these issues but treat the phenomena with separate solutions (a specific modulation strategy, active damping methods, etc.). The purpose of this paper is to present a complete new active harmonic compensation algorithm, based on an improvement of the standard vector control, as a global solution to all these issues. This presentation will be based on a complete theoretical analysis of the processes leading to the generation of such undesired harmonics. Then a state of the art of available solutions will be provided before developing the content of the new active harmonic compensation algorithm. The study will be completed by a validation using simulations and a practical case on a high-speed machine.
Keywords: active harmonic compensation, eddy current losses, high speed machine
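The low switching-to-fundamental frequency ratio mentioned above determines where the PWM sideband harmonics land relative to the fundamental. As a rough, hedged illustration (the machine and inverter figures below are assumed example values, not data from the paper), the sideband frequencies m*f_sw ± n*f1 can be enumerated directly:

```python
# Illustrative sketch: enumerate PWM sideband harmonic frequencies
# m*f_sw +/- n*f1. All numeric values below are assumed examples,
# not figures from the paper.

def pwm_sidebands(f_sw, f1, m_max=2, n_max=3):
    """Return sorted positive sideband frequencies m*f_sw +/- n*f1."""
    freqs = set()
    for m in range(1, m_max + 1):
        for n in range(0, n_max + 1):
            for f in (m * f_sw + n * f1, m * f_sw - n * f1):
                if f > 0:
                    freqs.add(f)
    return sorted(freqs)

f1 = 2000.0      # fundamental of a 2-pole machine at 120,000 rpm (assumed)
f_sw = 16000.0   # inverter switching frequency (assumed)
print(f"switching-to-fundamental ratio: {f_sw / f1:.0f}")
print(pwm_sidebands(f_sw, f1)[:5])   # lowest sidebands sit only a few multiples of f1 apart
```

With such a low ratio, the lowest sidebands are close enough to the fundamental, and to possible filter-resonance or shaft-mode frequencies, that a compensation strategy must treat them together rather than in isolation.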
Procedia PDF Downloads 395
1745 Purification of Bacillus Lipopeptides for Diverse Applications
Authors: Vivek Rangarajan, Kim G. Clarke
Abstract:
Bacillus lipopeptides are biosurfactants with wide-ranging applications in the medical, food, agricultural, environmental and cosmetic industries. They are produced as a mix of three families, surfactin, iturin and fengycin, each comprising a large number of homologues of varying functionalities. Consequently, the method and degree of purification of the lipopeptide cocktail become particularly important if the functionality of the lipopeptide end-product is to be maximized for the specific application. However, downstream processing of Bacillus lipopeptides is particularly challenging due to the subtle variations observed in the different lipopeptide homologues and isoforms. To date, the most frequently used lipopeptide purification operations have been acid precipitation, solvent extraction, membrane ultrafiltration, adsorption and size exclusion. RP-HPLC (reverse phase high pressure liquid chromatography) also has potential for fractionation of the lipopeptide homologues. In the studies presented here, membrane ultrafiltration and RP-HPLC were evaluated for lipopeptide purification to different degrees of purity for maximum functionality. Batch membrane ultrafiltration using 50 kDa polyether sulphone (PES) membranes resulted in lipopeptide recovery of about 68% for surfactin and 82% for fengycin. The recovery was further improved to 95% by using size-conditioned lipopeptide micelles. The conditioning of lipopeptides with Ca2+ ions resulted in uniformly sized micelles with an average size of 96.4 nm and a polydispersity index of 0.18. The size conditioning also facilitated removal of impurities (molecular weights ranging between 2335 and 3500 Da) through operation of the system in dia-filtration mode, in a way similar to salt removal from protein by dialysis. The resultant purified lipopeptide was devoid of macromolecular impurities and could ideally suit applications in the cosmetic and food industries.
Enhanced purification using RP-HPLC was carried out in an analytical C18 column, with the aim of fractionating the lipopeptides into their constituent homologues. The column was eluted with a mobile phase comprising acetonitrile and water, over an acetonitrile gradient of 35% to 80% in 70 minutes. The gradient elution program resulted in as many as 41 fractions of individual lipopeptide homologues. The efficacy test of these fractions against fungal phytopathogens showed that the first 21 fractions, identified as homologues of iturins and fengycins, displayed the highest antifungal activities, suitable for biocontrol in the agricultural industry. Thus, in the current study, the downstream processing of lipopeptides leading to tailor-made products for selective applications was demonstrated using two major downstream unit operations.
Keywords: bacillus lipopeptides, membrane ultrafiltration, purification, RP-HPLC
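The 35% to 80% acetonitrile gradient over 70 minutes described above is a simple linear ramp. A minimal sketch (only the endpoints and duration come from the abstract; the interpolation itself is our illustration):

```python
# Hedged sketch of the linear solvent gradient described in the abstract:
# acetonitrile rises from 35% to 80% of the mobile phase over 70 minutes.

def acetonitrile_fraction(t_min, start=35.0, end=80.0, duration=70.0):
    """Percent acetonitrile in the mobile phase at time t (linear ramp)."""
    if t_min <= 0:
        return start
    if t_min >= duration:
        return end
    return start + (end - start) * t_min / duration

for t in (0, 35, 70):
    print(t, acetonitrile_fraction(t))
```

Fractions eluting early (low acetonitrile) correspond to the more polar homologues, which is consistent with the first 21 fractions being iturins and fengycins.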
Procedia PDF Downloads 205
1744 Partisan Agenda Setting in Digital Media World
Authors: Hai L. Tran
Abstract:
Previous research on agenda setting effects has often focused on the top-down influence of the media at the aggregate level, while overlooking the capacity of audience members to select media and content to fit their individual dispositions. The decentralized characteristics of online communication and digital news create more choices and greater user control, thereby enabling each audience member to seek out a unique blend of media sources, issues, and elements of messages and to mix them into a coherent individual picture of the world. This study examines how audiences use media differently depending on their prior dispositions, thereby making sense of the world in ways that are congruent with their preferences and cognitions. The current undertaking is informed by theoretical frameworks from two distinct lines of scholarship. According to the ideological migration hypothesis, individuals choose to live in communities with ideologies like their own to satisfy their need to belong. One tends to move away from ZIP codes that are incongruent and toward those that are more aligned with one's ideological orientation. This geographical division along ideological lines has been documented in social psychology research. As an extension of agenda setting, the agendamelding hypothesis argues that audiences seek out information in attractive media and blend it into a coherent narrative that fits with a common agenda shared by others who think as they do and who communicate with them about issues of public concern. In other words, individuals, through their media use, identify themselves with a group or community that they want to join. Accordingly, the present study hypothesizes that because ideology plays a role in pushing people toward a physical community that fits their need to belong, it also leads individuals to receive an idiosyncratic blend of media and to be influenced by such selective exposure in deciding which issues are more relevant.
Consequently, the individualized focus of media choices impacts how audiences perceive political news coverage and what they know about political issues. The research project utilizes recent data from the American Trends Panel survey conducted by the Pew Research Center to explore the nuanced nature of agenda setting at the individual level amid heightened polarization. Hypothesis testing is performed with both nonparametric and parametric procedures, including regression and path analysis. This research attempts to explore the media-public relationship from a bottom-up approach, considering the ability of active audience members to select among media in a larger process that entails agenda setting. It should encourage agenda-setting scholars to further examine effects at the individual, rather than aggregate, level. In addition to its theoretical contributions, the study's findings are useful for media professionals in building and maintaining relationships with the audience amid changes in market share due to the spread of digital and social media.
Keywords: agenda setting, agendamelding, audience fragmentation, ideological migration, partisanship, polarization
Procedia PDF Downloads 59
1743 The Social Ecology of Serratia entomophila: Pathogen of Costelytra giveni
Authors: C. Watson, T. Glare, M. O'Callaghan, M. Hurst
Abstract:
The endemic New Zealand grass grub (Costelytra giveni, Coleoptera: Scarabaeidae) is an economically significant grassland pest in New Zealand. Due to its impact on production within the agricultural sector, one of New Zealand's primary industries, several methods are being used to either control or prevent the establishment of new grass grub populations in pasture. One such method involves the use of a biopesticide based on the bacterium Serratia entomophila. This species is one of the causative agents of amber disease, a chronic disease of the larvae which results in death via septicaemia after approximately 2 to 3 months. The ability of S. entomophila to cause amber disease is dependent upon the presence of the amber disease associated plasmid (pADAP), which encodes the key virulence determinants required for the establishment and maintenance of the disease. Following the collapse of grass grub populations within the soil, resulting from either natural population build-up or application of the bacteria, non-pathogenic plasmid-free Serratia strains begin to predominate in the soil. Whilst the interactions between S. entomophila and grass grub larvae are well studied, less is known about the interactions between plasmid-bearing and plasmid-free strains, particularly the potential impact of these interactions upon the efficacy of an applied biopesticide. Using a range of constructed strains with antibiotic tags, in vitro (broth culture) and in vivo (soil and larvae) experiments were conducted using inoculants comprising differing ratios of isogenic pathogenic and non-pathogenic Serratia strains, enabling the relative growth of pADAP+ and pADAP- strains under competition conditions to be assessed.
In nutrient-rich broth culture, the non-pathogenic pADAP- strain outgrew the pathogenic pADAP+ strain by day 3 when inoculated in equal quantities, and by day 5 when applied as the minority inoculant; however, there was an overall gradual decline in the number of viable bacteria of both strains over a 7-day period. Similar results were obtained in additional experiments using the same strains and continuous broth cultures re-inoculated at 24-hour intervals, although in these cultures the viable cell count did not diminish over the 7-day period. When the same ratios were assessed in soil microcosms with limited available nutrients, the strains remained relatively stable over a 2-month period. Additionally, in vivo grass grub co-infection assays using the same ratios of tagged Serratia strains gave similar results to those observed in soil, but there was also evidence of horizontal transfer of pADAP from the pathogenic to the non-pathogenic strain within the larval gut after a period of 4 days. Whilst the influence of competition is more apparent in broth cultures than within soil or larvae, further testing is required to determine whether this competition between pathogenic and non-pathogenic Serratia strains has any influence on efficacy and disease progression, and how it may impact the ability of S. entomophila to cause amber disease within grass grub larvae when applied as a biopesticide.
Keywords: biological control, entomopathogen, microbial ecology, New Zealand
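The broth-culture outcome above, where the plasmid-free strain overtakes even from a minority start, can be caricatured with a simple exponential growth model. This is only an illustrative sketch: the growth rates, starting ratio, and time scale below are assumed for illustration, not measured values from the study; it merely shows how a small growth-rate advantage (e.g. from the metabolic cost of carrying pADAP) lets a minority strain predominate.

```python
import math

# Illustrative two-strain competition sketch. All parameters are assumed
# for illustration; the paper reports the empirical outcome, not these
# numbers. The plasmid-free (pADAP-) strain is given a slightly higher
# growth rate to represent the cost of plasmid carriage.

def abundances(n0_plus, n0_minus, r_plus, r_minus, t):
    """Exponential-phase abundances of pADAP+ and pADAP- strains at time t (hours)."""
    return n0_plus * math.exp(r_plus * t), n0_minus * math.exp(r_minus * t)

# pADAP- starts as the minority (10%) but grows 5% faster per hour (assumed)
plus, minus = abundances(9e5, 1e5, 0.50, 0.55, t=48)
print(minus > plus)   # by ~2 days the plasmid-free strain predominates
```

Reversing the inequality at t=0 versus t=48 mirrors the qualitative pattern reported for the broth cultures, where competition effects were most apparent.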
Procedia PDF Downloads 156
1742 Development of a Stable RNAi-Based Biological Control for Sheep Blowfly Using Bentonite Polymer Technology
Authors: Yunjia Yang, Peng Li, Gordon Xu, Timothy Mahony, Bing Zhang, Neena Mitter, Karishma Mody
Abstract:
Sheep flystrike is one of the most economically important diseases affecting the Australian sheep and wool industry (>356M annually). Currently, control of Lucillia cuprina relies almost exclusively on chemical controls, and the parasite has developed resistance to nearly all control chemicals used in the past. It is therefore critical to develop an alternative solution for the sustainable control and management of flystrike. RNA interference (RNAi) technologies have been successfully explored in multiple animal industries for developing parasite controls. This research project aims to develop an RNAi-based biological control for sheep blowfly. Double-stranded RNA (dsRNA) has already proven successful against viruses, fungi and insects. However, the environmental instability of dsRNA is a major bottleneck for successful RNAi. Bentonite polymer (BenPol) technology can overcome this problem, as it can be tuned for the controlled release of dsRNA in the challenging pH environment of the blowfly larval gut, prolonging its exposure to and uptake by target cells. To investigate the potential of BenPol technology for dsRNA delivery, four different BenPol carriers were tested for their dsRNA loading capabilities, and three of them were found capable of affording dsRNA stability at multiple temperatures (4°C, 22°C, 40°C, 55°C) in sheep serum. Based on the stability results, dsRNA from potential target genes was loaded onto BenPol carriers and tested in larval feeding assays, with three genes showing knockdown. Meanwhile, a primary blowfly embryo cell line (BFEC) derived from L. cuprina embryos was successfully established, intended as an insect cell model for preliminary assessment and screening of RNAi efficacy. The results of this study establish that dsRNA is stable when loaded on BenPol particles, unlike naked dsRNA, which is rapidly degraded in sheep serum.
The stable nanoparticle delivery system offered by BenPol technology can protect and increase the inherent stability of dsRNA molecules at higher temperatures in a complex biological fluid like serum, providing promise for its future use in enhancing animal protection.
Keywords: flystrike, RNA interference, bentonite polymer technology, Lucillia cuprina
Procedia PDF Downloads 92
1741 Effect of Low to Moderate Altitude on Football Performance: An Analysis of Thirteen Seasons in the South African Premier Soccer League
Authors: Khatija Bahdur, Duane Dell’Oca
Abstract:
There is limited information on how altitude impacts performance in team sport. Most altitude research in football has been conducted at high elevation (>2500 m), leaving a gap in understanding of whether low to moderate altitude affects performance. South African Premier Soccer League (PSL) fixtures entail matches played at altitudes from sea level to 1700 m above mean sea level. Although coaches highlight the effect of altitude on performance outcomes in matches, further research is needed to establish whether altitude does impact match results. Greater insight into if and how altitude impacts performance in the PSL would assist coaches in deciding whether and how to incorporate altitude in their planning. The purpose of this study is to fill this gap through a retrospective analysis of PSL matches. This quantitative study is based on a descriptive analysis of 181 PSL matches involving one sea-level-based team, taking place over a period of thirteen seasons. The following data were obtained: altitude at which the match was played, match result, timing of goals, and timing of substitutions. Altitude was classified in two ways: inland (>500 m) versus coastal (<500 m), and further subdivided into narrower categories (<500 m, 500-1000 m, 1000-1300 m, 1300-1500 m, >1500 m). The analysis included a two-sample t-test to determine differences in total goals scored and timing of goals between inland and coastal matches, and the chi-square test to identify the significance of altitude on match results. The level of significance was set at an alpha level of 0.05. Match results were significantly affected by altitude and by the level of altitude, with inland teams most likely to win when playing at inland venues (p=0.000). The proportion of draws was slightly higher at the coast. At altitudes of 500-1000 m, 1300-1500 m, and 1500-1700 m, a greater percentage of matches ended in coastal-team wins rather than draws.
The timing of goals varied based on the team's base altitude and the match elevation. The most significant differences were between minutes 36-40 (p=0.023), 41-45 (p=0.000) and 50-65 (p=0.000). When inland teams' matches were broken down into the different altitude categories, greater differences were highlighted. Inland teams scored more goals per minute between minutes 10-20 (p=0.009), 41-45 (p=0.003) and 50-65 (p=0.015). The total number of goals scored per match also differed across altitudes for both inland teams (p=0.000) and coastal teams (p=0.006). Coastal teams made significantly more substitutions when playing at altitude (p=0.034), although there were no significant differences when comparing the different altitude categories. The timing of all three substitutions, however, did vary significantly at the different altitudes. There were no significant differences in the timing or number of substitutions for inland teams. Match results and timing of goals are influenced by altitude, with differences between levels of altitude also playing a role. The trends indicate that inland teams win more matches when playing at altitude against coastal teams, and that they score more goals just prior to half-time and in the first quarter of the second half.
Keywords: coastal teams, inland teams, timing of goals, results, substitutions
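The chi-square test used above for the venue-versus-result comparison can be sketched in a few lines of plain Python. The contingency counts below are hypothetical, not the study's data; only the test procedure and the 0.05 significance level come from the abstract.

```python
# Sketch of a chi-square test of independence for match results.
# The counts are hypothetical illustration data, not the study's.

def chi_square(table):
    """Chi-square statistic for an r x c contingency table."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    grand = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / grand
            stat += (obs - exp) ** 2 / exp
    return stat

# venue (inland/coastal) x result (inland win / other) -- hypothetical counts
table = [[60, 30], [35, 56]]
stat = chi_square(table)
print(round(stat, 2), stat > 3.841)  # 3.841 = chi-square critical value, df=1, alpha=0.05
```

Comparing the statistic against the df=1 critical value mirrors the significance decision at alpha 0.05 described in the analysis.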
Procedia PDF Downloads 131
1740 Teaching Timber: The Role of the Architectural Student and Studio Course within an Interdisciplinary Research Project
Authors: Catherine Sunter, Marius Nygaard, Lars Hamran, Børre Skodvin, Ute Groba
Abstract:
Globally, the construction and operation of buildings contribute up to 30% of annual greenhouse gas emissions. In addition, the building sector is responsible for approximately a third of global waste. In this context, the utilization of renewable resources in buildings, especially materials that store carbon, will play a significant role in the growing city. These are two reasons for introducing wood as a building material of growing relevance. A third is the potential economic value in countries with a forest industry that is not currently used to capacity. In 2013, a four-year interdisciplinary research project titled "Wood Be Better" was created, with the principal goal of producing and publicising knowledge that would facilitate increased use of wood in buildings in urban areas. The research team consisted of architects, engineers, wood technologists and mycologists, from both research institutions and industrial organisations. Five structured work packages were included in the initial research proposal. Work package 2 was titled "Design-based research" and proposed using architecture master courses as laboratories for systematic architectural exploration. The aim was twofold: to provide students with an interdisciplinary team of experts from consultancies and producers, as well as teachers and researchers, who could offer the latest information on wood technologies; and, at the same time, to have the studio course test the effects of the use of wood on the functional, technical and tectonic quality of different architectural projects on an urban scale, providing results that could be fed back into the research material. The aim of this article is to examine the successes and failures of this pedagogical approach in an architecture school, as well as the opportunities for greater integration between academic research projects, industry experts and studio courses in the future.
This will be done through a set of qualitative interviews with researchers, teaching staff and students of the studio courses held each semester since spring 2013. These will investigate the value of the various experts to the course; the different themes of each course; the response to the urban scale, architectural form and construction detail; the effect of working towards the goals of a research project; and the value of the studio projects to the research. In addition, six sample projects will be presented as case studies. These will show how the projects related to the research and could be collected and further analysed, the innovative solutions that were developed during the courses, the different architectural expressions that were enabled by timber, and how projects were used as an interdisciplinary testing ground for integrated architectural and engineering solutions between the participating institutions. The conclusion will reflect on the original intentions of the studio courses, the opportunities and challenges faced by students, researchers and teachers, the educational implications, and the transparent and inclusive discourse between the architectural researcher, the architecture student and the interdisciplinary experts.
Keywords: architecture, interdisciplinary, research, studio, students, wood
Procedia PDF Downloads 312
1739 Transition from Linear to Circular Business Models with Service Design Methodology
Authors: Minna-Maari Harmaala, Hanna Harilainen
Abstract:
Estimates of the economic value of transitioning to circular economy models vary, but the transition has been estimated to represent $1 trillion of new business for the global economy. In Europe alone, estimates claim that adopting circular-economy principles could not only have environmental and social benefits but also generate a net economic benefit of €1.8 trillion by 2030. Proponents of a circular economy argue that it offers a major opportunity to increase resource productivity, decrease resource dependence and waste, and increase employment and growth. A circular system could improve competitiveness and unleash innovation. Yet most companies are not capturing these opportunities, and thus even abundant circular opportunities remain uncaptured, though they would seem inherently profitable. Service design, in broad terms, relates to developing an existing or a new service or service concept with emphasis on the customer experience from the onset of the development process. Service design may even mean starting from scratch and co-creating the service concept entirely with the help of customer involvement. Service design methodologies provide a structured way of incorporating customer understanding and involvement into the process of designing better services with better resonance to customer needs. A business model is a depiction of how a company creates, delivers, and captures value, i.e., how it organizes its business. The process of business model development, adjustment or modification is also called business model innovation, and innovating business models has become a part of business strategy. Our hypothesis is that, in addition to linear models still being easier to adopt and often having lower threshold costs, companies lack an understanding of how circular models can be adopted into their business and of how willing and ready customers will be to adopt new circular business models.
In our research, we use robust service design methodology to develop circular economy solutions with two case study companies. The aim of the process is not only to develop the service concepts and portfolio but also to demonstrate that the willingness to adopt circular solutions exists in the customer base. In addition to service design, we employ business model innovation methods to further develop, test, and validate the new circular business models. The results clearly indicate that amongst the customer groups there are specific customer personas that are willing to adopt circular solutions and in fact expect the companies to take a leading role in the transition towards a circular economy. At the same time, there is a group of indifferent customers, to whom the idea of circularity provides no added value. In addition, the case studies clearly show what changes the adoption of circular economy principles brings to the existing business model and how they can be integrated.
Keywords: business model innovation, circular economy, circular economy business models, service design
Procedia PDF Downloads 135
1738 Motivational Profiles of the Entrepreneurial Career in Spanish Businessmen
Authors: Magdalena Suárez-Ortega, M. Fe. Sánchez-García
Abstract:
This paper focuses on the analysis of the motivations that lead people to undertake and consolidate their businesses. It is addressed within the framework of planned behavior theory, which recognizes the importance of the social environment and cultural values, both in the decision to undertake business and in business consolidation. Similarly, it is also based on theories of career development, which emphasize the importance of career management competencies and their connections to other vital aspects of people's lives, including their roles within their families and other personal activities. This connects directly with the impact of entrepreneurship on the career and the professional-personal project of each individual. This study is part of the project titled Career Design and Talent Management (Ministry of Economy and Competitiveness of Spain, State Plan 2013-2016 Excellence, Ref. EDU2013-45704-P). The aim of the study is to identify and describe entrepreneurial competencies and motivational profiles in a sample of 248 Spanish entrepreneurs, considering both the consolidated profile and the profile in transition. In order to obtain the information, the Motivation and Conditioners of the Entrepreneurial Career (MCEC) questionnaire was applied. It consists of 67 items and includes four scales (E1, conflicts in conciliation; E2, satisfaction with the career path; E3, motivations to undertake; E4, guidance needs). Cluster analysis (a mixed method combining k-means clustering with a hierarchical method) was carried out, characterizing the group profiles according to the categorical variables (chi-square, p = 0.05) and the quantitative variables (ANOVA). The results allowed us to characterize three motivational profiles that differ in motivation, degree of conciliation between personal and professional life, degree of conflict in conciliation, level of career satisfaction, and orientation needs (in the entrepreneurial project and life-career).
The first profile is formed by extrinsically motivated entrepreneurs who are professionally satisfied and without conflict between vital roles. The second profile acts with intrinsic motivation, also associated with family models, and although it shows satisfaction with the professional career, it experiences high conflict between family and professional life. The third is composed of entrepreneurs with high extrinsic motivation and professional dissatisfaction who, at the same time, feel conflict in their professional life from the effect of personal roles. Ultimately, the analysis allowed us to link the kinds of entrepreneurs to different levels of motivation, satisfaction, needs, and articulation of professional and personal life, showing characterizations associated with the use of time for leisure and the care of the family. No associations related to gender, age, activity sector, environment (rural, urban, virtual), or the use of time for domestic tasks were identified. The model obtained and its implications for the design of training and orientation actions for entrepreneurs are also discussed.
Keywords: motivation, entrepreneurial career, guidance needs, life-work balance, job satisfaction, assessment
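The k-means step of the mixed clustering procedure described above can be sketched minimally in one dimension. The "scores" below are invented illustration data, not MCEC responses, and the deterministic initialization is our simplification of the usual random-restart practice.

```python
# Minimal 1-D k-means sketch in the spirit of the mixed clustering step
# described in the abstract. The scores are invented illustration data
# (not MCEC responses); the deterministic init is our simplification.

def kmeans_1d(values, k, iters=100):
    """Cluster 1-D values into k groups; returns sorted cluster means (k >= 2)."""
    srt = sorted(values)
    # spread initial centers evenly across the sorted range
    centers = [srt[i * (len(srt) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        new = [sum(c) / len(c) if c else centers[i] for i, c in enumerate(clusters)]
        if new == centers:   # converged: assignments no longer change
            break
        centers = new
    return sorted(centers)

# hypothetical motivation scores forming three latent profiles
scores = [1.0, 1.2, 0.9, 4.8, 5.1, 5.0, 9.0, 8.7, 9.2]
print(kmeans_1d(scores, k=3))
```

In the study, the k-means result is refined against a hierarchical solution and the resulting groups are then profiled with chi-square and ANOVA tests; this sketch covers only the partitioning step.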
Procedia PDF Downloads 301
1737 Optimization of Heat Insulation Structure and Heat Flux Calculation Method of Slug Calorimeter
Authors: Zhu Xinxin, Wang Hui, Yang Kai
Abstract:
Heat flux is one of the most important test parameters in ground thermal protection tests. The slug calorimeter is often selected as the main heat flux sensor in arc wind tunnel tests due to its convenience and low cost. However, because of excessive lateral heat transfer and shortcomings of the calculation method, the heat flux measurement error of the slug calorimeter is large. To enhance measurement accuracy, the heat insulation structure and the heat flux calculation method of the slug calorimeter were improved. A heat transfer model of the slug calorimeter was built according to the energy conservation principle. Based on this model, an insulating sleeve with a hollow structure was designed, which greatly decreased lateral heat transfer. The slug with its hollow insulating sleeve was then encapsulated in a package shell. The improved insulation structure reduced heat loss and ensured that the heat transfer characteristics were almost the same during calibration and testing. A heat flux calibration test was carried out in an arc lamp system, and the results show that the test accuracy and precision of the slug calorimeter are greatly improved. Meanwhile, a simulation model of the slug calorimeter was built, and heat flux values in different temperature-rise time periods were calculated with it. The results show that extracting the temperature rise rate as early as possible results in a smaller heat flux calculation error. The effect of different thermal contact resistances on the calculation error was then analyzed with the simulation model; the contact resistance between the slug and the insulating sleeve was identified as the main influencing factor. A direct comparison calibration correction method was proposed based on heat flux calibration alone.
A numerical calculation correction method was also proposed, based on the heat flux calibration and the simulation model of the slug calorimeter, once the contact resistance between the slug and the insulating sleeve had been determined. The simulation and test results show that both methods can greatly reduce the heat flux measurement error. Finally, the improved slug calorimeter was tested in the arc wind tunnel. The test results show that the repeatability accuracy of the improved slug calorimeter is within 3%. The deviation of measurement values between different slug calorimeters is less than 3% in the same flow field, and the deviation between the slug calorimeter and a Gordon gage is less than 4% in the same flow field.
Keywords: correction method, heat flux calculation, heat insulation structure, heat transfer model, slug calorimeter
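For an ideal slug, the energy-conservation basis mentioned above reduces to q = (m c / A) dT/dt, with the temperature rise rate taken from the early, near-linear part of the trace. A minimal sketch (the slug properties and temperature trace below are assumed illustration values, not the paper's calibration data):

```python
# Sketch of the ideal slug-calorimeter energy balance:
# absorbed heat flux q = (m * c / A) * dT/dt, with dT/dt estimated as
# the least-squares slope of the temperature trace. All numbers below
# are assumed illustration values, not the paper's data.

def heat_flux(mass_kg, c_J_per_kgK, area_m2, times_s, temps_K):
    """Least-squares slope of T(t) converted to heat flux in W/m^2."""
    n = len(times_s)
    mt = sum(times_s) / n
    mT = sum(temps_K) / n
    slope = (sum((t - mt) * (T - mT) for t, T in zip(times_s, temps_K))
             / sum((t - mt) ** 2 for t in times_s))
    return mass_kg * c_J_per_kgK * slope / area_m2

# copper slug example: 10 g, c = 385 J/(kg K), 1 cm^2 sensing face
times = [0.0, 0.5, 1.0, 1.5, 2.0]
temps = [300.0, 313.0, 326.0, 339.0, 352.0]   # 26 K/s rise
q = heat_flux(0.010, 385.0, 1.0e-4, times, temps)
print(q)
```

Fitting the slope over an early window, before lateral losses flatten the trace, is the practical reason the paper recommends extracting the temperature rise rate as soon as possible.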
Procedia PDF Downloads 118
1736 The Life Skills Project: Client-Centered Approaches to Life Skills Acquisition for Homeless and At-Risk Populations
Authors: Leah Burton, Sara Cumming, Julianne DiSanto
Abstract:
Homelessness is a widespread and complex problem in Canada and around the globe. Many Canadians will face homelessness at least once in their lifetime, with several experiencing subsequent bouts or cyclical patterns of housing precarity. While a Housing First approach to homelessness is a long-standing and widely accepted best practice, it is also recognized that the acquisition of life skills is an effective way to reduce cycles of homelessness. Indeed, when individuals are provided with a range of life skills, such as (but not limited to) financial literacy, household management, interpersonal skills, critical thinking, and resource management, they are given the tools required to maintain housing over the long term, thus reducing the repetitive need for services. However, there is limited research regarding the best ways to teach life skills, a problem that has been further complicated in a post-pandemic world, where services are being delivered online or in a hybrid model of care. More than this, it is difficult to provide life skills training on a large scale without losing a client-centered approach to services. This lack of client-centeredness is also seen in the lack of attention to culturally sensitive life skills, which consider the diverse needs of individuals and embed equity, diversity, and inclusion (EDI) within the skills being taught. This study aims to fill these identified gaps in the literature by employing a community-engaged research (CER) approach. Academics, government, funders, front-line staff, and clients at 15 not-for-profits from across the Greater Toronto Area in Ontario, Canada, collaborated to co-create a virtual, client-centric, EDI-informed life skills learning management system. A triangulation methodology was utilized for this research.
An environmental scan was conducted for current best practices, and over 100 front-line staff (including workers, managers, and executive directors who work with homeless populations) participated in two separate Creative Problem Solving Sessions. Over 200 individuals with experience of homelessness completed quantitative and open-ended surveys. All sections of this research aimed to discover the skill areas that individuals need to maintain housing and to ascertain what a more client-driven, EDI-informed approach to life skills training should include. This presentation will showcase the findings on which life skills are deemed essential for homeless and precariously housed individuals.
Keywords: homelessness, housing first, life skills, community-engaged research, client-centered
Procedia PDF Downloads 101
1735 The Impact of Coronal STIR Imaging in Routine Lumbar MRI: Uncovering Hidden Causes to Enhance the Diagnostic Yield of Back Pain and Sciatica
Authors: Maysoon Nasser Samhan, Somaya Alkiswani, Abdullah Alzibdeh
Abstract:
Background: Routine lumbar MRI for back pain may yield normal results despite persistent symptoms, suggesting other causes of pain that are not shown on the routine images. Research suggests including coronal STIR imaging to detect additional pathologies such as sacroiliitis. Objectives: This study aims to enhance diagnostic accuracy and aid in determining treatment for patients with persistent back pain who have normal routine lumbar MRI (T1 and T2 images) by incorporating a coronal STIR sequence into the examination. Methods: In this prospective study, the medical records and imaging data of 274 patients (115 males and 159 females; age range 6–92 years) were reviewed following lumbar spine MRI. The study included patients with back pain and sciatica as their primary complaints, all of whom underwent lumbar spine MRI at our hospital to identify potential pathologies. Using a GE Signa HD 1.5T MRI system, each patient received a standard MRI protocol that included T1 and T2 sagittal and axial sequences, as well as a coronal STIR sequence. We collected relevant MRI findings, including abnormalities and structural variations, from radiology reports. We tabulated these findings and documented them as counts and percentages, using Fisher's exact test to assess differences between categorical variables. Statistical analysis was conducted using GraphPad Prism software version 10.1.2. The study adhered to ethical guidelines, institutional review board approvals, and patient confidentiality regulations. Results: Excluding the coronal STIR sequence, 83 subjects (30.29%) would have been classified as within normal limits on MRI examination. Thirty-six patients without abnormalities on T1 and T2 sequences showed abnormalities on the coronal STIR sequence, with 26 cases attributed to spinal pathologies and 10 to non-spinal pathologies.
In addition, Fisher's exact test demonstrated a significant association between sacroiliitis diagnosis and abnormalities identified solely through the coronal STIR sequence (P < 0.0001). Conclusion: Implementing coronal STIR imaging as part of routine lumbar MRI protocols has the potential to improve patient care by facilitating a more comprehensive evaluation and management of persistent back pain.
Keywords: magnetic resonance imaging, lumbar MRI, radiology, neurology
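The association reported above rests on Fisher's exact test for a 2x2 contingency table. As a minimal sketch of how such a test works, the function below computes the two-sided p-value directly from the hypergeometric distribution in pure Python; the example counts (sacroiliitis vs. STIR-only abnormality) are hypothetical, not the study's data.

```python
from math import comb

def fisher_exact_two_sided(table):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]].

    The p-value is the sum of hypergeometric probabilities of all tables
    with the same margins that are no more likely than the observed one.
    """
    (a, b), (c, d) = table
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def p_table(x):  # probability of a table with x in the top-left cell
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo = max(0, col1 - row2)
    hi = min(col1, row1)
    # small tolerance guards against floating-point comparison errors
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts (not from the study): rows = sacroiliitis yes/no,
# columns = abnormality seen only on STIR yes/no
p = fisher_exact_two_sided([[20, 16], [5, 233]])
print(f"P = {p:.3g}")
```

A table this skewed yields a p-value far below 0.0001, which is the kind of result the abstract reports.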
Procedia PDF Downloads 14
1734 Effect of Phenolic Acids on Human Saliva: Evaluation by Diffusion and Precipitation Assays on Cellulose Membranes
Authors: E. Obreque-Slier, F. Orellana-Rodríguez, R. López-Solís
Abstract:
Phenolic compounds are secondary metabolites present in some foods, such as wine. Polyphenols comprise two main groups: flavonoids (anthocyanins, flavanols, and flavonols) and non-flavonoids (stilbenes and phenolic acids). Phenolic acids are low-molecular-weight non-flavonoid compounds that are usually grouped into benzoic acids (gallic, vanillic, and protocatechuic acids) and cinnamic acids (ferulic, p-coumaric, and caffeic acids). Likewise, tannic acid is an important polyphenol constituted mainly of gallic acid. Phenolic compounds are responsible for important properties in foods and drinks, such as color, aroma, bitterness, and astringency. Astringency is a drying, roughing, and sometimes puckering sensation that is experienced on the various oral surfaces during or immediately after tasting foods. Astringency perception has been associated with interactions between flavanols present in some foods and salivary proteins. Despite the quantitative relevance of phenolic acids in foods and beverages, there is no information about their effect on salivary proteins and, consequently, on the sensation of astringency. The objective of this study was to assess the interaction of several phenolic acids (gallic, vanillic, protocatechuic, ferulic, p-coumaric, and caffeic acids) with saliva. Tannic acid was used as a control. Solutions of each phenolic acid (5 mg/mL) were mixed with human saliva (1:1 v/v). After incubation for 5 min at room temperature, 15-μL aliquots of the mixtures were dotted on a cellulose membrane and allowed to diffuse. The dry membrane was fixed in 50 g/L trichloroacetic acid, rinsed in 800 mL/L ethanol, stained for protein with Coomassie blue for 20 min, destained with several rinses of 73 g/L acetic acid, and dried under a heat lamp. Both the diffusion area and the stain intensity of the protein spots served as semiquantitative estimates of protein-tannin interaction (diffusion test).
The remainder of each whole saliva-phenolic acid mixture from the diffusion assay was centrifuged, and 15-μL aliquots of each supernatant were dotted on a cellulose membrane, allowed to diffuse, and processed for protein staining, as indicated above. In this latter assay, reduced protein staining was taken as indicative of protein precipitation (precipitation test). Diffusion of the salivary proteins was restricted by the presence of each phenolic acid (an anti-diffusive effect), while tannic acid did not alter diffusion of the salivary proteins. By contrast, the phenolic acids did not provoke precipitation of the salivary proteins, while tannic acid did. In addition, binary mixtures (mixtures of two components) of the various phenolic acids with gallic acid restricted diffusion of the saliva; a similar effect was observed for the corresponding individual phenolic acids. Conversely, binary mixtures of each phenolic acid with tannic acid, as well as tannic acid alone, did not affect diffusion of the saliva but provoked an evident precipitation. In summary, phenolic acids showed a relevant interaction with the salivary proteins, suggesting that these wine compounds can also contribute to the sensation of astringency.
Keywords: astringency, polyphenols, tannins, tannin-protein interaction
Procedia PDF Downloads 246
1733 Toxin-Producing Algae of Nigerian Coast, Gulf of Guinea
Authors: Medina O. Kadiri, Jeffrey U. Ogbebor
Abstract:
Toxin-producing algae are algal species that produce potent toxins, which accumulate in food chains and cause various gastrointestinal and neurological illnesses in humans and other animals. They cause shellfish toxicity, ecosystem alteration, fish kills, and mortality of other animals and humans, in addition to compromised product quality and decreased consumer confidence. Animals, including humans, are exposed to the toxins directly by absorbing them from the water while swimming, by drinking contaminated water, or by feeding on contaminated seafood. These algal toxins undergo bioaccumulation, biotransformation, biotransference, and biomagnification through natural food chains and food webs, thereby endangering animals and humans. The Nigerian coast is situated on the Atlantic Ocean in the Gulf of Guinea, one of Africa's five large marine ecosystems (LMEs), and studies on toxic algae in this ecosystem are generally lacking. Algal samples were collected from eight coastal states and ten locations spanning the Bight of Bonny and the Bight of Benin. A total of 70 species of toxin-producing algae were found in the coastal waters of Nigeria, representing a great variety. They included domoic acid-producing forms (ASP) and saxitoxin-, gonyautoxin-, and yessotoxin-producing forms (all PSP). Others were okadaic acid-, dinophysistoxin-, and palytoxin-producing forms, representatives of DSP; CFP was represented by ciguatoxin-producing forms and NSP by brevetoxin-producing species. Emerging or new toxins comprised gymnodimine-, spirolide-, palytoxin-, and prorocentrolide-producing algae. Cyanotoxin poisoning (CTP) was represented by anatoxin-, microcystin-, cylindrospermopsin-, lyngbyatoxin-, nodularin-, aplysiatoxin-, and debromoaplysiatoxin-producing species.
The largest group was the saxitoxin-producing species, followed by microcystin-producing species and then anatoxin-producing species. Gonyautoxin- (PSP), palytoxin- (DSP), emerging-toxin-, and cylindrospermopsin-producing species also had a very substantial representation. Only ciguatoxin-, lyngbyatoxin-, nodularin-, aplysiatoxin-, and debromoaplysiatoxin-producing species were represented by one taxon each. The presence of such an overwhelming diversity of toxin-producing algae on the Nigerian coast is a source of concern for fisheries, aquaculture, human health, and ecosystem services. Therefore, routine monitoring of toxic and harmful algae is strongly recommended.
Keywords: algal syndromes, Atlantic Ocean, harmful algae, Nigeria
Procedia PDF Downloads 207
1732 Reducing the Computational Cost of a Two-Way Coupling CFD-FEA Model via a Multi-Scale Approach for Fire Determination
Authors: Daniel Martin Fellows, Sean P. Walton, Jennifer Thompson, Oubay Hassan, Kevin Tinkham, Ella Quigley
Abstract:
Structural integrity is a key performance parameter for cladding products, especially concerning fire performance. Cladding products such as PIR-based sandwich panels are tested rigorously, in line with industrial standards. Physical fire tests are necessary to ensure customer safety but give little information about the critical behaviours that can help develop new materials. Numerical modelling is a tool that can help investigate a fire's behaviour further by replicating the fire test. However, fire is an interdisciplinary problem: it is a chemical reaction that behaves fluidly and impacts structural integrity. An analysis using both Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) is therefore needed to capture all aspects of a fire performance test. One method is a two-way coupling analysis that imports the thermal changes due to the fire's behaviour into the FEA solver in a series of iterations. In our recent work with Tata Steel UK using a two-way coupling methodology to determine fire performance, it was shown that a program called FDS-2-Abaqus can predict a BS 476-22 furnace test with a degree of accuracy. The test demonstrated the fire performance of Tata Steel UK's Trisomet product, a polyisocyanurate (PIR) based sandwich panel used for cladding. Previous work demonstrated the limitations of the current version of the program, the main one being the computational cost of modelling three Trisomet panels, totalling an area of 9 m². The computational cost increases substantially with the intention to scale up to an LPS 1181-1 test, which includes a total panel surface area of 200 m². The FDS-2-Abaqus program is developed further within this paper to overcome this obstacle and better accommodate Tata Steel UK PIR sandwich panels. The new developments aim to reduce the computational cost and the error margin compared to experimental data.
One avenue explored is a multi-scale approach in the form of Reduced Order Modelling (ROM). The approach allows the user to include refined details of the sandwich panels, such as the overlapping joints, without a computationally costly mesh size. Comparative studies will be made between the new implementations and the previous study completed using the original FDS-2-Abaqus program. Validation of the study will come from physical experiments in line with governing-body standards such as BS 476-22 and LPS 1181-1. The physical experimental data include the panels' gas and surface temperatures and mechanical deformation. Conclusions are drawn, noting the impact of the new implementations and discussing the feasibility of scaling up further to a whole warehouse.
Keywords: fire testing, numerical coupling, sandwich panels, thermofluids
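Reduced Order Modelling of the kind mentioned above is often built on Proper Orthogonal Decomposition (POD): a matrix of full-field simulation snapshots is factored by SVD and only the dominant modes are kept, so the refined model can be evaluated cheaply. The sketch below is illustrative only, using a synthetic temperature field rather than the FDS-2-Abaqus workflow; all field shapes and amplitudes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "snapshot" matrix: each column is a temperature field sampled at
# 500 mesh points at one time step of a transient simulation (hypothetical).
n_points, n_snapshots = 500, 40
x = np.linspace(0.0, 1.0, n_points)[:, None]
t = np.linspace(0.0, 1.0, n_snapshots)[None, :]
snapshots = (np.sin(np.pi * x) * t                     # slow heating mode
             + 0.1 * np.sin(3 * np.pi * x) * t**2      # weaker higher mode
             + 0.001 * rng.standard_normal((n_points, n_snapshots)))  # noise

# Proper Orthogonal Decomposition: thin SVD of the snapshot matrix
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)

# Keep only r dominant modes -> reduced-order representation
r = 2
reduced = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

rel_error = np.linalg.norm(snapshots - reduced) / np.linalg.norm(snapshots)
print(f"relative reconstruction error with {r} modes: {rel_error:.2e}")
```

With two modes the reconstruction error collapses to the noise level, which is the mechanism that lets a ROM trade mesh resolution for a handful of coefficients.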
Procedia PDF Downloads 79
1731 Sentiment Analysis on University Students’ Evaluation of Teaching and Their Emotional Engagement
Authors: Elisa Santana-Monagas, Juan L. Núñez, Jaime León, Samuel Falcón, Celia Fernández, Rocío P. Solís
Abstract:
Teaching practices have been widely studied in relation to students' outcomes, positioning themselves as one of their strongest catalysts and influencing students' emotional experiences. In the higher education context, teachers become even more crucial, as many students ground their decisions on which courses to enroll in on the opinions and ratings of teachers from other students. Unfortunately, universities sometimes do not provide the personal, social, and academic stimulation students need to be actively engaged. To evaluate their teachers, universities often rely on students' evaluations of teaching (SET) collected via Likert-scale surveys. Despite its usefulness, this method has been questioned in terms of validity and reliability. Alternatively, researchers can rely on qualitative answers to open-ended questions. However, the unstructured nature of the answers and the large amount of information obtained require an overwhelming amount of work. The present work presents an alternative approach to analysing such data: sentiment analysis (SA). To the best of our knowledge, no research before has included results from SA in an explanatory model to test how students' sentiments affect their emotional engagement in class. The sample of the present study included a total of 225 university students (mean age = 26.16, SD = 7.4, 78.7% women) from the Educational Sciences faculty of a public university in Spain. Data collection took place during the academic year 2021-2022. Students accessed an online questionnaire using a QR code. They were asked to answer the following open-ended question: "If you had to explain to a peer who doesn't know your teacher how he or she communicates in class, what would you tell them?". Sentiment analysis was performed using Microsoft's pre-trained model. The reliability of the measure was estimated between the tool and one of the researchers, who coded all answers independently.
Cohen's kappa and the average pairwise percent agreement were estimated with ReCal2. Cohen's kappa was .68, and the agreement reached 90.8%, both considered satisfactory. To test the hypothesized relations between SA and students' emotional engagement, a structural equation model (SEM) was estimated. Results demonstrated a good fit to the data: RMSEA = .04, SRMR = .03, TLI = .99, CFI = .99. Specifically, the results showed that students' sentiment regarding their teachers' teaching positively predicted their emotional engagement (β = .16, 95% CI [.02, .30]). In other words, when students' opinion of their instructors' teaching practices is positive, students are more likely to engage emotionally in the subject. Altogether, the results show a promising future for sentiment analysis techniques in the field of education and suggest the usefulness of this tool when evaluating relations between teaching practices and student outcomes.
Keywords: sentiment analysis, students' evaluation of teaching, structural equation modelling, emotional engagement
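The inter-rater reliability figures above (Cohen's kappa and percent agreement) have a simple closed form. As a hedged sketch of the computation, not the study's ReCal2 output, the function below derives kappa from two raters' labels; the example codings are hypothetical.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # observed agreement: fraction of items both raters labelled identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement: expected overlap given each rater's label frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(counts_a) | set(counts_b)
    expected = sum(counts_a[lab] * counts_b[lab] for lab in labels) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical codings (not the study data): sentiment labels from the
# automatic tool vs. the human coder for ten open-ended answers
tool  = ["pos", "pos", "neg", "pos", "neu", "pos", "neg", "pos", "pos", "neu"]
human = ["pos", "pos", "neg", "neu", "neu", "pos", "neg", "pos", "pos", "pos"]
agreement = sum(t == h for t, h in zip(tool, human)) / len(tool)
print(f"agreement = {agreement:.0%}, kappa = {cohens_kappa(tool, human):.2f}")
```

Kappa sits below raw agreement because it discounts the matches two raters would produce by chance alone.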
Procedia PDF Downloads 85
1730 Analysis of Ozone Episodes in the Forest and Vegetation Areas Using the HYSPLIT Model: A Case Study of the North-West Side of Biga Peninsula, Turkey
Authors: Deniz Sari, Selahattin İncecik, Nesimi Ozkurt
Abstract:
Surface ozone, named as one of the most critical pollutants of the 21st century, threatens human health, forests, and vegetation. In rural areas specifically, surface ozone significantly affects agricultural production and trees. In this study, in order to understand surface ozone levels in rural areas, we focus on the north-western side of the Biga Peninsula, which is covered by mountainous and forested terrain. Ozone concentrations were measured for the first time with passive sampling at 10 sites and two online monitoring stations in this rural area between 2013 and 2015. Using the daytime hourly O3 measurements during light hours (08:00–20:00), the AOT40 (Accumulated hourly O3 concentrations Over a Threshold of 40 ppb) cumulative index was calculated over three months (May, June, and July) for agricultural crops and over six months (April to September) for forest trees. AOT40 is defined by EU Directive 2008/50/EC to evaluate whether ozone pollution is a risk to vegetation and is calculated using hourly ozone concentrations from monitoring systems. In the present study, we performed trajectory analysis with the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model to follow the long-range transport sources contributing to the high ozone levels in the region. The ozone episodes observed between 2013 and 2015 were analysed using the HYSPLIT model developed by NOAA-ARL. In addition, cluster analysis was used to identify homogeneous groups of air mass transport patterns by grouping similar trajectories in terms of air mass movement. Backward trajectories produced for the three years by the HYSPLIT model were assigned to different clusters according to their moving speed and direction using a k-means clustering algorithm. According to the cluster analysis results, northerly flows to the study area cause the high ozone levels in the region.
The results show that the ozone values in the study area are above the critical levels for forest and vegetation based on EU Directive 2008/50/EC.
Keywords: AOT40, Biga Peninsula, HYSPLIT, surface ozone
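The AOT40 index described above is a simple cumulative sum: for each daylight hour whose ozone concentration exceeds 40 ppb, the excess over 40 ppb is accumulated. A minimal sketch, with a hypothetical one-day record rather than the study's measurements:

```python
def aot40(hourly_ppb, start_hour=8, end_hour=20, threshold=40.0):
    """AOT40 in ppb·h: sum of hourly ozone excesses above the 40 ppb
    threshold, counted only during light hours (08:00-20:00).

    `hourly_ppb` is a sequence of (hour_of_day, concentration_ppb) pairs;
    in practice the sum runs over the whole crop or forest season.
    """
    return sum(c - threshold
               for hour, c in hourly_ppb
               if start_hour <= hour < end_hour and c > threshold)

# Hypothetical one-day record (hour, ppb) -- not measurements from the study.
# 06:00 falls outside light hours; hours at or below 40 ppb contribute nothing.
day = [(6, 30), (8, 45), (10, 55), (12, 70), (14, 65), (16, 50), (20, 60)]
print(aot40(day), "ppb·h")
```

Only the excesses 5 + 15 + 30 + 25 + 10 from the 08:00–20:00 window count here; summed over a season, the total is compared against the directive's critical levels.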
Procedia PDF Downloads 255
1729 Physiological Indicators and Stress Index of Scavenging Chickens at Lafarge and Dangote Cement Factory Areas of Ogun State
Authors: Oluwadele Joshua Femi, Akinlabi Ebenezer Yemi, Onaopemipo Adeitan, Kazeem Bello, Anthony Ekeocha, Miraim Tawose
Abstract:
This study was carried out to determine the physiological indicators and stress index of scavenging chickens in the LAFARGE (Ewekoro) and Dangote (Ibese) cement factory areas of Ogun State. One hundred adult scavenging chickens were used, comprising 25 chickens each from LAFARGE, Dangote, and their respective adjoining communities (Imasayi and Wasimi). Experimental birds were caught at night on their perch and kept in cages till the next morning. Data were collected on the rectal temperature, pulse rate, and respiratory rate of the birds. Also, 5 ml of blood was collected through the wing vein of the chickens in each location using a sterilized needle and syringe and transported to the laboratory for analysis. The significantly (P<0.05) highest pulse rate (215.64 beats/minute) and respiratory rate (19.90 breaths/minute) were recorded among scavenging chickens in the LAFARGE (Ewekoro) area, and the lowest (198.61 beats/minute and 16.93 breaths/minute, respectively) at Imasayi. There was no significant (P>0.05) difference in the rectal temperature of the birds in the study area. Significant (P<0.05) differences were also recorded in the packed cell volume (PCV), haemoglobin (Hb), white blood cell (WBC), monocyte, and glucose levels of the chickens in the study area, with the highest (P<0.05) packed cell volume (28.06%) and haemoglobin (4.01 g/dl) recorded in Ibese and the lowest packed cell volume (22.00%) and haemoglobin (2.88 g/dl) in Imasayi. The highest (P<0.05) monocyte (4.28%) and glucose (256.53 mg/dl) values were recorded among scavenging chickens at Dangote (Ibese), while the lowest monocyte (0.00%) and glucose (194.53 mg/dl) values were recorded among chickens at Wasimi. The highest (P<0.05) white blood cell count (6488.89×10³/µl) was recorded among chickens at Ewekoro and the lowest at Ibese (4388.44×10³/µl). There was no significant (P>0.05) difference in the heterophil, lymphocyte, and basophil counts and the heterophil/lymphocyte ratio of the chickens in the study area.
The study concluded that chickens reared at LAFARGE (Ewekoro) were stressed and had compromised welfare and health status compared to the Dangote (Ibese) cement area and the other agrarian communities. An effective environmental mitigation programme should be put in place to enhance the welfare of the scavenging chickens in the LAFARGE cement factory area.
Keywords: blood, chicken, poisonous substances, packed cell volume, communities
Procedia PDF Downloads 86
1728 The Use of Emerging Technologies in Higher Education Institutions: A Case of Nelson Mandela University, South Africa
Authors: Ayanda P. Deliwe, Storm B. Watson
Abstract:
The COVID-19 pandemic has disrupted the established practices of higher education institutions (HEIs). Most higher education institutions worldwide had to shift from traditional face-to-face to online learning. The online environment and new online tools are disrupting the way in which higher education is presented. Furthermore, the structures of higher education institutions have been impacted by rapid advancements in information and communication technologies. Emerging technologies should not be viewed in a negative light because, as opposed to the traditional curriculum that worked to create productive and efficient researchers, emerging technologies encourage creativity and innovation. Therefore, using technology together with traditional means will enhance teaching and learning. Emerging technologies in higher education not only change the experience of students, lecturers, and the content, but also influence the attraction and retention of students. Higher education institutions are under immense pressure because they are not only competing locally and nationally; emerging technologies also expand the competition internationally. Emerging technologies have eliminated border barriers, allowing students to study in the country of their choice regardless of where they are in the world. The lecture room itself is changing as technology finds its way in day by day. Academics need to utilise the technology at their disposal if they want to get through to their students, as they are now competing for students' attention with social media platforms such as WhatsApp, Snapchat, Instagram, Facebook, and TikTok. This poses a significant challenge to higher education institutions. It is therefore critical to pay attention to emerging technologies to see how they can be incorporated into the classroom to improve educational quality while remaining relevant to the world of work.
This study aims to understand how emerging technologies have been utilised at Nelson Mandela University in presenting teaching and learning activities since April 2020. The primary objective is to analyse how academics are incorporating emerging technologies into their teaching and learning activities. This objective was pursued by conducting a literature review clarifying and conceptualising the emerging technologies being utilised by higher education institutions and reviewing and analysing their use; it will be investigated further through an empirical analysis of the use of emerging technologies at Nelson Mandela University. Findings from the literature review revealed that emerging technology is impacting several key areas in higher education institutions, such as the attraction and retention of students, the enhancement of teaching and learning, increased global competition, the elimination of border barriers, and the highlighting of the digital divide. The literature review further identified that learning management systems, open educational resources, learning analytics, and artificial intelligence are the most prevalent emerging technologies being used in higher education institutions. These identified emerging technologies will be analysed empirically to identify how they are being utilised at Nelson Mandela University.
Keywords: artificial intelligence, emerging technologies, learning analytics, learning management systems, open educational resources
Procedia PDF Downloads 69
1727 Control of Belts for Classification of Geometric Figures by Artificial Vision
Authors: Juan Sebastian Huertas Piedrahita, Jaime Arturo Lopez Duque, Eduardo Luis Perez Londoño, Julián S. Rodríguez
Abstract:
Artificial vision, also called computer vision, is a branch of artificial intelligence that allows the obtaining, processing, and analysis of any type of information, especially information obtained through digital images. Artificial vision is currently used in manufacturing for quality control and production, as these processes can be realized through algorithms for counting, positioning, and recognition of objects measured by a single camera (or more). On the other hand, companies use assembly lines formed by conveyor systems with actuators on them to move pieces from one location to another during production. These devices must be programmed beforehand for good performance and must have a programmed logic routine. Nowadays, production, quality, and the rapid completion of the different stages and processes in the production chain of any product or service are the main targets of every industry. The principal goal of this project is to program a computer that recognizes geometric figures (circle, square, and triangle) through a camera, each figure with a different color, and to link it with a group of conveyor systems that organize the figures into cubicles, which also differ from one another by color. As this project is based on artificial vision, the methodology must be strict; it is detailed below: 1. Methodology: 1.1 The software used in this project is Qt Creator, linked with the OpenCV libraries. Together, these tools are used to build the program that identifies colors and shapes directly from the camera. 1.2 Image acquisition: to start using the OpenCV libraries, it is necessary to acquire images, which can be captured by a computer's web camera or a separate specialized camera.
1.3 The recognition of RGB colors is realized in code by traversing the matrices of the captured images and comparing pixels, identifying the primary colors red, green, and blue. 1.4 To detect shapes it is necessary to segment the images: the first step is converting the image from RGB to grayscale to work with the dark tones of the image; then the image is binarized, leaving the figure in white on a black background. Finally, the contours of the figure are found and the number of edges is counted to identify which figure it is. 1.5 After the color and figure have been identified, the program links with the conveyor systems, which, through the actuators, classify the figures into their respective cubicles. Conclusions: The OpenCV library is a useful tool for projects in which an interface between a computer and the environment is required, since the camera captures external characteristics for any subsequent process. With the program developed in this project, any type of assembly line can be optimized, because images from the environment can be obtained and the process becomes more accurate.
Keywords: artificial intelligence, artificial vision, binarized, grayscale, images, RGB
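Steps 1.3 and 1.4 can be sketched without a camera. The fragment below is an illustrative NumPy-only stand-in, not the authors' Qt/OpenCV program: it converts a synthetic RGB frame to grayscale, binarizes it, identifies the dominant primary color by comparing channel means, and, in place of the contour edge count, classifies the figure by how much of its bounding box it fills (an assumed simplification that still separates square, circle, and triangle).

```python
import numpy as np

def dominant_color(rgb_img, mask):
    """Identify the figure's primary color (step 1.3) by comparing the
    mean R, G, B intensities over the binarized figure pixels."""
    means = rgb_img[mask].mean(axis=0)
    return ["red", "green", "blue"][int(np.argmax(means))]

def classify_shape(mask):
    """Crude stand-in for the contour step (1.4): the fraction of the
    bounding box covered by the figure separates square (~1.0),
    circle (~pi/4 ~ 0.785), and triangle (~0.5)."""
    ys, xs = np.nonzero(mask)
    box_area = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    fill = mask.sum() / box_area
    if fill > 0.9:
        return "square"
    if fill > 0.65:
        return "circle"
    return "triangle"

# Synthetic 100x100 RGB frame with a red square (hypothetical camera capture)
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[30:70, 30:70, 0] = 200          # red channel only

gray = img.mean(axis=2)             # RGB -> grayscale
mask = gray > 50                    # binarize: figure pixels become True
print(dominant_color(img, mask), classify_shape(mask))
```

A real implementation would replace `classify_shape` with OpenCV contour extraction and polygon approximation, but the pipeline order (color check, grayscale, binarize, shape test) matches the methodology above.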
Procedia PDF Downloads 379
1726 The Importance of Efficient and Sustainable Water Resources Management and the Role of Artificial Intelligence in Preventing Forced Migration
Authors: Fateme Aysin Anka, Farzad Kiani
Abstract:
Forced migration is a situation in which people are forced to leave their homes against their will due to political conflicts, wars, natural disasters, climate change, economic crises, or other emergencies. This type of migration takes place under conditions in which people cannot lead a sustainable life for reasons of security, shelter, or meeting their basic needs, and it may occur in connection with different factors that affect people's living conditions. In addition to these general and widespread causes, water security and water resources are a driver that is emerging now and will be encountered more and more in the future. Forced migration may occur when the water resources in the areas where people live are insufficient or depleted. In this case, people's living conditions become unsustainable, and they may have to go elsewhere, as they cannot obtain basic needs such as drinking water and the water used for agriculture and industry. To cope with these situations, it is important to minimize the causes, and international organizations and societies must provide assistance (for example, humanitarian aid, shelter, medical support, and education) and protection to address or mitigate this problem. From the international perspective, plans such as the Green New Deal (GND) and the European Green Deal (EGD) draw attention to the need for people to live equally in a cleaner and greener world. Recently, with the advancement of technology, scientific methods have become more efficient. In this regard, this article presents a multidisciplinary case model that addresses the water problem with an engineering approach within the framework of its social dimension. It is worth emphasizing that this problem is largely linked to climate change and the lack of a sustainable water management perspective.
As a matter of fact, the United Nations draws attention to this problem in its universally accepted Sustainable Development Goals. Therefore, an artificial intelligence-based approach has been applied to solve this problem by focusing on water management. The most general but also most important aspect of the management of water resources is the correct consumption of water. In this context, the artificial intelligence-based system undertakes tasks such as water demand forecasting and distribution management, emergency and crisis management, water pollution detection and prevention, and maintenance and repair control and forecasting.
Keywords: water resource management, forced migration, multidisciplinary studies, artificial intelligence
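Of the tasks listed above, water demand forecasting is the easiest to make concrete. The sketch below is a deliberately simple baseline, not the AI system the abstract describes: it fits a linear trend plus mean monthly seasonality to a hypothetical demand series (all figures invented for illustration) and extrapolates a future month.

```python
import numpy as np

# Hypothetical monthly water demand (thousand m^3) for a district, 3 years
demand = np.array([
    210, 205, 220, 235, 260, 300, 330, 325, 280, 245, 225, 215,   # year 1
    215, 210, 226, 242, 268, 309, 340, 334, 288, 252, 231, 221,   # year 2
    221, 216, 233, 249, 276, 318, 350, 344, 296, 259, 238, 227,   # year 3
], dtype=float)

months = np.arange(len(demand))
trend = np.polyfit(months, demand, 1)     # linear growth component
# seasonality: mean deviation of each calendar month from the trend line
season = np.array([(demand[m::12] - np.polyval(trend, months[m::12])).mean()
                   for m in range(12)])

def forecast(month_index):
    """Trend-plus-seasonal-average forecast for a future month index."""
    return np.polyval(trend, month_index) + season[month_index % 12]

# Forecast July (the peak month) of year 4
print(f"forecast: {forecast(36 + 6):.0f} thousand m^3")
```

In a deployed system this baseline would typically be replaced by a learned model fed with weather and network-sensor data, but it shows the demand-forecasting step in a few lines.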
Procedia PDF Downloads 87
1725 Effects of Bleaching Procedures on Dentine Sensitivity
Authors: Suhayla Reda Al-Banai
Abstract:
Problem Statement: Tooth whitening has been used for over one hundred and fifty years. The question of the whiteness of teeth is a complex one, since tooth whiteness varies from individual to individual, dependent on age, culture, etc. Tooth whiteness following treatment may depend on the type of whitening system used. There are a few side-effects to the process, including tooth sensitivity and gingival irritation, although some individuals experience no pain or sensitivity following the procedure. Purpose: To systematically review the available published literature until 31st December 2021, to identify all relevant studies for inclusion, and to determine whether there is any evidence demonstrating that the application of whitening procedures results in tooth sensitivity. Aim: To systematically review the available published literature to identify all relevant studies for inclusion and to determine any evidence demonstrating that application of 10% and 15% carbamide peroxide in tooth whitening procedures results in tooth sensitivity. Materials and Methods: Following a review of 70 relevant papers found by searching electronic databases (OVID MEDLINE and PubMed) and hand-searching relevant journals, 49 studies were identified, 42 papers were subsequently excluded, and 7 studies were finally accepted for inclusion. The extraction of data for inclusion was conducted by two reviewers. The main outcome measures were the methodology and assessment used by investigators to evaluate tooth sensitivity in tooth whitening studies. Results: The reported evaluation of tooth sensitivity during tooth whitening procedures was based on the subjective responses of subjects rather than a recognized evaluation methodology. One of the problems in this evaluation was the lack of homogeneity in study design. Seven studies were included.
The included studies had the essential features of randomized groups, placebo controls, and double-blind or single-blind designs. Drop-out data were available for two of the included studies. Three of the included studies reported sensitivity at the baseline visit. Two of the included studies stated their exclusion criteria. Conclusions: The results were inconclusive due to the limited number of included studies, the study methodologies, and the way dentine sensitivity (DS) was evaluated and reported. Tooth whitening procedures adversely affect both hard and soft tissues in the oral cavity. Side-effects are mild and transient in nature. Whitening solutions with more than 10% carbamide peroxide cause more tooth sensitivity. Studies using nightguard vital bleaching with 10% carbamide peroxide reported two side-effects, tooth sensitivity and gingival irritation, with tooth sensitivity more prevalent than gingival irritation.
Keywords: dentine, sensitivity, bleaching, carbamide peroxide
Procedia PDF Downloads 70
1724 Systematic Study of Structure-Property Relationship in Highly Crosslinked Elastomers
Authors: Natarajan Ramasamy, Gurulingamurthy Haralur, Ramesh Nivarthu, Nikhil Kumar Singha
Abstract:
Elastomers are polymeric materials with varied backbone architectures, ranging from linear to dendrimeric structures, and a wide variety of monomeric repeat units. These elastomers are strongly viscous and only weakly elastic when not cross-linked; once crosslinked, depending on the extent of crosslinking, their properties can range from highly flexible to highly stiff. Lightly cross-linked systems are well studied and reported. Understanding the nature of highly cross-linked rubbers in terms of chemical structure and architecture is critical for a variety of applications, and one of the critical parameters is cross-link density. In the current work, we have studied the highly cross-linked state of linear, lightly branched, and star-shaped branched elastomers and determined the cross-link density using different models. Changes in hardness, shifts in Tg, changes in modulus, and swelling behavior were measured experimentally as a function of the extent of curing, and these properties were analyzed using various models to determine cross-link density. We used hardness measurements to examine cure time, and the relationship between hardness and the extent of curing was determined. It is well known that micromechanical transitions such as Tg and storage modulus are related to the extent of crosslinking. The Tg of the elastomer in different crosslinked states was determined by DMA, and the crosslink density was estimated from the plateau modulus using Nielsen's model. For lightly crosslinked systems, the cross-link density is usually estimated from the equilibrium swelling ratio in a solvent using the Flory-Rehner model. For highly crosslinked systems, however, the Flory-Rehner model is not valid because of the short chain lengths between crosslinks. Models that treat the polymer as a non-Gaussian chain were therefore used to estimate crosslink density: 1) the Helmis-Heinrich-Straube (HHS) model, 2) the model of Gusler and Cohen, and 3) the model of Barr-Howell and Peppas.
In this work, correction factors for the existing models are determined, and on that basis the structure-property relationship of highly crosslinked elastomers is studied.
Keywords: dynamic mechanical analysis, glass transition temperature, parts per hundred grams of rubber, crosslink density, number of networks per unit volume of elastomer
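As a rough illustration of the swelling-based approach mentioned in this abstract, the classical Flory-Rehner relation can be evaluated numerically. The solvent parameters and swelling fraction below are arbitrary placeholder values chosen for the example, not data from the study:

```python
import math

def flory_rehner_crosslink_density(phi, chi, v1):
    """Estimate the effective crosslink density (mol/cm^3) from the
    Flory-Rehner equation, valid for lightly crosslinked networks.

    phi : polymer volume fraction at equilibrium swelling
    chi : Flory-Huggins polymer-solvent interaction parameter
    v1  : molar volume of the solvent (cm^3/mol)
    """
    numerator = -(math.log(1.0 - phi) + phi + chi * phi ** 2)
    denominator = v1 * (phi ** (1.0 / 3.0) - phi / 2.0)
    return numerator / denominator

# Illustrative values: a toluene-like solvent and a moderately swollen network
nu = flory_rehner_crosslink_density(phi=0.2, chi=0.4, v1=106.2)
print(f"crosslink density ~ {nu:.2e} mol/cm^3")
```

Note that for the highly crosslinked systems discussed here the authors argue this Gaussian-chain model breaks down, which is precisely why the non-Gaussian alternatives (HHS and the others) are invoked.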
Procedia PDF Downloads 165
1723 Virtual Reality in COVID-19 Stroke Rehabilitation: Preliminary Outcomes
Authors: Kasra Afsahi, Maryam Soheilifar, S. Hossein Hosseini
Abstract:
Background: There is growing evidence that cerebral vascular accident (CVA) can be a consequence of COVID-19 infection, so understanding novel treatment approaches is important in optimizing patient outcomes. Case: This case explores the use of virtual reality (VR) in the treatment of a 23-year-old COVID-positive female presenting with left hemiparesis in August 2020. Imaging showed ischemic stroke in the right globus pallidus, thalamus, and internal capsule. Conventional rehabilitation was started two weeks later, with VR included. The game-based VR technology used was developed for stroke patients and is based on upper extremity exercises and functions. Physical examination showed left hemiparesis with muscle strength 3/5 in the upper extremity and 4/5 in the lower extremity. The range of motion of the shoulder was 90-100 degrees. The speech exam showed a mild decrease in fluency, and mild dynamic asymmetry of the lower lip was seen. Babinski was positive on the left. Gait speed was decreased (75 steps per minute). Intervention: Our game-based VR system was developed from upper extremity physiotherapy exercises for post-stroke patients to increase active, voluntary movement of the upper extremity joints and improve function. The conventional program was initiated with active exercises: shoulder sanding for joint range of motion, shoulder walking, the shoulder wheel, and combined movements of the shoulder, elbow, and wrist joints, with alternating flexion-extension and pronation-supination movements, plus pegboard and Purdue pegboard exercises. Fine-movement work included smart gloves, biofeedback, a finger ladder, and writing. The difficulty of the game increased at each stage of practice as the patient's performance progressed. Outcome: After 6 weeks of treatment, gait and speech were normal, and upper extremity strength had improved to near-normal status. No adverse effects were noted.
Conclusion: This case suggests that VR is a useful tool in the treatment of a patient with COVID-19-related CVA. The safety of newly developed instruments for such cases offers new approaches to improving therapeutic outcomes and prognosis, as well as increasing satisfaction among patients.
Keywords: COVID-19, stroke, virtual reality, rehabilitation
Procedia PDF Downloads 141
1722 Na-Doped ZnO UV Filters with Reduced Photocatalytic Activity for Sunscreen Application
Authors: Rafid Mueen, Konstantin Konstantinov, Micheal Lerch, Zhenxiang Cheng
Abstract:
In the past two decades, concern about protecting the skin from ultraviolet (UV) radiation has attracted considerable attention due to the increased intensity of UV rays reaching the Earth's surface as a result of the breakdown of the ozone layer. Recently, UVA has also attracted attention since, in comparison to UVB, it penetrates deeply into the skin, which can result in significant health concerns. Sunscreen agents are one of the most significant tools for protecting the skin from UV irradiation, and they are either organic or inorganic. Developing inorganic UV blockers is essential because, unlike organic filters, they provide efficient UV protection over a wide spectrum. Furthermore, inorganic UV blockers offer good comfort and high safety when applied to human skin. Inorganic materials can absorb, reflect, or scatter ultraviolet radiation, depending on their particle size, unlike organic blockers, which only absorb UV irradiation. Nowadays, most inorganic UV-blocking filters are based on titanium dioxide (TiO2) and zinc oxide (ZnO). ZnO can provide protection in the UVA range. Indeed, ZnO is attractive for sunscreen formulation owing to many advantages: its modest refractive index (2.0), its absorption of only a small fraction of solar radiation in the UV range (wavelengths of 385 nm or less), the high recombination probability of its photogenerated carriers (electrons and holes), its large direct band gap and high exciton binding energy, its non-hazardous nature, and its strong chemical and physical stability, which make it transparent in the visible region while retaining UV-protective activity. A significant issue for the use of ZnO in sunscreens is that it can generate reactive oxygen species (ROS) in the presence of UV light because of its photocatalytic activity. It is therefore essential to render the material non-photocatalytic through modification with other metals. Several efforts have been made to deactivate the photocatalytic activity of ZnO by using inorganic surface modifiers.
Doping ZnO with different metals is another way to modify its photocatalytic activity. Recently, successful doping of ZnO with metals such as Ce, La, Co, Mn, Al, Li, Na, K, and Cr by various procedures, such as a simple and facile one-pot water bath, co-precipitation, hydrothermal, solvothermal, combustion, and sol-gel methods, has been reported. Most such materials outperform undoped ZnO by increasing its photocatalytic activity in visible light, showing that metal doping is an effective technique for modifying ZnO's photocatalytic behavior. In the current work, by contrast, we successfully reduced the photocatalytic activity of ZnO through Na doping, with samples fabricated via sol-gel and hydrothermal methods.
Keywords: photocatalytic, ROS, UVA, ZnO
Procedia PDF Downloads 144
1721 Rotterdam in Transition: A Design Case for a Low-Carbon Transport Node in Lombardijen
Authors: Halina Veloso e Zarate, Manuela Triggianese
Abstract:
The urban challenges posed by rapid population growth, climate adaptation, and sustainable living have compelled Dutch cities to reimagine their built environment and transportation systems. As a pivotal contributor to CO₂ emissions, the transportation sector in the Netherlands demands innovative solutions for transitioning to low-carbon mobility. This study investigates the potential of transit-oriented development (TOD) as a strategy for achieving carbon reduction and sustainable urban transformation. Focusing on the Lombardijen station area in Rotterdam, which is targeted for significant densification, this paper presents a design-oriented exploration of a low-carbon transport node. By employing a research-by-design methodology, this study delves into multifaceted factors and scales, aiming to propose future scenarios for Lombardijen. Drawing from a synthesis of existing literature, applied research, and practical insights, a robust design framework emerges. To inform this framework, governmental data concerning the built environment and material embodied carbon are harnessed. However, the restricted access to crucial datasets, such as property ownership information from the cadastre and embodied carbon data from De Nationale Milieudatabase, underscores the need for improved data accessibility, especially during the concept design phase. The findings of this research contribute fundamental insights not only to the Lombardijen case but also to TOD studies across Rotterdam's 13 nodes and similar global contexts. Spatial data related to property ownership facilitated the identification of potential densification sites, underscoring its importance for informed urban design decisions. Additionally, the paper highlights the disparity between the essential role of embodied carbon data in environmental assessments for building permits and its limited accessibility due to proprietary barriers.
Although this study lays the groundwork for sustainable urbanization through TOD-based design, it acknowledges an area of future research worthy of exploration: the socio-economic dimension. Given the complex socio-economic challenges inherent in the Lombardijen area, extending beyond spatial constraints, a comprehensive approach demands the integration of mobility infrastructure expansion, land-use diversification, programmatic enhancements, and climate adaptation. While the paper adopts a TOD lens, it refrains from an in-depth examination of issues concerning equity and inclusivity, opening doors for subsequent research to address these aspects crucial for holistic urban development.
Keywords: Rotterdam Zuid, transit-oriented development, carbon emissions, low-carbon design, cross-scale design, data-supported design
Procedia PDF Downloads 84
1720 Basics for Corruption Reduction and Fraud Prevention in Industrial/Humanitarian Organizations through Supplier Management in Supply Chain Systems
Authors: Ibrahim Burki
Abstract:
Unfortunately, all organizations (industrial and humanitarian/non-governmental organizations alike) are prone to fraud and corruption in their supply chain management routines, and the reputational and financial fallout can be disastrous. The growing number of companies using suppliers based in the local market has certainly increased the threat of fraud as well as corruption. The potential threats are various: poor or non-existent record keeping; purchasing of lower-quality goods at a higher price; excessive entertainment of staff by suppliers; irregular communications between procurement staff and suppliers, such as calls or text messages to mobile phones; staff demanding extended periods of notice before they allow an audit to take place; inexperienced buyers; and more. Despite all the threats mentioned above, this research paper emphasizes the effectiveness of well-maintained vendor records and the sorting/filtration of vendors in cutting down the possible threats of corruption and fraud. The exercise was applied in a humanitarian organization in Pakistan, but it is applicable to the whole South Asia region due to the similarity of culture and contexts. In that firm, there were more than 550 (five hundred and fifty) registered vendors. Because requirements during disasters or emergency phases are met on an urgent basis, golden opportunities arise for fake companies, or for sister companies of already registered companies, to become involved in the tendering process without declaration, or even under a different (new) company name. Therefore, a list of required documents (along with a checklist) was developed and sent to all vendors in the current database, and vendors were sorted based on receipt of the requested documents. These vendors were then divided into an active group (meeting the entire set criterion) and a non-active group.
This initial filtration stage allowed the firm to continue its work without a complete shutdown: only vendors falling into the active group were allowed to participate in tenders while the whole process was completed. Likewise, only companies or firms meeting the set criterion (the active category) will be allowed to register in the future, a dedicated filing system (soft and hard copies) will be maintained, and all companies/firms in the active group will be physically verified (visited) by a committee comprising senior members of at least the Finance, Supply Chain (other than procurement), and Security departments.
Keywords: corruption reduction, fraud prevention, supplier management, industrial/humanitarian organizations
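The document-based filtration described above reduces, in essence, to a checklist rule: a vendor belongs to the active group only if every required document has been received. A minimal sketch of such a sorting step follows; the document names and vendor records are invented for illustration and do not come from the study:

```python
# Hypothetical checklist of required documents (illustrative only)
REQUIRED_DOCS = {
    "registration_certificate",
    "tax_certificate",
    "bank_details",
    "physical_address_proof",
}

def sort_vendors(vendors):
    """Split vendor records into active / non-active groups depending on
    whether all required documents were received for each vendor."""
    active, non_active = [], []
    for name, docs in vendors.items():
        (active if REQUIRED_DOCS <= set(docs) else non_active).append(name)
    return sorted(active), sorted(non_active)

vendors = {
    "Alpha Traders": ["registration_certificate", "tax_certificate",
                      "bank_details", "physical_address_proof"],
    "Beta Supplies": ["registration_certificate", "bank_details"],  # incomplete file
}
active, non_active = sort_vendors(vendors)
print(active, non_active)
```

In practice such a rule would be complemented by the physical verification step the abstract describes, since documents alone can be forged.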
Procedia PDF Downloads 540
1719 Performance Assessment of the Gold Coast Desalination Plant Offshore Multiport Brine Diffuser during 'Hot Standby' Operation
Authors: M. J. Baum, B. Gibbes, A. Grinham, S. Albert, D. Gale, P. Fisher
Abstract:
Alongside the rapid expansion of seawater reverse osmosis technologies there is a concurrent increase in the production of hypersaline brine by-products. To minimize environmental impact, these by-products are commonly disposed of into open-coastal environments via submerged diffuser systems configured as inclined dense jet outfalls. Despite the widespread implementation of this approach, diffuser designs are typically based on small-scale laboratory experiments under idealized quiescent conditions, and studies of diffuser performance in the field are limited. A set of experiments was conducted to assess the near-field characteristics of brine disposal at the Gold Coast Desalination Plant offshore multiport diffuser. The aim of the field experiments was to determine the trajectory and dilution characteristics of the plume under various discharge configurations, with production ranging from 66% to 100% of plant operating capacity. The field monitoring system employed an unprecedented static array of temperature and electrical conductivity sensors in a three-dimensional grid surrounding a single diffuser port. Complementing these measurements, acoustic Doppler current profilers were also deployed to record current variability over the depth of the water column, as well as wave characteristics. The recorded data suggested the open-coastal environment was highly active over the experimental period, with ambient velocities ranging from 0.0 to 0.5 m·s⁻¹ and considerable variability observed over the depth of the water column. Variations in background electrical conductivity corresponding to salinity fluctuations of ±1.7 g·kg⁻¹ were also observed. Increases in salinity were detected during plant operation and appeared to be most pronounced 10-30 m from the diffuser, consistent with trajectory predictions in the existing literature. Plume trajectories and the corresponding dilutions extrapolated from the salinity data are compared with empirical scaling arguments.
Discharge properties were found to correlate adequately with the modelling projections. The temporal and spatial variation of background processes and their influence on discharge outcomes are discussed with a view to incorporating the influence of waves and ambient currents in the design of future brine outfalls.
Keywords: brine disposal, desalination, field study, negatively buoyant discharge
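The empirical scaling arguments referenced in this abstract are typically expressed through the port densimetric Froude number. The sketch below computes that number and an indicative near-field dilution, using a coefficient of order 1.6 as reported in the literature for 60° inclined dense jets; the port diameter, exit velocity, and densities are invented placeholders, not Gold Coast plant data:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def densimetric_froude(u0, d, rho_brine, rho_ambient):
    """Port densimetric Froude number Fr = u0 / sqrt(g' * d), where
    g' is the reduced gravity of the negatively buoyant discharge."""
    g_prime = G * (rho_brine - rho_ambient) / rho_ambient
    return u0 / math.sqrt(g_prime * d)

def return_point_dilution(fr, coeff=1.6):
    """Indicative minimum dilution where the dense jet returns to the
    seabed, using the empirical scaling S ~ coeff * Fr."""
    return coeff * fr

# Placeholder values: 0.2 m port, 3 m/s exit velocity, typical densities
fr = densimetric_froude(u0=3.0, d=0.2, rho_brine=1045.0, rho_ambient=1025.0)
print(f"Fr = {fr:.1f}, indicative near-field dilution ~ {return_point_dilution(fr):.0f}")
```

Scaling relations of this kind assume quiescent ambient conditions, which is exactly why the field measurements of currents and waves described above matter for real diffuser performance.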
Procedia PDF Downloads 239
1718 Technology for Good: Deploying Artificial Intelligence to Analyze Participant Response to Anti-Trafficking Education
Authors: Ray Bryant
Abstract:
3Strands Global Foundation (3SGF), a non-profit with a mission to mobilize communities to combat human trafficking through prevention education and reintegration programs, launched a groundbreaking study that demonstrates the usage and benefits of artificial intelligence in the war against human trafficking. Having gathered more than 30,000 stories from counselors and school staff who have gone through its PROTECT Prevention Education program, 3SGF sought to develop a methodology to measure the effectiveness of the training, which helps educators and school staff identify physical signs and behaviors indicating a student is being victimized. The program further illustrates how to recognize and respond to trauma, teaches the steps to take to report human trafficking, and shows how to connect victims with the proper professionals. 3SGF partnered with Levity, a leader in no-code artificial intelligence (AI) automation, to create the research study, utilizing natural language processing, a branch of artificial intelligence, to measure the effectiveness of the prevention education program. Applying the logic created for the study, the platform analyzed and categorized each story. If a story, taken directly from the educator, demonstrated one or more of the desired outcomes (Increased Awareness, Increased Knowledge, or Intended Behavior Change), the corresponding label was applied, and the system then added a confidence level for each identified label. The study results were generated with a 99% confidence level. Preliminary results show that, of the 30,000 stories gathered, a significant majority of the participants now have increased awareness of the issue, demonstrated better knowledge of how to help prevent the crime, and expressed an intention to change how they approach what they do daily.
In addition, approximately 30% of the stories involved comments from educators wishing they had had this knowledge sooner, as they could think of many students they would have been able to help. Objectives of research: to solve the problem of analyzing and accurately categorizing more than 30,000 data points of participant feedback in order to evaluate the success of a human trafficking prevention program, using AI and natural language processing. Methodologies used: in conjunction with our strategic partner, Levity, we created our own NLP analysis engine specific to our problem. Contributions to research: the intersection of AI and human rights, and how to utilize technology to combat human trafficking.
Keywords: AI, technology, human trafficking, prevention
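The labeling logic described above (apply one or more outcome labels to each story, each with a confidence score) can be sketched with a toy keyword-matching classifier. This is only a schematic stand-in for the no-code NLP platform actually used in the study; the keyword lists are invented for illustration:

```python
# Hypothetical keyword sets per outcome label (illustrative only)
LABEL_KEYWORDS = {
    "Increased Awareness": {"aware", "awareness", "recognize", "signs"},
    "Increased Knowledge": {"learned", "knowledge", "understand", "report"},
    "Intended Behavior Change": {"will", "change", "intend", "approach"},
}

def label_story(story):
    """Return {label: confidence} for every outcome label whose keywords
    appear in the story; confidence here is the fraction of keywords hit."""
    words = set(story.lower().split())
    labels = {}
    for label, keywords in LABEL_KEYWORDS.items():
        hits = len(words & keywords)
        if hits:
            labels[label] = hits / len(keywords)
    return labels

result = label_story(
    "I learned the warning signs and will change how I approach my students"
)
print(result)
```

A production NLP model would of course use learned representations rather than keyword lookup, but the input/output shape (story in, labeled outcomes with confidences out) is the same.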
Procedia PDF Downloads 59
1717 Exploring the Synergistic Effects of Aerobic Exercise and Cinnamon Extract on Metabolic Markers in Insulin-Resistant Rats through Advanced Machine Learning and Deep Learning Techniques
Authors: Masoomeh Alsadat Mirshafaei
Abstract:
The present study aims to explore the effect of an 8-week aerobic training regimen combined with cinnamon extract on serum irisin and leptin levels in insulin-resistant rats. Additionally, this research leverages various machine learning (ML) and deep learning (DL) algorithms to model the complex interdependencies between exercise, nutrition, and metabolic markers, offering a groundbreaking approach to obesity and diabetes research. Forty-eight Wistar rats were selected and randomly divided into four groups: control, training, cinnamon, and training cinnamon. The training protocol was conducted over 8 weeks, with sessions 5 days a week at 75-80% VO2 max. The cinnamon and training-cinnamon groups were injected with 200 ml/kg/day of cinnamon extract. Data analysis included serum data, dietary intake, exercise intensity, and metabolic response variables, with blood samples collected 72 hours after the final training session. The dataset was analyzed using one-way ANOVA (P<0.05) and fed into various ML and DL models, including Support Vector Machines (SVM), Random Forest (RF), and Convolutional Neural Networks (CNN). Traditional statistical methods indicated that aerobic training, with and without cinnamon extract, significantly increased serum irisin and decreased leptin levels. Among the algorithms, the CNN model provided superior performance in identifying specific interactions between cinnamon extract concentration and exercise intensity, optimizing the increase in irisin and the decrease in leptin. The CNN model achieved an accuracy of 92%, outperforming the SVM (85%) and RF (88%) models in predicting the optimal conditions for metabolic marker improvements. The study demonstrated that advanced ML and DL techniques could uncover nuanced relationships and potential cellular responses to exercise and dietary supplements, which is not evident through traditional methods. 
These findings advocate for the integration of advanced analytical techniques in nutritional science and exercise physiology, paving the way for personalized health interventions in managing obesity and diabetes.
Keywords: aerobic training, cinnamon extract, insulin resistance, irisin, leptin, convolutional neural networks, exercise physiology, support vector machines, random forest
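The traditional analysis in this study rests on a one-way ANOVA across the four groups (P<0.05), which boils down to an F statistic comparing between-group to within-group variance. A self-contained sketch of that computation, using made-up numbers rather than the study's serum data:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of sample groups:
    ratio of between-group to within-group mean squares."""
    all_values = [x for g in groups for x in g]
    n, k = len(all_values), len(groups)
    grand_mean = sum(all_values) / n
    # between-group sum of squares (each group weighted by its size)
    ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # within-group sum of squares (deviations from each group's own mean)
    ssw = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Invented marker values for three of the groups (e.g. control / training / cinnamon)
f_stat = one_way_anova_f([[1, 2, 3], [2, 3, 4], [5, 6, 7]])
print(f_stat)  # a large F suggests the group means differ
```

The F value is then compared against the F distribution with (k-1, n-k) degrees of freedom to obtain the P value; in practice one would call a library routine such as scipy.stats.f_oneway rather than hand-rolling this.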
Procedia PDF Downloads 39