Search results for: product evaluation
852 Term Creation in Specialized Fields: An Evaluation of Shona Phonetics and Phonology Terminology at Great Zimbabwe University
Authors: Peniah Mabaso-Shamano
Abstract:
The paper evaluates Shona terms that were created to teach Phonetics and Phonology courses at Great Zimbabwe University (GZU). The phonetics and phonology terms to be discussed in this paper were created using different processes and strategies such as translation, borrowing, neologising, compounding, transliteration and circumlocution, among many others. Most phonetics and phonology terms are alien to Shona and, as a result, there are no suitable Shona equivalents. The lecturers and students for these courses have the mammoth task of creating terminology for the different modules offered in Shona and other Zimbabwean indigenous languages. Most linguistic reference books are written in English. As such, lecturers and students translate information from English to Shona, a task which is proving very difficult for them. A term creation workshop was held at GZU to try to address the problem of the lack of terminology in indigenous languages. Different indigenous language practitioners from different tertiary institutions convened for a two-day workshop at GZU. Due to the 'specialized' nature of phonetics and phonology, it was very difficult to come up with 'proper' indigenous terms. The researcher will consult tertiary institution lecturers who teach linguistics courses, as well as linguistics students, to get their views on the created terms. The people consulted will not be the ones who took part in the term creation workshop held at GZU. The selected participants will be asked to evaluate and back-translate some of the terms. In instances where they feel the terms created are not suitable or user-friendly, they will be asked to suggest other terms. Since the researcher is also a linguistics lecturer, her observations and views will be important. From her experience in using some of the terms in teaching phonetics and phonology courses to undergraduate students, the researcher noted that most of the terms created have shortcomings since they are not user-friendly. These shortcomings include terms that are longer than the English originals, as some terms are translated into Shona as whole statements. Most of these terms are neologisms, compound neologisms, transliterations, circumlocutions, and blends. The paper will show that there is overuse of transliterated terms due to the lack of Shona equivalents for English terms. Most single English words were translated into compound neologisms or phrases after attempts to reduce them to one-word terms failed. In other instances, circumlocution led to the creation of terms longer than the original and, as a result, the terms are not user-friendly. The paper will discuss and evaluate the different phonetics and phonology terms created and the different strategies and processes used in creating them.
Keywords: blending, circumlocution, term creation, translation
851 Analysis of Road Risk in Four French Overseas Territories Compared with Metropolitan France
Authors: Mohamed Mouloud Haddak, Bouthayna Hayou
Abstract:
Road accidents in French overseas territories have been understudied, with relevant data often collected late and incompletely. Although these territories account for only 3% to 4% of road traffic injuries in France, their unique characteristics merit closer attention. Despite lower mobility and, consequently, lower exposure to road risks, the actual road risk in Overseas France is as high or even higher than in Metropolitan France. Significant disparities exist not only between Metropolitan France and the overseas territories but also among the overseas territories themselves. The varying population densities in these regions do not fully explain these differences, as each territory has its own distinct vulnerabilities and road safety challenges. This analysis, based on BAAC data files from 2005 to 2018 for both Metropolitan France and the overseas departments and regions, examines key variables such as gender, age, type of road user, type of obstacle hit, type of trip, road category, traffic conditions, weather, and location of accidents. Logistic regression models were built for each region to investigate the risk factors associated with fatal road accidents, focusing on the probability of being killed versus injured. Due to insufficient data, Mayotte and the Overseas Communities (French Polynesia and New Caledonia) were not included in the models. The findings reveal that road safety is worse in the overseas territories compared to Metropolitan France, particularly for vulnerable road users such as pedestrians and motorized two-wheelers. These territories present an accident profile that sits between that of Metropolitan France and middle-income countries. A pressing need exists to standardize accident data collection between Metropolitan and Overseas France to allow for more detailed comparative analyses. Further epidemiological studies could help identify the specific road safety issues unique to each territory, particularly with regard to socio-economic factors such as social cohesion, which may influence road safety outcomes. Moreover, the lack of data on new modes of travel, such as electric scooters, and the absence of socio-economic details of accident victims complicate the evaluation of emerging risk factors. Additional research, including sociological and psychosocial studies, is essential for understanding road users' behavior and perceptions of road risk, which could also provide valuable insights into accident trends in peri-urban areas in France.
Keywords: multivariate logistic regression, french overseas regions, road safety, road traffic accidents, territorial inequalities
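To illustrate the modelling step described above, the following minimal Python sketch fits a killed-versus-injured logistic regression; the file name, column names and factor list are illustrative assumptions, not the actual BAAC variables.

```python
# Minimal sketch of a per-region killed-vs-injured logistic model
# (hypothetical file and column names; the BAAC data are not reproduced here).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("casualties.csv")  # assumed: one row per casualty

# Binary outcome: killed (1) vs injured (0); categorical explanatory factors
model = smf.logit(
    "killed ~ C(gender) + C(age_group) + C(road_user_type) + "
    "C(obstacle_type) + C(road_category) + C(weather)",
    data=df,
).fit()

print(model.summary())
print(np.exp(model.params))  # odds ratios for each factor level
```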
850 Japan's New Security Outlook: Implications for the US-Japan Alliance
Authors: Agustin Maciel-Padilla
Abstract:
This paper explores the most significant change to Japan's security strategy since the end of World War II, in particular the publication by Prime Minister Fumio Kishida's government, in late 2022, of three policy documents (the National Security Strategy [NSS], the National Defense Strategy and the Defense Buildup Program) that propose to expand the country's military capabilities and to increase military spending over a 5-year period. These policies represent a remarkable transformation of the defense-oriented policy Japan has followed since 1946. The proposals have been under analysis and debate since they were announced, as has Japan's historic ambition to strengthen its deterrence capabilities in the context of a more complex regional security environment. Even though this new defense posture has attracted significant international attention, it is far from a done deal, as there is still a long way to go to implement this vision owing to a wide variety of political and economic issues. Japan is currently experiencing the most dangerous security environment since the end of World War II, and this situation has led Japan to intensify its dialogue with the United States to reflect a re-evaluation of deterrence in the face of a rapidly worsening security environment, a changing balance of power in East Asia, and the arrival of a new era of "great power competition". Japan's new documents, for instance, identify China and North Korea as posing, respectively, a strategic challenge and an imminent threat. Japan has also noted that Russia's invasion of Ukraine has contributed to eroding the foundation of the international order. It is considered that Russia's aggression was possible because Ukraine's defense capability was not enough for effective deterrence. Moreover, Japan's call for "counterstrike capabilities" results from a recognition that China's and North Korea's ballistic and cruise missiles could overwhelm Japan's air and missile defense systems, and therefore there is an urgent need to strengthen deterrence and resilience. In this context, this paper will focus on the impact of these changes on the US-Japan alliance. Adapting this alliance to Tokyo's new ambitions and capabilities could be critical in terms of updating their traditional protection/access-to-bases arrangement, interoperability, and joint command and control issues, as well as the security-economy nexus. While China is Japan's largest trading partner, and trade between the two has been growing, the US-Japan economic relationship has developed more slowly, even though US-Japan security cooperation has strengthened significantly in recent years.
Keywords: us-japan alliance, japan security, great power competition, interoperability
849 Preparation and Evaluation of Poly(Ethylene Glycol)-B-Poly(Caprolactone) Diblock Copolymers with Zwitterionic End Group for Thermo-Responsive Properties
Authors: Bo Keun Lee, Doo Yeon Kwon, Ji Hoon Park, Gun Hee Lee, Ji Hye Baek, Heung Jae Chun, Young Joo Koh, Moon Suk Kim
Abstract:
Thermo-responsive materials are viscoelastic materials that undergo a sol-to-gel phase transition at a specific temperature, and many such materials have been developed. MPEG-b-PCL (MPC), as a thermo-responsive material, contains hydrophilic and hydrophobic segments and forms an ordered crystalline structure of hydrophobic PCL segments in aqueous solutions. The ordered crystalline structure packs tightly or aggregates and finally induces an aggregated gel through intra- and inter-molecular interactions as a function of temperature. Thus, we introduced anionic and cationic groups into the end positions of the PCL chain to alter the hydrophobicity of the PCL segment. Introducing anionic and cationic groups into the PCL end position altered the copolymers' solubility by changing the crystallinity and hydrophobicity of the PCL block domains. These results indicated that the properties of the end group in the hydrophobic PCL block and the balance between hydrophobicity and hydrophilicity affect the thermo-responsive behavior of the copolymers in aqueous solutions. Thus, we concluded that the temperature-dependent thermo-responsive behavior of MPC depends on the ionic end group in the PCL block. We therefore introduced zwitterionic end groups to investigate the thermo-responsive behavior of MPC. Methoxypoly(ethylene oxide) and ε-caprolactone (CL) were copolymerized to introduce varying hydrophobic PCL lengths, and an MPC featuring a zwitterionic sulfobetaine (MPC-ZW) at the chain end of the PCL segment was prepared. The MPC and MPC-ZW copolymers were in the sol state at room temperature when prepared as 20-wt% aqueous solutions. The solubility of MPC decreased as the molecular weight of the PCL block increased. The solubilization time of MPC-2.4k was around 20 min, and those of MPC-2.8k and MPC-3.0k increased to 30 min and 1 h, respectively. MPC-3.6k was not solubilized. In the case of MPC-ZW 3.6k, however, the zwitterion-modified MPC copolymer was solubilized in 3–5 min. This result indicates that the zwitterionic end group of the MPC-ZW diblock copolymer increased the aqueous solubility of the diblock copolymer even when the length of the hydrophobic PCL segment was increased. MPC and MPC-ZW diblock copolymers featuring zwitterionic end groups were synthesized successfully. The sol-to-gel phase transition occurred at a specific temperature depending on the length of the PCL hydrophobic segments introduced and on the zwitterion groups attached to the MPC chain end. This result indicates that the zwitterionic end groups reduced the hydrophobicity of the PCL block and changed its solubilization behavior. The MPC-ZW diblock copolymer can be utilized as a potential injectable drug and cell carrier.
Keywords: thermo-responsive material, zwitterionic, hydrophobic, crystallization, phase transition
848 Exploration and Evaluation of the Effect of Multiple Countermeasures on Road Safety
Authors: Atheer Al-Nuaimi, Harry Evdorides
Abstract:
Every day many people die or are disabled or injured on roads around the world, which necessitates more specific treatments for transportation safety issues. The International Road Assessment Program (iRAP) model is one of the comprehensive road safety models which accounts for many factors that affect road safety in a cost-effective way in low- and middle-income countries. In the iRAP model, road safety is divided into five star ratings, from 1 star (the lowest level) to 5 stars (the highest level). These star ratings are based on a star rating score which is calculated by the iRAP methodology depending on road attributes, traffic volumes and operating speeds. The outcome of the iRAP methodology is a set of treatments that can be used to improve road safety and reduce the numbers of fatalities and serious injuries (FSI). These countermeasures can be used separately as a single countermeasure or combined as multiple countermeasures for a location. There is general agreement that the effectiveness of a countermeasure suffers consistent losses when it is used in combination with other countermeasures; that is, the crash reduction estimates of individual countermeasures cannot simply be added together. The iRAP model methodology therefore makes use of multiple countermeasure adjustment factors to predict reductions in the effectiveness of road safety countermeasures when more than one countermeasure is chosen. Multiple countermeasure correction factors are calculated for every 100-meter segment and for every accident type. However, the limitations of this methodology include a probable over-estimation of the predicted crash reduction. This study aims to adjust this correction factor by developing new models to calculate the effect of using multiple countermeasures on the number of fatalities for a location or an entire road. Regression models have been used to establish relationships between crash frequencies and the factors that affect their rates. Multiple linear regression, negative binomial regression, and Poisson regression techniques were used to develop models that can address the effectiveness of using multiple countermeasures. Analyses conducted using The R Project for Statistical Computing showed that a model developed with the negative binomial regression technique could give more reliable predictions of the number of fatalities after the implementation of multiple road safety countermeasures than the results from the iRAP model. The results also showed that the negative binomial regression approach gives more precise results than the multiple linear and Poisson regression techniques because of overdispersion and standard error issues.
Keywords: international road assessment program, negative binomial, road multiple countermeasures, road safety
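A minimal sketch of the kind of negative binomial model described above, written in Python with statsmodels; the segment file, column names and predictors are assumptions for illustration, not the iRAP or study data.

```python
# Illustrative negative binomial fit of fatality counts per 100 m segment,
# with the number of countermeasures as a predictor (hypothetical data layout).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

segments = pd.read_csv("segments.csv")  # assumed: one row per 100 m segment

# fatalities ~ number of countermeasures applied, traffic volume, speed
# (dispersion parameter left at the statsmodels default for brevity)
nb_model = smf.glm(
    "fatalities ~ n_countermeasures + aadt + operating_speed",
    data=segments,
    family=sm.families.NegativeBinomial(),
).fit()

print(nb_model.summary())
```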
847 Modelling Flood Events in Botswana (Palapye) for Protecting Roads Structure against Floods
Authors: Thabo M. Bafitlhile, Adewole Oladele
Abstract:
Botswana has long been affected by floods and is still experiencing these tragic events. Flooding occurs mostly in the North-West, North-East, and parts of the Central district due to the heavy rainfalls experienced in these areas. Torrential rains have destroyed homes and roads, flooded dams and fields, and destroyed livestock and livelihoods. Palapye is one area in the Central district that has been experiencing floods ever since 1995, when its greatest flood on record occurred. Heavy storms result in floods and inundation; this has been exacerbated by poor or absent drainage structures. Since floods are a part of nature, they have existed and will continue to exist, hence the potential for more destruction. Furthermore, the interaction of floods and highways plays a major role in the erosion and destruction of road structures. Already today, many culverts, trenches, and other drainage facilities lack the capacity to deal with the current frequency of extreme flows. Future changes in the pattern of hydro-climatic events will have implications for the design and maintenance costs of roads. Increases in rainfall and severe weather events can affect the demand for emergency responses. Therefore, flood forecasting and warning are prerequisites for successful mitigation of flood damage. In flood-prone areas like Palapye, preventive measures should be taken to reduce the possible adverse effects of floods on the environment, including road structures. This paper therefore attempts to estimate return periods associated with storms of different magnitudes from recorded historical rainfall depths using statistical methods. The method of annual maxima was used to select data sets for the rainfall analysis. In the statistical method, the Type 1 extreme value (Gumbel), Log Normal, and Log Pearson Type 3 distributions were all applied to the annual maximum series for the Palapye area to produce IDF curves. The Kolmogorov-Smirnov and Chi-squared tests were used to confirm the appropriateness of the fitted distributions for the location, and the data do fit the distributions used to predict expected frequencies. This will be a beneficial tool for urgent flood forecasting and water resource administration, as proper drainage will be designed based on the estimated flood events and will help to reclaim and protect road structures from the adverse impacts of floods.
Keywords: drainage, estimate, evaluation, floods, flood forecasting
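The statistical step outlined above can be illustrated with a short Python sketch: fitting a Gumbel (Type 1 extreme value) distribution to an annual maximum rainfall series, estimating design depths for chosen return periods, and checking the fit with the Kolmogorov-Smirnov test. The rainfall values below are hypothetical.

```python
# Fit a Gumbel distribution to annual maximum rainfall and estimate the depth
# associated with selected return periods (hypothetical series, in mm).
import numpy as np
from scipy import stats

annual_max_rain = np.array([62.0, 71.5, 55.3, 88.1, 94.0,
                            60.2, 77.8, 102.4, 68.9, 81.3])

loc, scale = stats.gumbel_r.fit(annual_max_rain)

for T in (2, 5, 10, 25, 50, 100):              # return periods in years
    p_non_exceedance = 1.0 - 1.0 / T
    depth = stats.gumbel_r.ppf(p_non_exceedance, loc, scale)
    print(f"T = {T:3d} yr -> design rainfall ~ {depth:.1f} mm")

# Goodness of fit, as in the Kolmogorov-Smirnov check mentioned above
ks_stat, p_value = stats.kstest(annual_max_rain, "gumbel_r", args=(loc, scale))
print("KS statistic:", round(ks_stat, 3), "p-value:", round(p_value, 3))
```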
846 The Application of Animal Welfare Certification System for Farm Animal in South Korea
Authors: Ahlyum Mun, Ji-Young Moon, Moon-Seok Yoon, Dong-Jin Baek, Doo-Seok Seo, Oun-Kyong Moon
Abstract:
There is growing public concern over the standards of farm animal welfare, alongside higher standards of food safety. In addition, the recent low incidence of Avian Influenza in laying hens among certified farms is receiving attention. In this study, we introduce the animal welfare systems covering the rearing, transport and slaughter of farm animals in South Korea. The concepts of animal welfare farm certification are based on ensuring the five freedoms of animals. Animal welfare is also achieved by monitoring the condition of the environment, including shelter and resting areas, feed and water, and the care of animal health. The certification of farm animal welfare is handled by the Animal Protection & Welfare Division of the Animal and Plant Quarantine Agency (APQA). Following the full amendment of the Animal Protection Law in 2011, the animal welfare farm certification program has been implemented since 2012. The certification system has expanded to cover laying hen, swine, broiler, beef cattle, dairy cow, goat and duck farms. Livestock farmers who want to be certified must apply for certification at the APQA. Upon receipt of the application, the APQA reviews the documents, notifies the applicant of the detailed schedule of the on-site examination, and conducts the on-site inspection according to the evaluation criteria of the welfare standard. If the on-site audit results meet the certification criteria, the APQA issues a certificate. The production process of certified farms is inspected at least once a year for follow-up management. As of 2017, a total of 145 farms have been certified (95 laying hen farms, 12 swine farms, 30 broiler farms and 8 dairy cow farms). In addition, animal welfare transportation vehicles and slaughterhouses have been designated since 2013, and currently 6 slaughterhouses have been certified. The Animal Protection Law has been amended so that animal welfare certification marks can be affixed only to livestock products produced by animal welfare farms, transported in animal welfare vehicles and slaughtered at animal welfare slaughterhouses. The whole process, covering rearing, transportation and slaughtering, completes the farm animal welfare system. The APQA established its second 5-year animal welfare plan (2014-2019), which includes setting a minimum standard of animal welfare applicable to all livestock farms, transportation vehicles and slaughterhouses. In accordance with this plan, we will promote the farm animal welfare policy in order to truly advance the Korean livestock industry.
Keywords: animal welfare, farm animal, certification system, South Korea
845 Integrating Data Mining within a Strategic Knowledge Management Framework: A Platform for Sustainable Competitive Advantage within the Australian Minerals and Metals Mining Sector
Authors: Sanaz Moayer, Fang Huang, Scott Gardner
Abstract:
In the highly leveraged business world of today, an organisation's success depends on how it can manage and organise its traditional and intangible assets. In the knowledge-based economy, knowledge as a valuable asset gives enduring capability to firms competing in rapidly shifting global markets. It can be argued that the ability to create unique knowledge assets by configuring ICT and human capabilities will be a defining factor for international competitive advantage in the mid-21st century. The concept of knowledge management (KM) is recognised in the strategy literature, and increasingly by senior decision-makers (particularly in large firms which can achieve scalable benefits), as an important vehicle for stimulating innovation and organisational performance in the knowledge economy. This thinking has been evident in professional services and other knowledge-intensive industries for over a decade. It highlights the importance of social capital and the value of the intellectual capital embedded in social and professional networks, complementing the traditional focus on the creation of intellectual property assets. Despite the growing interest in KM within professional services, there has been limited discussion in relation to multinational resource-based industries such as mining and petroleum, where the focus has been principally on global portfolio optimization with economies of scale, process efficiencies and cost reduction. The Australian minerals and metals mining industry, although traditionally viewed as capital intensive, employs a significant number of knowledge workers, notably engineers, geologists, highly skilled technicians, and legal, finance, accounting, ICT and contracts specialists working in projects or functions, representing potential knowledge silos within the organisation. This silo effect arguably inhibits knowledge sharing and retention by disaggregating corporate memory, with increased operational and project continuity risk. It may also limit the potential for process, product, and service innovation. In this paper the strategic application of knowledge management incorporating contemporary ICT platforms and data mining practices is explored as an important enabler for knowledge discovery, reduction of risk, and retention of corporate knowledge in resource-based industries. With reference to the relevant strategy, management, and information systems literature, this paper highlights possible connections (currently undergoing empirical testing) between a Strategic Knowledge Management (SKM) framework incorporating supportive Data Mining (DM) practices and competitive advantage for multinational firms operating within the Australian resource sector. We also propose, based on a review of the relevant literature, that more effective management of soft and hard systems knowledge is crucial for major Australian firms in all sectors seeking to improve organisational performance through the human and technological capability captured in organisational networks.
Keywords: competitive advantage, data mining, mining organisation, strategic knowledge management
844 Assessment of Milk Quality in Vehari: Evaluation of Public Health Concerns
Authors: Muhammad Farhan Saeed, Waheed Aslam Khan, Muhammad Nadeem, Iftikhar Ahmad, Zakir Ali
Abstract:
Milk is an important and fundamental nutritional component of the human diet. In Pakistan, the milk used by consumers is of low quality and is often contaminated due to the lack of quality controls. Mycotoxins are produced by molds which contaminate agricultural commodities used as animal feed. Mycotoxins are poisons which affect animals when they consume contaminated feeds. Aflatoxin M1 (AFM1) is a naturally occurring form of mycotoxin in milk which is carcinogenic. To assess public awareness regarding aflatoxin contamination of milk, a population-based survey using a questionnaire was carried out among the general public and farmers of both rural and urban areas. The data revealed that people in the rural area were more satisfied with the quality of available milk, but the level of awareness about milk contamination was low in both areas. A total of 297 milk samples were collected from rural (n=156) and urban (n=141) areas of district Vehari during June-July 2015. Milk samples were collected from three different point sources: farmer, milkman and milk shop. These point sources had three types of dairy milk, including cow milk, buffalo milk and mixed milk. After performing the ELISA test, 18 samples with positive ELISA results were retained per source for further analysis of AFM1 by High Performance Liquid Chromatography (HPLC). A higher percentage of samples exceeding the permissible limit was found in the urban area. In the rural area about 15% of samples, and in the urban area about 35% of samples, exceeded the permissible AFM1 limit of 0.05 µg/kg set by the European Union. In urban areas about 55% of buffalo, 33% of cow and 17% of mixed milk samples exceeded the permissible AFM1 level, as compared with 17%, 11% and 17%, respectively, for milk samples from rural areas. In urban areas, 33%, 44% and 28% of samples exceeded the permissible AFM1 level for the farmer, milkman and milk shop sources, respectively, as compared with 28% and 17% of farmer and milkman samples from rural areas. The presence of AFM1 in milk samples demands the implementation of strict regulations and also urges the need for continuous monitoring of milk and milk products in order to minimize the health hazards. Regulations regarding aflatoxin contamination and adulteration should be strictly imposed to prevent health problems related to milk quality. Permissible limits for aflatoxin should be enforced strongly in Pakistan so that economic loss due to aflatoxin contamination can be reduced.
Keywords: Vehari, aflatoxins AFM1, milk, HPLC
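A brief sketch of the exceedance calculation implied above, i.e. the share of samples whose measured AFM1 exceeds the 0.05 µg/kg EU limit, grouped by area and point source; the sample values below are hypothetical, not the study measurements.

```python
# Share of milk samples exceeding the EU AFM1 limit, by area and source
# (hypothetical AFM1 concentrations in µg/kg).
import pandas as pd

EU_LIMIT = 0.05  # µg/kg AFM1

samples = pd.DataFrame({
    "area":   ["rural", "rural", "urban", "urban", "urban"],
    "source": ["farmer", "milkman", "farmer", "milkman", "milkshop"],
    "afm1":   [0.03, 0.06, 0.08, 0.04, 0.11],
})

exceed = samples.assign(over_limit=samples["afm1"] > EU_LIMIT)
print(exceed.groupby(["area", "source"])["over_limit"].mean() * 100)  # % exceeding
```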
843 Urban Rehabilitation Assessment: Buildings' Integrity and Embodied Energy
Authors: Joana Mourão
Abstract:
Transition to a low carbon economy requires changes in consumption and production patterns, including the improvement of existing buildings' environmental performance. Urban rehabilitation is a top policy priority in Europe, creating an opportunity to increase this performance. However, urban rehabilitation comprises different typologies of interventions with distinct levels of consideration for cultural urban heritage values and for environmental values, and thus with different impacts. Cities rely on both material and non-material forms of heritage that are deep-rooted and resilient. One of the most relevant parts of that urban heritage is the historical pre-industrial housing stock, with an extensive presence in many European cities, such as Lisbon. This stock is rehabilitated and transformed within the framework of urban management and local governance traditions, as well as the framework of the global economy, and in that context faces opportunities and threats that need evaluation and control. The scope of this article is to define methodological bases and research lines for the assessment of the impacts that urban rehabilitation initiatives impose on the vulnerable historical pre-industrial urban housing stock, considering it an irreplaceable environmental and cultural material value and resource. As a framework, this article reviews the concepts of urban regeneration, urban renewal, current building conservation and refurbishment, and energy refurbishment of buildings, seeking to define key typologies of urban rehabilitation that represent different approaches to the urban fabric in terms of scope, actors, and priorities. Moreover, the main types of interventions, based on a case study in an eighteenth-century neighborhood in Lisbon, are defined and analyzed in terms of the elements lost in each type of intervention, relating those to the urbanistic, architectonic and constructive values of urban heritage, as well as to environmental and energy efficiency. Further, the article overviews environmental cultural heritage assessment and life-cycle assessment tools, selecting relevant and feasible impact assessment criteria for the regulation of urban building rehabilitation, focusing on multi-level urban heritage integrity. Urbanistic, architectonic, constructive and energetic integrity are studied as criteria for impact assessment, and specific indicators are proposed. The role of these criteria in sustainable urban management is discussed. Throughout this article, the key challenges for urban rehabilitation planning and management, concerning urban built heritage as a resource for sustainability, are discussed and clarified.
Keywords: urban rehabilitation, impact assessment criteria, buildings integrity, embodied energy
842 Combination of Plantar Pressure and Star Excursion Balance Test for Evaluation of Dynamic Posture Control on High-Heeled Shoes
Authors: Yan Zhang, Jan Awrejcewicz, Lin Fu
Abstract:
High-heeled shoes force the foot into a plantar flexion position, resulting in a rise of the foot arch and disturbance of the articular congruence between the talus and the tibiofibular mortice, all of which may increase the challenge of balance maintenance. Plantar pressure distribution of the stance limb during the star excursion balance test (SEBT) contributes to the understanding of potential sources of reaching excursions in SEBT. The purpose of this study is to evaluate dynamic posture control while wearing high-heeled shoes using SEBT in combination with plantar pressure measurement. Twenty healthy young females were recruited. Shoes of three heel heights were used: flat (0.8 cm), low (4.0 cm), and high (6.6 cm). The testing grid of SEBT consists of three lines extending out at 120° from each other, defined as the anterior, posteromedial, and posterolateral directions. Participants were instructed to stand on their dominant limb with the heel in the middle of the testing grid and hands on hips, and to reach the non-stance limb as far as possible towards each direction. The distal portion of the reaching limb lightly touched the ground without shifting weight, and the reaching limb was then returned to the beginning position. The excursion distances were normalized to leg length. An insole plantar measurement system was used to record peak pressure, contact area, and pressure-time integral of the stance limb. Results showed that normalized excursion distance decreased significantly as heel height increased. The changes in plantar pressure in SEBT as heel height increased were more obvious in the medial forefoot (MF), medial midfoot (MM), and rearfoot areas. At MF, the peak pressure and pressure-time integral of low and high shoes increased significantly compared with those of flat shoes, while the contact area decreased significantly as heel height increased. At MM, peak pressure, contact area, and pressure-time integral of high and low shoes were significantly lower than those of flat shoes. To reduce posture instability, the stance limb plantar loading shifted to the medial forefoot. This study identified dynamic posture control deficits while wearing high-heeled shoes and the critical role of the medial forefoot in dynamic balance maintenance.
Keywords: dynamic posture control, high-heeled shoes, plantar pressure, star excursion balance test
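The normalization used above can be written out explicitly; the reach and leg-length values below are hypothetical and only illustrate the arithmetic.

```python
# SEBT reach excursion expressed as a percentage of leg length, so that scores
# are comparable across participants (hypothetical numbers).
def normalized_excursion(reach_cm: float, leg_length_cm: float) -> float:
    """Return the SEBT reach distance as a percentage of leg length."""
    return 100.0 * reach_cm / leg_length_cm

leg_length = 84.0  # cm, e.g. ASIS to medial malleolus
reaches = {"anterior": 68.5, "posteromedial": 92.0, "posterolateral": 88.3}
for direction, reach in reaches.items():
    print(direction, round(normalized_excursion(reach, leg_length), 1), "% leg length")
```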
841 Towards Creative Movie Title Generation Using Deep Neural Models
Authors: Simon Espigolé, Igor Shalyminov, Helen Hastie
Abstract:
Deep machine learning techniques including deep neural networks (DNN) have been used to model language and dialogue for conversational agents to perform tasks, such as giving technical support and also for general chit-chat. They have been shown to be capable of generating long, diverse and coherent sentences in end-to-end dialogue systems and natural language generation. However, these systems tend to imitate the training data and will only generate the concepts and language within the scope of what they have been trained on. This work explores how deep neural networks can be used in a task that would normally require human creativity, whereby the human would read the movie description and/or watch the movie and come up with a compelling, interesting movie title. This task differs from simple summarization in that the movie title may not necessarily be derivable from the content or semantics of the movie description. Here, we train a type of DNN called a sequence-to-sequence model (seq2seq) that takes as input a short textual movie description and some information on e.g. genre of the movie. It then learns to output a movie title. The idea is that the DNN will learn certain techniques and approaches that the human movie titler may deploy that may not be immediately obvious to the human-eye. To give an example of a generated movie title, for the movie synopsis: ‘A hitman concludes his legacy with one more job, only to discover he may be the one getting hit.’; the original, true title is ‘The Driver’ and the one generated by the model is ‘The Masquerade’. A human evaluation was conducted where the DNN output was compared to the true human-generated title, as well as a number of baselines, on three 5-point Likert scales: ‘creativity’, ‘naturalness’ and ‘suitability’. Subjects were also asked which of the two systems they preferred. The scores of the DNN model were comparable to the scores of the human-generated movie title, with means m=3.11, m=3.12, respectively. There is room for improvement in these models as they were rated significantly less ‘natural’ and ‘suitable’ when compared to the human title. In addition, the human-generated title was preferred overall 58% of the time when pitted against the DNN model. These results, however, are encouraging given the comparison with a highly-considered, well-crafted human-generated movie title. Movie titles go through a rigorous process of assessment by experts and focus groups, who have watched the movie. This process is in place due to the large amount of money at stake and the importance of creating an effective title that captures the audiences’ attention. Our work shows progress towards automating this process, which in turn may lead to a better understanding of creativity itself.
Keywords: creativity, deep machine learning, natural language generation, movies
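A minimal encoder-decoder (seq2seq) sketch of the kind described above, written in PyTorch; the vocabulary size, dimensions and random inputs are placeholders rather than the authors' actual model or data.

```python
# Toy seq2seq: encode a tokenized synopsis with a GRU, then decode a title
# conditioned on the final encoder state (teacher forcing during training).
import torch
import torch.nn as nn

class TitleSeq2Seq(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 128, hid_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, synopsis_ids, title_ids):
        _, context = self.encoder(self.embed(synopsis_ids))   # summarize synopsis
        dec_out, _ = self.decoder(self.embed(title_ids), context)
        return self.out(dec_out)  # logits over the vocabulary at each title step

model = TitleSeq2Seq(vocab_size=10_000)
synopsis = torch.randint(0, 10_000, (2, 60))  # batch of 2 tokenized synopses
title_in = torch.randint(0, 10_000, (2, 8))   # shifted target titles
print(model(synopsis, title_in).shape)        # torch.Size([2, 8, 10000])
```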
840 An Anthropometric Index Capable of Differentiating Morbid Obesity from Obesity and Metabolic Syndrome in Children
Authors: Mustafa Metin Donma
Abstract:
Circumference measurements are important because they are easily obtained values for the identification of weight gain without determining body fat. They may give meaningful information about the varying stages of obesity. Besides, some formulas may be derived from a number of body circumference measurements to estimate body fat. Waist (WC), hip (HC) and neck (NC) circumferences are currently the most frequently used measurements. The aim of this study was to develop a formula derived from these three anthropometric measurements, each giving valuable information independently, to question whether their combined power within a formula was capable of being helpful for the differential diagnosis of morbid obesity without metabolic syndrome (MetS) from MetS. One hundred and eighty-seven children were recruited from the pediatrics outpatient clinic of Tekirdag Namik Kemal University Faculty of Medicine. The parents of the participants were informed about the study and asked to fill in and sign the consent forms. The study was carried out according to the Helsinki Declaration. The study protocol was approved by the institutional non-interventional ethics committee. The study population was divided into four groups, normal body mass index (N-BMI), obese (OB), morbid obese (MO) and MetS, which were composed of 35, 44, 75 and 33 children, respectively. Age- and gender-adjusted BMI percentile values were used for the classification of groups. The children in the MetS group were selected based upon the nature of the MetS components described as MetS criteria. Anthropometric measurements, laboratory analysis and statistical evaluation confined to the study population were performed. Body mass index values were calculated. A circumference index, the advanced Donma circumference index (ADCI), was introduced as WC*HC/NC. The degree of statistical significance was chosen as a p value smaller than 0.05. Body mass index values were 17.7±2.8, 24.5±3.3, 28.8±5.7, and 31.4±8.0 kg/m2 for the N-BMI, OB, MO, and MetS groups, respectively. The corresponding values for ADCI were 165±35, 240±42, 270±55, and 298±62. Significant differences were obtained between BMI values of the N-BMI and the OB, MO, and MetS groups (p=0.001). Obese group BMI values also differed from MO group BMI values (p=0.001). However, the increase in the MetS group compared to the MO group was not significant (p=0.091). For the new index, significant differences were obtained between the N-BMI and the OB, MO, and MetS groups (p=0.001). Obese group ADCI values also differed from MO group ADCI values (p=0.015). A significant difference between the MO and MetS groups was detected (p=0.043). Upon consideration of all participants, the correlation between BMI and ADCI was r=0.0883 with p=0.001. In conclusion, in spite of the strong correlation between BMI and ADCI values obtained when all groups were considered, ADCI, but not BMI, was the index capable of differentiating cases with morbid obesity from cases with morbid obesity and MetS.
Keywords: anthropometry, body mass index, child, circumference, metabolic syndrome, obesity
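The two indices compared above can be written out as simple functions; the example measurements are hypothetical and only illustrate the arithmetic.

```python
# BMI and the advanced Donma circumference index (ADCI = WC * HC / NC),
# computed for one hypothetical child.
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def adci(waist_cm: float, hip_cm: float, neck_cm: float) -> float:
    """Advanced Donma circumference index: waist * hip / neck."""
    return waist_cm * hip_cm / neck_cm

print(round(bmi(weight_kg=52.0, height_m=1.42), 1))              # 25.8 kg/m^2
print(round(adci(waist_cm=78.0, hip_cm=95.0, neck_cm=31.0), 1))  # 239.0
```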
839 Revolutionizing Healthcare Facility Maintenance: A Groundbreaking AI, BIM, and IoT Integration Framework
Authors: Mina Sadat Orooje, Mohammad Mehdi Latifi, Behnam Fereydooni Eftekhari
Abstract:
The integration of cutting-edge Internet of Things (IoT) technologies with advanced Artificial Intelligence (AI) systems is revolutionizing healthcare facility management. However, the current landscape of hospital building maintenance suffers from slow, repetitive, and disjointed processes, leading to significant financial, resource, and time losses. Additionally, the potential of Building Information Modeling (BIM) in facility maintenance is hindered by a lack of data within digital models of built environments, necessitating a more streamlined data collection process. This paper presents a robust framework that harmonizes AI with BIM-IoT technology to elevate healthcare Facility Maintenance Management (FMM) and address these pressing challenges. The methodology begins with a thorough literature review and requirements analysis, providing insights into existing technological landscapes and associated obstacles. Extensive data collection and analysis efforts follow to deepen understanding of hospital infrastructure and maintenance records. Critical AI algorithms are identified to address predictive maintenance, anomaly detection, and optimization needs, alongside integration strategies for BIM and IoT technologies enabling real-time data collection and analysis. The framework outlines protocols for data processing, analysis, and decision-making. A prototype implementation is executed to showcase the framework's functionality, followed by a rigorous validation process to evaluate its efficacy and gather user feedback. Refinement and optimization steps are then undertaken based on the evaluation outcomes. Emphasis is placed on the scalability of the framework in real-world scenarios and its potential applications across diverse healthcare facility contexts. Finally, the findings are meticulously documented and shared within the healthcare and facility management communities. This framework aims to significantly boost maintenance efficiency, cut costs, provide decision support, enable real-time monitoring, offer data-driven insights, and ultimately enhance patient safety and satisfaction. By tackling current challenges in healthcare facility maintenance management, it paves the way for the adoption of smarter and more efficient maintenance practices in healthcare facilities.
Keywords: artificial intelligence, building information modeling, healthcare facility maintenance, internet of things integration, maintenance efficiency
838 Best Practice for Post-Operative Surgical Site Infection Prevention
Authors: Scott Cavinder
Abstract:
Surgical site infections (SSI) are a known complication of any surgical procedure and are one of the most common nosocomial infections. Globally, an estimated 300 million surgical procedures take place annually, with an estimated 11 of every 100 surgical patients developing an infection within 30 days after surgery. The specific purpose of the project is to address the PICOT (Problem, Intervention, Comparison, Outcome, Time) question: In patients who have undergone cardiothoracic or vascular surgery (P), does implementation of a post-operative care bundle based on current EBP (I) as compared to current clinical agency practice standards (C) result in a decrease of SSI (O) over a 12-week period (T)? Synthesis of Supporting Evidence: A literature search of five databases, including citation chasing, was performed, which yielded fourteen pieces of evidence ranging from high to good quality. Four common themes were identified for the prevention of SSIs, including use and removal of surgical dressings; use of topical antibiotics and antiseptics; implementation of evidence-based care bundles; and implementation of surveillance through auditing and feedback. The Iowa Model was selected as the framework to help guide this project, as it is a multiphase change process which encourages clinicians to recognize opportunities for improvement in healthcare practice. Practice/Implementation: The process for this project will include recruiting postsurgical participants who have undergone cardiovascular or thoracic surgery prior to discharge at a Northwest Indiana hospital. The patients will receive education, verbal instruction, and return demonstration. The patients will be followed for 12 weeks, with wounds assessed utilizing the National Healthcare Safety Network/Centers for Disease Control (NHSN/CDC) assessment tool and compared to the SSI rate of 2021. Key stakeholders will include two cardiovascular surgeons, four physician assistants, two advanced practice nurses, a medical assistant, and patients. Method of Evaluation: Chi-square analysis will be utilized to establish statistical significance and similarities between the two groups. Main Results/Outcomes: The proposed outcome is the prevention of SSIs in the post-operative cardiothoracic and vascular patient. Implication/Recommendation(s): Implementation of standardized post-operative care bundles for the prevention of SSI in cardiovascular and thoracic surgical patients.
Keywords: cardiovascular, evidence based practice, infection, post-operative, prevention, thoracic, surgery
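A hedged sketch of the planned Chi-square comparison between the care-bundle group and the 2021 baseline; the counts below are hypothetical, since the project outcomes are not yet reported.

```python
# 2x2 comparison of SSI counts: bundle group vs 2021 baseline (hypothetical counts).
from scipy.stats import chi2_contingency

#            SSI   no SSI
observed = [[  3,    97],   # post-operative care bundle group
            [ 10,    90]]   # 2021 baseline group

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```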
837 Developing Cultural Competence as Part of Nursing Studies: Language, Customs and Health Issues
Authors: Mohammad Khatib, Salam Hadid
Abstract:
Introduction: Developing nurses' cultural competence begins in their basic training and requires them to participate in an array of activities which raise their awareness and stimulate their interest, desire and curiosity about different cultures, by creating opportunities for intercultural meetings promoting the concept of 'culture' and its components, including recognition of cultural diversity and the legitimacy of the other. Importantly, professionals need to acquire specific cultural knowledge and a thorough understanding of the values, norms, customs, beliefs and symbols of different cultures. Similarly, they need to be given opportunities to practice the verbal and non-verbal communication skills of other cultures according to their cultural codes. Such a system is being implemented as part of nursing studies at Zefat Academic College in two study frameworks: firstly, a course integrating nursing theory and practice in multicultural nursing; secondly, a course in learning the languages spoken in Israel, focusing on medical and nursing terminology. Methods: Students participating in the 'Transcultural Nursing' course come from a variety of backgrounds: Jews or Arabs, religious or secular, Muslim, Christian, new immigrants, Ethiopians, or other cultural affiliations. They are required to present and discuss cultural practices that affect health. In addition, as part of the language course, students learn and teach their friends five spoken languages (Arabic, Russian, Amharic, Yiddish, and sign language), focusing on therapeutic interaction and communication using the vocabulary and concepts necessary for the therapeutic encounter. An evaluation of the process and the results was done using a structured questionnaire which includes a series of questions relating to the contribution of the courses to their cultural knowledge, awareness and skills. 155 students completed the questionnaire. Results: A preliminary assessment of this educational system points to an increase in cultural awareness and knowledge among the students, as well as in their willingness to recognize the other's difference. A positive atmosphere of multiculturalism, reflected in students' mutual interest and respect, was created. Students showed a deep understanding of cultural issues relating to health and care (consanguinity and genetics, food customs, cultural events, reincarnation, traditional treatments, etc.). Most of the students were willing to recommend the courses to others and suggested some changes relating to learning methods (more simulations, role playing and activities).
Keywords: cultural competence, nursing education, culture, language
836 A Methodology to Virtualize Technical Engineering Laboratories: MastrLAB-VR
Authors: Ivana Scidà, Francesco Alotto, Anna Osello
Abstract:
Due to the importance given today to innovation, the education sector is evolving thanks to digital technologies. Virtual Reality (VR) can be a potential teaching tool offering many advantages in the field of training and education, as it allows theoretical knowledge and practical skills to be acquired through an immersive experience in less time than the traditional educational process. These assumptions make it possible to lay the foundations for a new educational environment that is involving and stimulating for students. Starting from the objective of strengthening the innovative teaching offer and the learning processes, the case study of the research concerns the digitalization of MastrLAB, a High Quality Laboratory (HQL) belonging to the Department of Structural, Building and Geotechnical Engineering (DISEG) of the Polytechnic of Turin, a center specialized in experimental mechanical tests on traditional and innovative building materials and on the structures made with them. MastrLAB-VR, an innovative training tool, has been developed with the aim of educating the class in total safety on the techniques for using machinery, thus reducing the dangers arising from the performance of potentially dangerous activities. The virtual laboratory, dedicated to the students of the Building and Civil Engineering Courses of the Polytechnic of Turin, has been designed to simulate in an absolutely realistic way the experimental approach to the structural tests foreseen in their courses of study: from tensile tests to relaxation tests, from steel qualification tests to resilience tests on elements at environmental conditions or at characterizing temperatures. The research work proposes a methodology for the virtualization of technical laboratories through the application of Building Information Modelling (BIM), starting from the creation of a digital model. The process includes the creation of an independent application, which with Oculus Rift technology will allow the user to explore the environment and interact with objects through the use of joypads. The application has been tested as a prototype on volunteers, with the acquisition of the educational notions presented in the experience assessed through a virtual multiple-choice quiz, producing an overall evaluation report. The results have shown that MastrLAB-VR is suitable for both beginners and experts and will be adopted experimentally for other laboratories of the University departments.
Keywords: building information modelling, digital learning, education, virtual laboratory, virtual reality
835 Proximate Composition, Minerals and Sensory Attributes of Cake, Cookies, Cracker, and Chin-Chin Prepared from Cassava-Gari Residue Flour
Authors: Alice Nwanyioma Ohuoba, Rose Erdoo Kukwa, Ukpabi Joseph Ukpabi
Abstract:
Cassava root (Manihot esculenta) is one of the important carbohydrate-containing crops in Nigeria. It is a staple food, mostly in the southern part of the country, and a source of income to farmers and processors. Cassava gari processing methods result in residue fiber (solid waste) from the sieving operation; these residue fibers (solid wastes) can be dried and milled into flour and used to prepare cakes, cookies, crackers and chin-chin instead of being thrown away, mostly on farmland or near residential areas. Flour for baking or frying may contain carbohydrates and protein (wheat flour) or be rich only in carbohydrates (cassava flour). Cake, cookies, crackers, and chin-chin were prepared using the residue flour obtained from the residue fiber of cassava variety NR87184 roots processed into gari. This study is aimed at evaluating the proximate composition, mineral content and sensory attributes of these selected snacks produced. The proximate composition results obtained showed that crackers had the lowest values for moisture (2.3390%) and fat (1.7130%), but the highest for carbohydrates (85.2310%). Amongst the food products, cakes recorded the highest value for protein (8.0910%). Crude fibre values ranged from 2.5265% (cookies) to 3.4165% (crackers). The results for mineral contents showed cookies ranking the highest in Phosphorus (65.8535 ppm), Iron (0.1150 mg/L), Calcium (1.3800 mg/L) and Potassium (7.2850 mg/L) contents, while chin-chin and crackers were lowest in Sodium (2.7000 mg/L). The food products were also subjected to sensory attribute evaluation by a thirty-member panel using a 9-point hedonic scale ranging from 1 (dislike extremely) to 9 (like extremely). The mean scores obtained show all the food products rating above 7.00 (above "like moderately"). This study has shown that food products that may be functional or nutraceutical could be prepared from the residue flour. There is a call for the use of gluten-free flour in baking due to celiac disease and other allergies caused by gluten. Therefore, local carbohydrate food crops like cassava residue flour that are gluten-free could be the solution. In addition, this could aid cassava gari processing waste management, thereby reducing post-harvest losses of cassava root.
Keywords: allergy, flour, food-products, gluten-free
834 Factors Influencing the Integration of Comprehensive Sexuality Education into Educational Systems in Low- And Middle-Income Countries: A Systematic Review
Authors: Malizgani Paul Chavula
Abstract:
Background: Comprehensive sexuality education (CSE) plays a critical role in promoting youth and adolescents’ sexual and reproductive health and well-being. However, little is known about the enablers and barriers affecting the integration of CSE into educational programmes. The aim of this review is to explore positive and negative factors influencing the integration of CSE into national curricula and educational systems in low- and middle-income countries. Methods: We conducted a systematic literature review (January 2010 to August 2022). The results accord with the Preferred Reporting Items for Systematic Reviews and Meta-analysis standards for systematic reviews. Data were retrieved from the PubMed, Cochrane, Google Scholar, and Web of Hinari databases. The search yielded 431 publications, of which 23 met the inclusion criteria for full-text screening. The review is guided by an established conceptual framework that incorporates the integration of health innovations into health systems. Data were analyzed using a thematic synthesis approach. Results: The magnitude of the problem is evidenced by sexual and reproductive health challenges such as high teenage pregnancies, early marriages, and sexually transmitted infections. Awareness of these challenges can facilitate the development of interventions and the implementation and integration of CSE. Reported aspects of the interventions include core CSE content, delivery methods, training materials and resources, and various teacher-training factors. Reasons for adoption include perceived benefits of CSE, experiences and characteristics of both teachers and learners, and religious, social, and cultural factors. Broad system characteristics include strengthening links between schools and health facilities, school and community-based collaboration, coordination of CSE implementation, and the monitoring and evaluation of CSE. Ultimately, the availability of resources, national policies and laws, international agendas, and political commitment will impact upon the extent and level of integration. Conclusion: Social, economic, cultural, political, legal, and financial contextual factors influence the implementation and integration of CSE into national curricula and educational systems. Stakeholder collaboration and involvement in the design and appropriateness of interventions is critical.
Keywords: comprehensive sexuality education, factors, integration, sexual reproductive health rights
833 The Impact of Coronal STIR Imaging in Routine Lumbar MRI: Uncovering Hidden Causes to Enhanced Diagnostic Yield of Back Pain and Sciatica
Authors: Maysoon Nasser Samhan, Somaya Alkiswani, Abdullah Alzibdeh
Abstract:
Background: Routine lumbar MRIs for back pain may yield normal results despite persistent symptoms, suggesting the possibility of other causes for the pain that are not shown on the routine images. Research suggests including coronal STIR imaging to detect additional pathologies like sacroiliitis. Objectives: This study aims to enhance diagnostic accuracy and aid in determining treatment processes for patients with persistent back pain who have normal routine lumbar MRI (T1 and T2 images) by incorporating coronal STIR into the examination. Methods: This prospectively conducted study involved 274 patients, 115 males and 159 females, with an age range of 6–92 years, whose medical records and imaging data were reviewed following a lumbar spine MRI. The study included patients with back pain and sciatica as their primary complaints, all of whom underwent lumbar spine MRIs at our hospital to identify potential pathologies. Using a GE Signa HD 1.5T MRI System, each patient received a standard MRI protocol that included T1 and T2 sagittal and axial sequences, as well as a coronal STIR sequence. We collected relevant MRI findings, including abnormalities and structural variations, from radiology reports. We classified these findings into tables and documented them as counts and percentages, using Fisher's exact test to assess differences between categorical variables. We conducted the statistical analysis using GraphPad Prism software version 10.1.2. The study adhered to ethical guidelines, institutional review board approvals, and patient confidentiality regulations. Results: Exclusion of the coronal STIR sequence led to 83 subjects (30.29%) being classified as within normal limits on MRI examination. 36 patients without abnormalities on T1 and T2 sequences showed abnormalities on the coronal STIR sequence, with 26 cases attributed to spinal pathologies and 10 to non-spinal pathologies. In addition, Fisher's exact test demonstrated a significant association between sacroiliitis diagnosis and abnormalities identified solely through the coronal STIR sequence (P < 0.0001). Conclusion: Implementing coronal STIR imaging as part of routine lumbar MRI protocols has the potential to improve patient care by facilitating a more comprehensive evaluation and management of persistent back pain.
Keywords: magnetic resonance imaging, lumbar MRI, radiology, neurology
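The Fisher's exact test reported above can be sketched as follows; the 2x2 counts are hypothetical stand-ins, not the study's actual data.

```python
# Association between sacroiliitis diagnosis and STIR-only abnormalities
# (hypothetical 2x2 contingency table).
from scipy.stats import fisher_exact

#          STIR-only abnormality   no STIR-only abnormality
table = [[20,   5],    # sacroiliitis
         [16, 233]]    # no sacroiliitis

odds_ratio, p_value = fisher_exact(table)
print(odds_ratio, p_value)  # a very small p indicates a significant association
```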
Procedia PDF Downloads 13832 Histological Grade Concordance between Core Needle Biopsy and Corresponding Surgical Specimen in Breast Carcinoma
Authors: J. Szpor, K. Witczak, M. Storman, A. Orchel, D. Hodorowicz-Zaniewska, K. Okoń, A. Klimkowska
Abstract:
Core needle biopsy (CNB) is well established as an important diagnostic tool in diagnosing breast cancer, and it is now considered the initial method of choice for diagnosing breast disease. In comparison to fine needle aspiration (FNA), CNB provides more architectural information, allowing for the evaluation of prognostic and predictive factors for breast cancer, including histological grade, one of three prognostic factors used to calculate the Nottingham Prognostic Index. Several studies have previously described the concordance rate between CNB and surgical excision specimens in the determination of histological grade (HG). The concordance rate previously ascribed to overall grade varies widely across the literature, ranging from 59% to 91%. The aim of this study is to assess the concordance in material from the authors’ institution and to compare the results with those described in previous literature. The study population included 157 women with a breast tumor who underwent a core needle biopsy for breast carcinoma and a subsequent surgical excision of the tumor. Both materials were evaluated for the determination of histological grade (scale from 1 to 3). HG was assessed only in core needle biopsies containing at least 10 well preserved HPF with invasive tumor. The degree of concordance between CNB and surgical excision specimen for the determination of tumor grade was assessed by Cohen’s kappa coefficient. The level of agreement between core needle biopsy and surgical resection specimen for overall histologic grading was 73% (113 of 155 cases). CNB correctly predicted the grade of the surgical excision specimen in 21 cases for grade 1 tumors (κ = 0.525, 95% CI: 0.3634–0.6818), 52 cases for grade 2 tumors (κ = 0.5652, 95% CI: 0.458–0.667), and 40 cases for grade 3 tumors (κ = 0.6154, 95% CI: 0.4862–0.7309). The highest level of agreement was observed in grade 3 malignancies. In 9 of 42 (21%) discordant cases, the grade was higher in the CNB than in the surgical excision; this accounted for 6% of all cases. These results correspond to those noted in the literature, showing that underestimation occurs more frequently than overestimation. This study shows that the authors’ institution’s histologic grading of CNBs and surgical excisions shows a fairly good correlation and is consistent with findings in previous reports. Despite the inevitable limitations of CNB, it is an effective method for diagnosing breast cancer and managing treatment options. Assessment of tumour grade by CNB is useful for the planning of treatment, so in the authors’ opinion it is worthwhile implementing it in daily practice. Keywords: breast cancer, concordance, core needle biopsy, histological grade
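A minimal sketch of how overall agreement and Cohen's kappa can be computed from a CNB-versus-excision cross-tabulation is shown below. The off-diagonal counts are hypothetical (the abstract reports only the marginal figures), but they are chosen to be consistent with 113 concordant cases out of 155 and 9 discordant cases graded higher on CNB.

```python
# Sketch of the Cohen's kappa calculation for grade concordance.
# Off-diagonal counts are hypothetical; diagonal counts and totals match
# the figures reported in the abstract (21 + 52 + 40 = 113 of 155).
import numpy as np

# Rows: grade on CNB (1-3); columns: grade on surgical excision (1-3)
crosstab = np.array([
    [21, 20,  2],
    [ 3, 52, 11],
    [ 0,  6, 40],
])

n = crosstab.sum()
observed_agreement = np.trace(crosstab) / n
expected_agreement = (crosstab.sum(axis=1) @ crosstab.sum(axis=0)) / n**2
kappa = (observed_agreement - expected_agreement) / (1 - expected_agreement)

print(f"Observed agreement: {observed_agreement:.1%}")   # ~72.9%
print(f"Cohen's kappa: {kappa:.3f}")
```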
Procedia PDF Downloads 229831 A Magnetic Hydrochar Nanocomposite as a Potential Adsorbent of Emerging Pollutants
Authors: Aura Alejandra Burbano Patino, Mariela Agotegaray, Veronica Lassalle, Fernanda Horst
Abstract:
Water pollution is of worldwide concern due to the importance of water as an essential resource for life. Industrial and urban growth are anthropogenic activities that have caused an increase in undesirable compounds in water. In the last decade, emerging pollutants have become of great interest since, at very low concentrations (µg/L and ng/L), they exhibit a hazardous effect on wildlife, aquatic ecosystems, and human organisms. One group of emerging pollutants that is a matter of study is pharmaceuticals. Their high consumption rate and their inappropriate disposal have led to their detection in wastewater treatment plant influent, effluent, surface water, and drinking water. Consequently, numerous technologies have been developed to efficiently treat these pollutants. Adsorption appears to be an easy and cost-effective technology, and among the adsorbents most used for emerging pollutant removal are carbon-based materials such as hydrochars. This study aims to employ a magnetic hydrochar nanocomposite as an adsorbent for diclofenac (DCF) removal. Kinetic models and the adsorption efficiency in real water samples were analyzed. For this purpose, a magnetic hydrochar nanocomposite was synthesized through the hydrothermal carbonization (HTC) technique hybridized with co-precipitation to add the magnetic component, based on iron oxide nanoparticles, into the hydrochar. The hydrochar was obtained from sunflower husk residue as the precursor. TEM, TGA, FTIR, zeta potential as a function of pH, DLS, the BET technique, and elemental analysis were employed to characterize the material in terms of composition and chemical structure. Adsorption kinetics were carried out in distilled water and real water at room temperature, at pH 5.5 for distilled water and natural pH for real water samples, a 1:1 adsorbent:adsorbate dosage ratio, contact times from 10-120 minutes, and a 50% dosage concentration of DCF. Results have demonstrated that the magnetic hydrochar presents superparamagnetic properties with a saturation magnetization value of 55.28 emu/g. Besides, it is mesoporous with a surface area of 55.52 m²/g. It is composed of magnetite nanoparticles incorporated into the hydrochar matrix, as proven by TEM micrographs, FTIR spectra, and zeta potential. Kinetic studies with DCF were carried out, finding percent removal efficiencies of up to 85.34% after 80 minutes of contact time. In addition, after 120 minutes of contact time, desorption of the pollutant from active sites took place, which indicated that the material became saturated after that time. In real water samples, percent removal efficiencies decreased to 57.39%, ascribable to a possible mechanism of competitive adsorption of organic or inorganic compounds and ions for the active sites of the magnetic hydrochar. The main suggested adsorption mechanisms between the magnetic hydrochar and diclofenac include hydrophobic and electrostatic interactions as well as hydrogen bonds. It can be concluded that the magnetic hydrochar nanocomposite, obtained by valorizing a by-product, appears to be an efficient adsorbent for DCF removal as a model emerging pollutant. These results are being complemented by modifying experimental variables such as the pollutant's initial concentration, adsorbent:adsorbate dosage ratio, and temperature. Currently, adsorption assays of other emerging pollutants are being carried out. Keywords: environmental remediation, emerging pollutants, hydrochar, magnetite nanoparticles
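The sketch below illustrates the percent-removal calculation and one commonly used kinetic treatment (a pseudo-second-order fit). The concentration and uptake values, the initial concentration of 50 mg/L, and the choice of kinetic model are assumptions for illustration only; the abstract reports removal efficiencies but not the underlying data or the specific model applied.

```python
# Illustrative removal-efficiency calculation and pseudo-second-order fit;
# all numerical values are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

# Percent removal: R = (C0 - Ct) / C0 * 100
c0, ct = 50.0, 7.33                      # mg/L (hypothetical), gives ~85.3%
removal = (c0 - ct) / c0 * 100
print(f"Percent removal: {removal:.2f}%")

# Pseudo-second-order model: qt = k2*qe^2*t / (1 + k2*qe*t)
t = np.array([10, 20, 40, 60, 80, 120], dtype=float)       # min (hypothetical)
qt = np.array([12.0, 18.5, 24.0, 27.5, 29.8, 28.9])        # mg/g (hypothetical)

def pseudo_second_order(t, qe, k2):
    return (k2 * qe**2 * t) / (1 + k2 * qe * t)

(qe_fit, k2_fit), _ = curve_fit(pseudo_second_order, t, qt, p0=[30.0, 0.01])
print(f"qe = {qe_fit:.1f} mg/g, k2 = {k2_fit:.4f} g/(mg*min)")
```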
Procedia PDF Downloads 189830 The Development of Congeneric Elicited Writing Tasks to Capture Language Decline in Alzheimer Patients
Authors: Lise Paesen, Marielle Leijten
Abstract:
People diagnosed with probable Alzheimer's disease suffer from an impairment of their language capacities, a gradual impairment that affects both their spoken and written communication. Our study aims at characterising the language decline in patients with dementia of the Alzheimer type (DAT) with the use of congeneric elicited writing tasks. Within these tasks, a descriptive text has to be written based upon images with which the participants are confronted. A randomised set of images allows us to present the participants with a different task on every encounter, thus allowing us to avoid a recognition effect in this iterative study. This method is a revision of the one used in previous studies, in which participants were presented with a larger picture depicting an entire scene. In order to create the randomised set of images, existing pictures were adapted following strict criteria (e.g. frequency, AoA, colour, ...). The resulting data set contained 50 images belonging to several categories (vehicles, animals, humans, and objects). A pre-test was constructed to validate the created picture set; most images had been used before in spoken picture naming tasks, hence the same reaction times ought to be triggered in the typed picture naming task. Once validated, the effectiveness of the descriptive tasks was assessed. First, the participants (n=60 students, n=40 healthy elderly) performed a typing task, which provided information about the typing speed of each individual. Secondly, two descriptive writing tasks were carried out, one simple and one complex. The simple task contains 4 images (1 animal, 2 objects, 1 vehicle) and only contains elements with high frequency, a young AoA (<6 years), and fast reaction times. Slow reaction times, a later AoA (≥ 6 years), and low frequency were criteria for the complex task, which uses 6 images (2 animals, 1 human, 2 objects and 1 vehicle). The data were collected with the keystroke logging programme Inputlog. Keystroke logging tools log and time-stamp keystroke activity to reconstruct and describe text production processes. The data were analysed using a selection of writing process and product variables, such as general writing process measures, detailed pause analysis, linguistic analysis, and text length. As a covariate, the intrapersonal interkey transition times from the typing task were taken into account. The pre-test indicated that the new images lead to similar or even faster reaction times compared to the original images. All the images were therefore used in the main study. The produced texts of the description tasks were significantly longer compared to previous studies, providing sufficient text and process data for analyses. Preliminary analysis shows that the number of words produced differed significantly between the healthy elderly and the students, as did the mean length of production bursts, even though both groups needed the same time to produce their texts. However, the elderly took significantly more time to produce the complex task than the simple task. Nevertheless, the number of words per minute remained comparable between the simple and complex tasks. The pauses within and before words varied, even when taking personal typing abilities (obtained in the typing task) into account. Keywords: Alzheimer's disease, experimental design, language decline, writing process
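A minimal sketch of the kind of pause and burst analysis applied to keystroke logs follows. The timestamps, the 2000 ms pause threshold, and the simplified log format are illustrative assumptions and do not reproduce Inputlog's actual output or measures.

```python
# Toy pause/burst analysis over a simplified keystroke log (timestamps in ms).
keystrokes_ms = [0, 180, 350, 3100, 3290, 3450, 5800, 5990, 6140, 6300]

intervals = [b - a for a, b in zip(keystrokes_ms, keystrokes_ms[1:])]
PAUSE_THRESHOLD_MS = 2000                      # assumed pause criterion

mean_iki = sum(intervals) / len(intervals)     # mean interkey interval
pauses = [iki for iki in intervals if iki >= PAUSE_THRESHOLD_MS]
bursts = len(pauses) + 1                       # pauses delimit production bursts
mean_burst_length = len(keystrokes_ms) / bursts

print(f"Mean interkey interval: {mean_iki:.0f} ms")
print(f"Pauses >= {PAUSE_THRESHOLD_MS} ms: {len(pauses)}; bursts: {bursts}")
print(f"Mean burst length: {mean_burst_length:.1f} keystrokes")
```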
Procedia PDF Downloads 274829 Evaluation of Diagnostic Values of Culture, Rapid Urease Test, and Histopathology in the Diagnosis of Helicobacter pylori Infection and in vitro Effects of Various Antimicrobials against Helicobacter pylori
Authors: Recep Kesli, Huseyin Bilgin, Yasar Unlu, Gokhan Gungor
Abstract:
Aim: The aim of this study was to investigate the presence of Helicobacter pylori (H. pylori) infection by culture, histology, and RUT (rapid urease test) in gastric antrum biopsy samples taken from patients presenting with dyspeptic complaints, and to determine the resistance rates of the H. pylori strains to amoxicillin, clarithromycin, levofloxacin, and metronidazole by E-test. Material and Methods: A total of 278 patients who were admitted to the Konya Education and Research Hospital Department of Gastroenterology with dyspeptic complaints between January 2011 and July 2013 were included in the study. Microbiological and histopathological examinations of biopsy specimens taken from the antrum and corpus regions were performed. The presence of H. pylori in biopsy samples was investigated by culture (Portagerm pylori-PORT PYL, Pylori agar-PYL, GENbox microaer, bioMerieux, France), histology (Giemsa, hematoxylin and eosin staining), and RUT (CLOtest, Kimberly-Clark, USA). Antimicrobial resistance of the isolates to amoxicillin, clarithromycin, levofloxacin, and metronidazole was determined by the E-test method (bioMerieux, France). As the gold standard in the diagnosis of H. pylori, either a positive culture alone or positive results on both histology and RUT together were accepted. Sensitivity and specificity for histology and RUT were calculated by taking culture as the gold standard. Sensitivity and specificity for culture were also calculated by taking the co-positivity of both histology and RUT as the gold standard. Results: H. pylori was detected in 140 of 278 patients by culture and in 174 of 278 patients by histology. H. pylori positivity was also found in 191 patients by RUT. According to the gold standard criteria, a false negative result was found in 39 cases by culture, 17 cases by histology, and 8 cases by RUT. The sensitivity and specificity of the culture, histology, and RUT methods were 76.5% and 88.3%, 87.8% and 63%, and 94.2% and 57.2%, respectively. Antibiotic resistance was investigated by E-test in the 140 H. pylori strains isolated from culture. The resistance rates of the H. pylori strains to amoxicillin, clarithromycin, levofloxacin, and metronidazole were 9 (6.4%), 22 (15.7%), 17 (12.1%), and 57 (40.7%), respectively. Conclusion: In our study, RUT was found to be the most sensitive and culture the most specific of the three methods. Although the specificity of the culture method was high, its sensitivity was found to be quite low compared to the other methods. The low sensitivity of H. pylori culture may be caused by factors that affect the chances of direct isolation, such as spoiled (non-viable) bacteria, the difficult-to-culture nature of the microorganism, and clinical sample retrieval and transport conditions. Keywords: antimicrobial resistance, culture, histology, H. pylori, RUT
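The sketch below shows the sensitivity/specificity arithmetic used to evaluate each test against the gold standard. The 2x2 counts are hypothetical (they only sum to the 278 patients studied); the abstract reports the resulting percentages but not the full cross-classification.

```python
# Sketch of sensitivity/specificity from a 2x2 table; counts are hypothetical.
def sensitivity_specificity(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)   # true positives / all gold-standard positives
    specificity = tn / (tn + fp)   # true negatives / all gold-standard negatives
    return sensitivity, specificity

# Hypothetical counts for one index test vs. the gold standard
tp, fn, fp, tn = 120, 30, 25, 103
sens, spec = sensitivity_specificity(tp, fn, fp, tn)
print(f"Sensitivity: {sens:.1%}, Specificity: {spec:.1%}")
```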
Procedia PDF Downloads 163828 Implementation of Smart Card Automatic Fare Collection Technology in Small Transit Agencies for Standards Development
Authors: Walter E. Allen, Robert D. Murray
Abstract:
Many large transit agencies have adopted RFID technology and electronic automatic fare collection (AFC) or smart card systems, but small and rural agencies remain tied to obsolete manual, cash-based fare collection. Small countries or transit agencies can benefit from the implementation of smart card AFC technology, with the promise of increased passenger convenience, added passenger satisfaction, and improved agency efficiency. For transit agencies, it reduces revenue loss and improves passenger flow and bus stop data. For countries, further implementation into security, distribution of social services, or currency transactions can provide greater benefits. However, small countries or transit agencies cannot afford the expensive proprietary smart card solutions typically offered by the major system suppliers. Deployment of the Contactless Fare Media System (CFMS) Standard eliminates the proprietary solution, ultimately lowering the cost of implementation. Acumen Building Enterprise, Inc. chose the Yuma County Intergovernmental Public Transportation Authority's (YCIPTA) existing proprietary YCAT smart card system to implement CFMS. The revised system enables the purchase of fare product online with prepaid debit or credit cards using the Payment Gateway Processor. Open and interoperable smart card standards for transit have been developed. During the 90-day pilot operation, the transit agency gathered the data from the bus AcuFare 200 card reader, loaded (copied) the data to a USB thumb drive, and uploaded the data to the Acumen Host Processing Center for consolidation into the transit agency's master data file. The transition from the existing proprietary smart card data format to the new CFMS smart card data format was transparent to the transit agency's cardholders. It was proven that open standards and an interoperable design can work and reduce both implementation and operational costs for small transit agencies or countries looking to expand smart card technology. Acumen was able to avoid implementing the Payment Card Industry (PCI) Data Security Standard (DSS), which is expensive to develop and costly to operate on a continuing basis. Due to the substantial additional complexities of implementation and the variety of options presented to the transit agency cardholder, Acumen chose to implement only the Directed Autoload. To improve the implementation efficiency and the results of a similar undertaking, it should be considered that some passengers lack credit cards and are averse to technology. There are more than 1,300 small and rural agencies in the United States; this number grows tenfold when considering small countries and rural locations throughout Latin America and the world. Acumen is evaluating additional countries, sites, or transit agencies that can benefit from smart card systems. Frequently, payment card systems require extensive security procedures for implementation. The project demonstrated the ability to purchase fare value, rides, and passes with credit cards on the internet at a reasonable cost without highly complex security requirements. Keywords: automatic fare collection, near field communication, small transit agencies, smart cards
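As an illustration of the consolidation step described above (per-bus reader files merged into a single master data file), a minimal sketch follows. The field names, the CSV format, and the file-naming pattern are assumptions; the actual AcuFare/Acumen data formats are proprietary and are not reproduced here.

```python
# Toy consolidation of per-reader transaction files into one master file.
# Reader files are assumed to be headerless CSVs with the columns listed below.
import csv
import glob

def consolidate(reader_files, master_path):
    with open(master_path, "w", newline="") as master:
        writer = csv.writer(master)
        writer.writerow(["card_id", "timestamp", "route", "fare_product", "amount"])
        for path in reader_files:
            with open(path, newline="") as f:
                for row in csv.reader(f):
                    writer.writerow(row)   # append each bus reader's transactions

if __name__ == "__main__":
    consolidate(sorted(glob.glob("reader_*.csv")), "master_data_file.csv")
```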
Procedia PDF Downloads 283827 The Effect of Mesenchymal Stem Cells on Full Thickness Skin Wound Healing in Albino Rats
Authors: Abir O. El Sadik
Abstract:
Introduction: Wound healing involves the interaction of multiple biological processes among different types of cells, the intercellular matrix, and specific signaling factors, producing enhancement of cell proliferation of the epidermis over dermal granulation tissue. Several studies have investigated multiple strategies to promote wound healing and to minimize infection and fluid losses. However, burn crises and their related morbidity and mortality remain elevated. The aim of the present study was to examine the effects of mesenchymal stem cells (MSCs) in accelerating wound healing and to compare the efficiency of the routes of administration of MSCs, either intradermal or systemic injection, focusing on the mechanisms producing epidermal and dermal cell regeneration. Material and methods: Forty-two adult male Sprague Dawley albino rats were divided into three equal groups (fourteen rats in each group): a control group (group I) with a full-thickness surgical skin wound model, Group II with the wound treated by systemic injection of MSCs, and Group III with the wound treated by intradermal injection of MSCs. The healing wound was examined on days 2, 6, 10, and 15 for gross morphological evaluation and on days 10 and 15 for fluorescent, histological, and immunohistochemical studies. Results: The wounds of the control group did not reach complete closure by the end of the experiment. In the MSC-treated groups, better and faster healing of wounds was detected than in the control group. Moreover, the intradermal route of administration of stem cells increased the rate of healing of the wounds more than systemic injection. In addition, the wounds were found to be completely healed by the end of the fifteenth day of the experiment in all rats of the group injected intradermally. Microscopically, the wound areas of group III were hardly distinguishable from the adjacent normal skin, with complete regeneration of all skin layers: epidermis, dermis, hypodermis, and the underlying muscle layer. Fully regenerated hair follicles and sebaceous glands were recorded in the dermis of the healed areas in this group more than in the other groups, surrounded by different arrangements of collagen fibers with a significant increase in their area percent. Conclusion: MSCs accelerate the healing process of wound closure. The route of administration of MSCs has a great influence on wound healing, as intradermal injection of MSCs was more effective in enhancing wound healing than systemic injection. Keywords: intradermal, mesenchymal stem cells, morphology, skin wound, systemic injection
Procedia PDF Downloads 203826 Bio-Hub Ecosystems: Profitability through Circularity for Sustainable Forestry, Energy, Agriculture and Aquaculture
Authors: Kimberly Samaha
Abstract:
The Bio-Hub Ecosystem model was developed to address a critical area of concern within the global energy market regarding biomass as a feedstock for power plants. The lack of an economically viable business model for bioenergy facilities has resulted in plants remaining idled or decommissioned. This study analyzed data and submittals to the Born Global Maine Innovation Challenge, a global innovation challenge to identify process innovations that could support a 'whole-tree' approach of maximizing the products, byproducts, energy value, and process slip-streams into a circular zero-waste design. Participating companies were at various stages of developing bioproducts, including biofuels, lignin-based products, carbon capture platforms, and biochar used both as a filtration medium and as a soil amendment product. This case study presents the QCA (Qualitative Comparative Analysis) methodology of the prequalification process and the resulting techno-economic model developed for maximizing the profitability of the Bio-Hub Ecosystem through continuous expansion of system waste streams into valuable process inputs for co-hosts. A full site plan was developed for the integration of co-hosts (a biorefinery, land-based shrimp and salmon aquaculture farms, a tomato greenhouse, and a hops farm) at an operating forestry-based biomass-to-energy plant in West Enfield, Maine, USA. This model and process for evaluating profitability not only propose models for the integration of forestry, aquaculture, and agriculture in cradle-to-cradle linkages of what have typically been linear systems, but also allow for the early measurement of circularity, resource-use impact, and investment risk mitigation for these systems. In this particular study, profitability is assessed at two levels: CAPEX (capital expenditures) and OPEX (operating expenditures). Given that these projects start by repurposing facilities where the industrial-level infrastructure is already built, permitted, and interconnected to the grid, the addition of co-hosts first realizes a dramatic reduction in permitting, development times, and costs. In addition, using the biomass energy plant's waste streams, such as heat, hot water, CO₂, and fly ash, as valuable inputs to the co-hosts' operations significantly decreases OPEX costs, increasing the overall profitability of each co-host's bottom line. This case study utilizes a proprietary techno-economic model to demonstrate how utilizing the waste streams of a biomass energy plant and/or biorefinery results in a significant reduction in OPEX for both the biomass plant and the agriculture and aquaculture co-hosts. Economically viable Bio-Hubs with favorable environmental and community impacts may prove critical in garnering local and federal government support for pilot programs and more wide-scale adoption, especially for those living in severely economically depressed rural areas where aging industrial sites have been shuttered and local economies devastated. Keywords: bio-economy, biomass energy, financing, zero-waste
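A toy CAPEX/OPEX comparison is sketched below to illustrate the logic the abstract describes: sourcing heat, CO₂, and fly ash from the host plant lowers a co-host's OPEX and shortens its payback period. This is not the proprietary techno-economic model; all figures are hypothetical placeholders.

```python
# Toy payback comparison for a co-host with and without Bio-Hub waste-stream
# inputs. All monetary values are hypothetical.
def simple_payback(capex, annual_opex, annual_revenue):
    annual_margin = annual_revenue - annual_opex
    return capex / annual_margin

capex = 4_000_000              # USD, hypothetical greenhouse co-host build-out
revenue = 2_500_000            # USD/yr, hypothetical
opex_standalone = 2_100_000    # purchases its own heat, CO2, fertilizer
opex_co_located = 1_650_000    # heat, CO2 and fly ash sourced from the host plant

print(f"Payback standalone:  {simple_payback(capex, opex_standalone, revenue):.1f} yr")
print(f"Payback co-located:  {simple_payback(capex, opex_co_located, revenue):.1f} yr")
```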
Procedia PDF Downloads 134825 Analysis of Delays during Initial Phase of Construction Projects and Mitigation Measures
Authors: Sunaitan Al Mutairi
Abstract:
A perfect start is a key factor for project completion on time. The study examined the effects of delayed mobilization of resources during the initial phases of the project. This paper mainly highlights the identification and categorization of all delays during the initial construction phase and their root cause analysis, with corrective/control measures, for the Kuwait Oil Company oil and gas projects. A relatively large percentage of the delays identified during project execution (contract award to end of the defects liability period) were attributed to mobilization/preliminary activity delays. Data analysis demonstrated a significant increase in average project delay during the last five years compared to the previous period. Contractors had delays/issues during the initial phase, which resulted in slippages that progressively increased, resulting in time and cost overruns. Delays/issues not mitigated on time during the initial phase had a very high impact on project completion. Data analysis of the delays for the past five years was carried out using trend charts, scatter plots, process maps, box plots, the relative importance index, and Pareto charts. Construction of any project inside the Gathering Centers involves complex management skills related to workforce, materials, plant, machinery, new technologies, etc. Delay affects the completion of projects and compromises the quality, schedule, and budget of project deliverables. Works executed as per plan during the initial phase and start-up duration of the project construction activities resulted in only minor slippages/delays in project completion. In addition, a good working environment between client and contractor resulted in better project execution and management; mainly, the contractor was on the front foot in the execution of projects, which had minimal or no delays during the initial and construction periods. Hence, having a perfect start during the initial construction phase has a positive influence on project success. Our research paper studies each type of delay, with real examples supported by statistical results, and suggests mitigation measures. Detailed analysis was carried out with all stakeholders, based on the impact and occurrence of delays, to arrive at practical and effective measures to mitigate the delays. The key to improvement is to have proper control measures and periodic evaluation/audit to ensure implementation of the mitigation measures. The focus of this research is to reduce the delays encountered during the initial construction phase of the project life cycle. Keywords: construction activities delays, delay analysis for construction projects, mobilization delays, oil & gas projects delays
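A minimal sketch of the relative importance index (RII) calculation named above is given below, using the standard formula RII = ΣW / (A × N), where W are respondents' ratings, A is the highest possible rating, and N is the number of respondents. The delay causes and ratings are hypothetical examples, not the study's survey data.

```python
# Sketch of ranking delay causes by Relative Importance Index (RII).
# Ratings are hypothetical Likert scores (1-5) from assumed respondents.
def relative_importance_index(ratings, highest_rating=5):
    return sum(ratings) / (highest_rating * len(ratings))

delay_causes = {
    "Late mobilization of resources": [5, 5, 4, 5, 4, 5, 3, 4],
    "Delayed permits/approvals":      [4, 3, 4, 5, 3, 4, 4, 3],
    "Slow material procurement":      [3, 4, 3, 3, 4, 2, 3, 3],
}

ranked = sorted(delay_causes.items(),
                key=lambda kv: relative_importance_index(kv[1]),
                reverse=True)
for cause, ratings in ranked:
    print(f"{cause}: RII = {relative_importance_index(ratings):.3f}")
```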
Procedia PDF Downloads 318824 A Comprehensive Analysis of Factors Leading to Fatal Road Accidents in France and Its Overseas Territories
Authors: Bouthayna Hayou, Mohamed Mouloud Haddak
Abstract:
Road accidents in French overseas territories have been understudied, with relevant data often collected late and incompletely. Although these territories account for only 3% to 4% of road traffic injuries in France, their unique characteristics merit closer attention. Despite lower mobility and, consequently, lower exposure to road risks, the actual road risk in Overseas France is as high or even higher than in Metropolitan France. Significant disparities exist not only between Metropolitan France and Overseas territories but also among the overseas territories themselves. The varying population densities in these regions do not fully explain these differences, as each territory has its own distinct vulnerabilities and road safety challenges. This analysis, based on BAAC data files from 2005 to 2018 for both Metropolitan France and the overseas departments and regions, examines key variables such as gender, age, type of road user, type of obstacle hit, type of trip, road category, traffic conditions, weather, and location of accidents. Logistic regression models were built for each region to investigate the risk factors associated with fatal road accidents, focusing on the probability of being killed versus injured. Due to insufficient data, Mayotte and the Overseas Communities (French Polynesia and New Caledonia) were not included in the models. The findings reveal that road safety is worse in the overseas territories compared to Metropolitan France, particularly for vulnerable road users such as pedestrians and motorized two-wheelers. These territories present an accident profile that sits between that of Metropolitan France and middle-income countries. A pressing need exists to standardize accident data collection between Metropolitan and Overseas France to allow for more detailed comparative analyses. Further epidemiological studies could help identify the specific road safety issues unique to each territory, particularly with regard to socio-economic factors such as social cohesion, which may influence road safety outcomes. Moreover, the lack of data on new modes of travel, such as electric scooters, and the absence of socio-economic details of accident victims complicate the evaluation of emerging risk factors. Additional research, including sociological and psychosocial studies, is essential for understanding road users' behavior and perceptions of road risk, which could also provide valuable insights into accident trends in peri-urban areas in France. Keywords: multivariate logistic regression, overseas France, road safety, road traffic accident, territorial inequalities
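A minimal sketch of the kind of per-region logistic regression described above (probability of being killed versus injured) is shown below. The variable names, categorical levels, and synthetic records are illustrative assumptions; the BAAC files themselves are not reproduced here.

```python
# Toy logistic regression of "killed vs. injured" on rider characteristics,
# using synthetic records as stand-ins for the accident data.
import math
import pandas as pd
from sklearn.compose import make_column_transformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

records = pd.DataFrame({
    "killed":    [0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0],
    "age_group": ["18-24", "25-44", "65+", "25-44", "18-24", "45-64",
                  "25-44", "65+", "18-24", "45-64", "25-44", "18-24"],
    "road_user": ["car", "pedestrian", "motorcycle", "car", "motorcycle", "car",
                  "pedestrian", "pedestrian", "car", "motorcycle", "motorcycle", "car"],
})

pipeline = make_pipeline(
    make_column_transformer((OneHotEncoder(drop="first"), ["age_group", "road_user"])),
    LogisticRegression(max_iter=1000),
)
pipeline.fit(records[["age_group", "road_user"]], records["killed"])

features = pipeline.named_steps["columntransformer"].get_feature_names_out()
for name, coef in zip(features, pipeline.named_steps["logisticregression"].coef_[0]):
    print(f"{name}: odds ratio = {math.exp(coef):.2f}")
```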
Procedia PDF Downloads 10823 Evaluation of the Photo Neutron Contamination inside and outside of Treatment Room for High Energy Elekta Synergy® Linear Accelerator
Authors: Sharib Ahmed, Mansoor Rafi, Kamran Ali Awan, Faraz Khaskhali, Amir Maqbool, Altaf Hashmi
Abstract:
Medical linear accelerators (LINACs) used in radiotherapy treatments produce undesired neutrons when they are operated at energies above 8 MeV, in both electron and photon configurations. Neutrons are produced by high-energy photons and electrons through electronuclear (e, n) and photonuclear giant dipole resonance (GDR) reactions. These reactions occur when the incoming photons or electrons are incident on the various materials of the target, flattening filter, collimators, and other shielding components in the LINAC's structure. These neutrons may reach the patient directly, or they may interact with the surrounding materials until they become thermalized. A study was set up to examine the effect of different parameters on the neutron production around the room by photonuclear reactions induced by photons above ~8 MeV. A commercially available neutron detector (Ludlum Model 42-31H Neutron Detector) was used for the detection of thermal and fast neutrons (0.025 eV to approximately 12 MeV) inside and outside the treatment room. Measurements were performed for different field sizes at 100 cm source-to-surface distance (SSD) of the detector, at different distances from the isocenter, and at the locations of the primary and secondary walls. Other measurements were performed at the door and at the treatment console to address the potential radiation safety concerns of the therapists, who must walk in and out of the room for the treatments. Exposures were delivered from Elekta Synergy® linear accelerators at two different energies (10 MV and 18 MV) for a given 200 MU at a dose rate of 600 MU per minute. The results indicate that neutron doses at 100 cm SSD depend on accelerator characteristics, namely the jaw settings, since the jaws are made of high-atomic-number material and therefore provide significant photon interactions that produce neutrons, while doses at larger distances from the isocenter are strongly influenced by the treatment room geometry, with backscattering from the walls causing greater doses than at 100 cm from the isocenter. In the treatment room, the ambient dose equivalent due to photons produced during the decay of activation nuclei varies from 4.22 mSv.h−1 to 13.2 mSv.h−1 (at the isocenter), 6.21 mSv.h−1 to 29.2 mSv.h−1 (primary wall), and 8.73 mSv.h−1 to 37.2 mSv.h−1 (secondary wall) for 10 and 18 MV, respectively. The ambient dose equivalent for neutrons at the door is 5 μSv.h−1 to 2 μSv.h−1, while at the treatment console it is 2 μSv.h−1 to 0 μSv.h−1 for 10 and 18 MV, respectively, which shows that a 2 m thick and 5 m long concrete maze provides sufficient shielding for neutrons at the door as well as at the treatment console for 10 and 18 MV photons. Keywords: equivalent doses, neutron contamination, neutron detector, photon energy
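As a back-of-the-envelope illustration of how the measured console dose rates could translate into an annual staff dose, a short sketch follows. Only the 200 MU per field, the 600 MU/min dose rate, and the console dose rate come from the study; the clinical workload (fields per patient, patients per day, working days) is a hypothetical assumption.

```python
# Rough annual console dose estimate from measured dose rates and an assumed
# clinical workload. All workload figures are hypothetical.
MU_PER_FIELD = 200
MU_PER_MIN = 600
beam_on_min_per_field = MU_PER_FIELD / MU_PER_MIN          # ~0.33 min per field

fields_per_day = 4 * 40          # 4 fields/patient, 40 patients/day (assumed)
working_days_per_year = 250      # assumed

beam_on_h_per_year = beam_on_min_per_field * fields_per_day * working_days_per_year / 60

console_rate_uSv_per_h = 2.0     # higher of the two measured console rates
annual_dose_uSv = console_rate_uSv_per_h * beam_on_h_per_year
print(f"Beam-on time: {beam_on_h_per_year:.0f} h/yr; console dose ~ {annual_dose_uSv:.0f} uSv/yr")
```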
Procedia PDF Downloads 449