Search results for: spectral domain
697 The Association between Health-Related Quality of Life and Physical Activity in Different Domains with Other Factors in Croatian Male Police Officers
Authors: Goran Sporiš, Dinko Vuleta, Stefan Lovro
Abstract:
The purpose of the present study was to determine the associations between health-related quality of life (HRQOL) and physical activity (PA) in different domains. In this cross-sectional study, participants were 169 Croatian police officers (mean age 35.14±8.95 yrs, mean height 180.93±7.53 cm, mean weight 88.39±14.05 kg, mean body-mass index 26.90±3.39 kg/m2). The dependent variables were two general domains extracted from the HRQOL questionnaire: (1) the physical component scale (PCS) and (2) the mental component scale (MCS). The independent variables were job-related, transport, domestic and leisure-time PA, along with other factors: age, body-mass index, smoking status, psychological distress, socioeconomic status and time spent in sedentary behaviour. The associations between dependent and independent variables were analyzed using multiple regression analysis. Significance was set at p < 0.05. PCS was positively associated with leisure-time PA (β = 0.28, p < 0.001) and socioeconomic status (SES) (β = 0.16, p = 0.005), but inversely associated with job-related PA (β = -0.15, p = 0.012), domestic PA (β = -0.14, p = 0.014), age (β = -0.12, p = 0.050), psychological distress (β = -0.43, p < 0.001) and sedentary behaviour (β = -0.15, p = 0.009). MCS was positively associated with leisure-time PA (β = 0.19, p = 0.013) and SES (β = 0.20, p = 0.002), while inversely associated with age (β = -0.23, p = 0.001), psychological distress (β = -0.27, p < 0.001) and sedentary behaviour (β = -0.22, p = 0.001). Our results add new information about the associations between domain-specific PA and both the physical and mental component scales in police officers. Future studies should examine the same associations in other stressful occupations.
Keywords: health, fitness, police force, relations
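A minimal sketch of the kind of multiple regression analysis reported above, in Python with statsmodels; the file name and column names are hypothetical placeholders, not the authors' dataset, and standardizing both sides yields beta coefficients comparable to those quoted:

```python
import pandas as pd
import statsmodels.api as sm
from scipy.stats import zscore

# Hypothetical table: one row per officer; column names are illustrative.
df = pd.read_csv("officers.csv")
predictors = ["leisure_pa", "job_pa", "domestic_pa", "transport_pa",
              "age", "bmi", "smoking", "distress", "ses", "sedentary"]

# Standardize predictors and outcome so the fitted coefficients are betas.
X = df[predictors].apply(zscore)
y = zscore(df["pcs"])  # physical component scale; repeat with "mcs"

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary())  # betas and p-values per predictor
```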
Procedia PDF Downloads 299
696 Facilitators and Barriers of Family Resilience in Cancer Patients Based on the Theoretical Domains Framework: An Integrative Review
Authors: Jiang Yuqi
Abstract:
Aims: The aim is to analyze the facilitators and barriers of family resilience in cancer patients based on the Theoretical Domains Framework, provide a basis for intervention in the family resilience of cancer patients, and identify the progress and lessons of existing intervention projects. Methods: NVivo software was used to code the influencing factors using the framework of 14 theoretical domains as primary nodes; secondary nodes were then refined using thematic analysis, and specific influencing factors were aggregated and assessed for evaluator reliability. Data sources: PubMed, Embase, CINAHL, Web of Science, Cochrane Library, MEDLINE, CNKI, and Wanfang (search dates: from database inception to November 2023). Results: A total of 35 papers were included, with 142 coding points across 14 theoretical domains and 38 secondary nodes. The three most relevant theoretical domains are social influences (norms), the environment and resources, and emotions (mood). The factors with the greatest impact were family support, mood, confidence and beliefs, external support, quality of life, economic circumstances, family adaptation, coping styles with illness, and management. Conclusion: The factors influencing family resilience in cancer patients cover most of the theoretical domains in the Theoretical Domains Framework and are cross-cutting, multi-sourced, and complex. Further in-depth exploration of the key factors influencing family resilience is necessary to provide a basis for intervention research.
Keywords: cancer, survivors, family resilience, theoretical domains framework, literature review
Procedia PDF Downloads 47
695 An Automatic Bayesian Classification System for File Format Selection
Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan
Abstract:
This paper presents an approach for the classification of unstructured format descriptions for the identification of file formats. The main contribution of this work is the employment of data mining techniques to support file format selection with just the unstructured text description that comprises the most important format features for a particular organisation. Subsequently, the file format identification method employs a file format classifier and associated configurations to support digital preservation experts with an estimation of the required file format. Our goal is to make use of a format specification knowledge base aggregated from different Web sources in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends a file format for the expert's institution. The proposed methods facilitate the selection of file formats and improve the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and specifications of file formats. To facilitate decision-making, the aggregated information about the file formats is presented as a file format vocabulary that comprises the most common terms characteristic of all researched formats. The goal is to suggest a particular file format based on this vocabulary for analysis by an expert. The sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
Keywords: data mining, digital libraries, digital preservation, file format
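As a rough illustration of the naive Bayes step described above, the following scikit-learn sketch classifies unstructured format descriptions; the labels and sample texts are invented placeholders, whereas the real system draws on a format-specification vocabulary aggregated from Web sources:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy corpus: unstructured format descriptions with known format labels.
descriptions = [
    "lossless raster image, supports transparency and palettes",
    "container for lossy compressed photographic images",
    "page description language for fixed-layout documents",
]
labels = ["PNG", "JPEG", "PDF"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(descriptions, labels)

# Recommend a format for an institution's requirement text, with the
# class probabilities reported in the paper's evaluation section.
query = ["archival raster images with lossless compression"]
print(clf.predict(query), clf.predict_proba(query))
```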
Procedia PDF Downloads 499
694 Studying the Relationship Between Washback Effects of IELTS Test on Iranian Language Teachers, Teaching Strategies and Candidates
Authors: Afsaneh Jasmine Majidi
Abstract:
Language testing is an important part of the language teaching experience and the language learning process, as it presents assessment strategies for teachers to evaluate the efficiency of teaching and for learners to examine their outcomes. However, language testing is demanding and challenging because it should provide the opportunity for proper and objective decisions. In addition to all the efforts test designers put into designing valid and reliable tests, there are some other determining factors which are even more complex and complicated. These factors affect the educational system, individuals, and society, and the impact of a test varies according to its scope. Seemingly, the impact of a simple classroom assessment is not the same as that of high-stakes tests such as the International English Language Testing System (IELTS). As the importance of the test increases, it affects a wider domain. Accordingly, the impacts of high-stakes tests are reflected not only in teaching and learning strategies but also in society. Testing experts use the term ‘washback’ or ‘impact’ to define the different effects of a test on teaching, learning, and community. This paper first looks at the theoretical background of ‘washback’ and ‘impact’ in language testing by reviewing relevant literature in the field and then investigates the washback effects of the IELTS test on Iranian IELTS teachers and students. The study found a significant relationship between the washback effect of the IELTS test and the teaching strategies of Iranian IELTS teachers, as well as the performance of Iranian IELTS candidates and their community.
Keywords: high-stakes tests, IELTS, Iranian candidates, language testing, test impact, washback
Procedia PDF Downloads 327
693 The Current Development and Legislation on the Acquisition and Use of Nuclear Energy in Contemporary International Law
Authors: Uche A. Nnawulezi
Abstract:
Over the past decades, the acquisition and utilization of nuclear energy have remained among the most intractable issues that past world leaders have unsuccessfully endeavored to grapple with. This study analyzes the present development and legislation on the acquisition and utilization of nuclear energy in contemporary international law. It seeks to address international co-operation in the field of nuclear energy by looking at what nuclear energy is all about and how it came into being. It also seeks to address concerns expressed by a few researchers on the position of nuclear law in the wider domain of the law by looking at the legislative procedure for nuclear law and the system of treaties and conventions. This study also argues in favour of the treaty on the non-proliferation of nuclear weapons based on human rights and humanitarian principles that are not only moral but also legal ones. Specifically, past development activities on nuclear weapons and the practical framework of the nuclear energy institutions will be inspected. The study noted, among others, former President Obama's remarks on nuclear energy and Pakistan's nuclear policies and their attendant outcomes. Essentially, we depended on documentary evidence and hence drew a great part of the data from secondary sources. The study emphatically advocates the adoption of absolute liability principles and the setting up of a viability trust fund, all of which will help in sustaining global peace, where global best practices in the acquisition and use of nuclear energy will be widely accepted in contemporary international law. Essentially, the fundamental proposals made in this paper, if fully adopted, might go far in strengthening the present development and legislation on the acquisition and utilization of nuclear energy and, accordingly, in addressing a portion of the intractable issues under international law.
Keywords: nuclear energy, international law, acquisition, development
Procedia PDF Downloads 178
692 Influence of Servant Leadership on Faculty Retention in Higher Education Institutes: Mediating Role of Job Satisfaction
Authors: Aneela Sheikh
Abstract:
Private higher education institutes are challenged for their resilience and competitive edge in the globalized knowledge-based economy of the 21st century. Faculty retention plays an important role as a catalyst for addressing the current mega-developmental phenomenon in higher education institutes faced by developing countries. This study explores the influence of servant leadership practice on faculty retention through the mediating role of job satisfaction, with a view to minimizing the high faculty turnover in private higher education institutes. A sample of 341 faculty members from ten private higher education institutes in Lahore city, Pakistan, was selected through a stratified proportionate random sampling technique. A descriptive survey research approach was employed to collect data from the 341 faculty members by administering a close-ended questionnaire based on a seven-point Likert scale as a self-administered research instrument. The study was conducted under the domain of the Leader-Member Exchange (LMX) theory. The mediating role of job satisfaction was measured by a bootstrapping technique. The results revealed that servant leadership has a statistically significant influence on faculty retention, with a statistically significant mediating role of job satisfaction, in private higher education institutes in Pakistan. Further, to the best of the authors' knowledge, this is the first systematic and empirical study on faculty retention conducted against the backdrop of servant leadership in an Eastern context, particularly in Pakistan.
Keywords: servant leadership, faculty retention, job satisfaction, higher education institutes
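The bootstrapped mediation test mentioned above (the indirect effect of servant leadership on retention via job satisfaction) can be sketched as follows; the variable names, file name, and percentile confidence interval are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np
import pandas as pd

df = pd.read_csv("faculty_survey.csv")  # hypothetical survey data
rng = np.random.default_rng(0)

def indirect_effect(d):
    # a-path: servant leadership -> job satisfaction
    a = np.polyfit(d["servant_leadership"], d["job_satisfaction"], 1)[0]
    # b-path: job satisfaction -> retention, controlling for leadership
    X = np.column_stack([np.ones(len(d)), d["job_satisfaction"],
                         d["servant_leadership"]])
    b = np.linalg.lstsq(X, d["retention"], rcond=None)[0][1]
    return a * b  # product-of-coefficients indirect effect

boot = [indirect_effect(df.sample(len(df), replace=True, random_state=rng))
        for _ in range(5000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% CI for the indirect effect: [{lo:.3f}, {hi:.3f}]")
# Mediation is supported if the interval excludes zero.
```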
Procedia PDF Downloads 81
691 Developing an Out-of-Distribution Generalization Model Selection Framework through Impurity and Randomness Measurements and a Bias Index
Authors: Todd Zhou, Mikhail Yurochkin
Abstract:
Out-of-distribution (OOD) detection is receiving increasing amounts of attention in the machine learning research community, boosted by recent technologies such as autonomous driving and image processing. This newly burgeoning field has created the need for more effective and efficient out-of-distribution generalization methods. Without access to label information, deploying machine learning models to out-of-distribution domains becomes extremely challenging, since it is impossible to evaluate model performance on unseen domains. To tackle this difficulty, we designed a model selection pipeline algorithm and developed a model selection framework with different impurity and randomness measurements to evaluate and choose the best-performing models for out-of-distribution data. By exploring different randomness scores based on predicted probabilities, we adopted the out-of-distribution entropy and developed a custom-designed score, "CombinedScore," as the evaluation criterion. This proposed score was created by adding labeled source information into the judging space of the uncertainty entropy score using the harmonic mean. Furthermore, prediction bias was explored through the equality-of-opportunity violation measurement. We also improved machine learning model performance through model calibration. The effectiveness of the framework with the proposed evaluation criteria was validated on the Folktables American Community Survey (ACS) datasets.
Keywords: model selection, domain generalization, model fairness, randomness measurements, bias index
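In the spirit of the "CombinedScore" described above, here is a hedged sketch of an entropy-based randomness score blended with labeled-source information via a harmonic mean; the normalization and the exact blending are assumptions, as the paper's precise formulation is not reproduced here:

```python
import numpy as np

def mean_entropy(probs):
    """Mean predictive entropy over OOD samples (probs: n_samples x n_classes)."""
    p = np.clip(probs, 1e-12, 1.0)
    return float(np.mean(-np.sum(p * np.log(p), axis=1)))

def combined_score(ood_probs, source_accuracy):
    """Harmonic mean of normalized OOD confidence and labeled-source accuracy.

    Entropy is inverted and rescaled to [0, 1] so that larger is better for
    both terms; this mapping is an assumption for illustration.
    """
    max_entropy = np.log(ood_probs.shape[1])
    confidence = 1.0 - mean_entropy(ood_probs) / max_entropy
    return 2 * confidence * source_accuracy / (confidence + source_accuracy + 1e-12)

# Model selection: pick the candidate with the highest combined score, e.g.
# best = max(models, key=lambda m: combined_score(m.predict_proba(X_ood), m.acc))
```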
Procedia PDF Downloads 124
690 An Interactive User-Oriented Approach to Optimizing Public Space Lighting
Authors: Tamar Trop, Boris Portnov
Abstract:
Public Space Lighting (PSL) of outdoor urban areas promotes comfort, defines spaces and neighborhood identities, enhances perceived safety and security, and contributes to residential satisfaction and wellbeing. However, if excessive or misdirected, PSL leads to unnecessary energy waste and increased greenhouse gas emissions, poses a non-negligible threat to the nocturnal environment, and may become a potential health hazard. At present, PSL is designed according to international, regional, and national standards, which consolidate best practice. Yet knowledge regarding the optimal light characteristics needed for creating a perception of personal comfort and safety in densely populated residential areas, and the factors associated with this perception, is still scarce. The presented study suggests a paradigm shift in designing PSL towards a user-centered approach, which incorporates pedestrians' perspectives into the process. The study is an ongoing joint research project between the China and Israel Ministries of Science and Technology. Its main objectives are to reveal inhabitants' perceptions of and preferences for PSL in different densely populated neighborhoods in China and Israel, and to develop a model that links instrumentally measured parameters of PSL (e.g., intensity, spectra and glare) with its perceived comfort and quality, while controlling for three groups of attributes: locational, temporal, and individual. To investigate measured and perceived PSL, the study employed various research methods and data collection tools, developed a location-based mobile application, and used multiple data sources, such as satellite multi-spectral night-time light imagery, census statistics, and detailed planning schemes. One of the study's preliminary findings is that a higher sense of safety in the investigated neighborhoods is not associated with higher levels of light intensity. This implies potential for energy saving in brightly illuminated residential areas. Study findings might contribute to the design of a smart and adaptive PSL strategy that enhances pedestrians' perceived safety and comfort while reducing light pollution and energy consumption.
Keywords: energy efficiency, light pollution, public space lighting, PSL, safety perceptions
Procedia PDF Downloads 133
689 Vertically Coupled III-V/Silicon Single Mode Laser with a Hybrid Grating Structure
Abstract:
Silicon photonics has attracted much interest and extensive research as a promising platform for fabricating compact, high-speed and low-cost photonic devices compatible with the complementary metal-oxide-semiconductor (CMOS) process. Despite the remarkable progress made in the development of silicon photonics, high-performance, cost-effective, and reliable silicon laser sources are still missing. In this work, we present a 1550 nm III-V/silicon laser design with a stable single-mode lasing property and robust, high-efficiency vertical coupling. The InP cavity consists of two uniform Bragg grating sections at the sides for mode selection and feedback, as well as a central second-order grating for surface emission. A grating coupler is etched on the SOI waveguide, by which the light coupling between the parallel III-V and SOI is achieved vertically rather than by evanescent wave coupling. Laser characteristics are simulated and optimized by the traveling-wave model (TWM) and a Green's function analysis, as well as a 2D finite-difference time-domain (FDTD) method for the coupling process. The simulation results show that single-mode lasing with an SMSR better than 48 dB is achievable, and the threshold current is less than 15 mA with a slope efficiency of around 0.13 W/A. The coupling efficiency is larger than 42% and possesses a high tolerance, with less than 10% reduction for 10 μm horizontal or 15 μm vertical misalignment. The design can be realized by standard flip-chip bonding techniques without co-fabrication of III-V and silicon or precise alignment.
Keywords: III-V/silicon integration, silicon photonics, single mode laser, vertical coupling
Procedia PDF Downloads 156
688 Microstructure Study of Melt Spun Mg₆₅Cu₂₅Y₁₀
Authors: Michael Regev, Shai Essel, Alexander Katz-Demyanetz
Abstract:
Magnesium alloys are characterized by good physical properties: they exhibit high strength, are lightweight and have good damping absorption and good thermal and electrical conductivity. Amorphous magnesium alloys, moreover, exhibit higher strength, hardness and a large elastic domain, in addition to having excellent corrosion resistance. These above-mentioned advantages make magnesium-based metallic glasses attractive for industrial use. Among the various existing magnesium alloys, the Mg₆₅Cu₂₅Y₁₀ alloy is known to be one of the best glass formers. In the current study, Mg₆₅Cu₂₅Y₁₀ ribbons were produced by melt spinning, and their microstructure was investigated in the as-cast condition, after pressing under 0.5 GPa for 5 minutes at different temperatures (RT, 50 °C, 100 °C, 150 °C and 200 °C), and after five-minute exposure to the above temperatures without pressing. The microstructure was characterized by means of X-ray Diffraction (XRD), Differential Scanning Calorimetry (DSC), High Resolution Scanning Electron Microscopy (HRSEM) and High Resolution Transmission Electron Microscopy (HRTEM). XRD and DSC studies showed that the as-cast material had an amorphous character and that the material crystallized during exposure to temperature, with or without applied stress. HRTEM revealed that as-cast Mg₆₅Cu₂₅Y₁₀, although known to be one of the best glass formers, is nano-crystalline rather than amorphous. The current study casts light on the question of what an amorphous alloy is and whether there is any clear borderline between amorphous and nano-crystalline alloys.
Keywords: metallic glass, magnesium, melt spinning, amorphous alloys
Procedia PDF Downloads 236
687 An Automated Approach to the Nozzle Configuration of Polycrystalline Diamond Compact Drill Bits for Effective Cuttings Removal
Authors: R. Suresh, Pavan Kumar Nimmagadda, Ming Zo Tan, Shane Hart, Sharp Ugwuocha
Abstract:
Polycrystalline diamond compact (PDC) drill bits are extensively used in the oil and gas industry as well as the mining industry. Industry engineers continually improve upon PDC drill bit designs and hydraulic conditions. Optimized injection nozzles play a key role in improving the drilling performance and efficiency of these ever-changing PDC drill bits. In the first part of this study, computational fluid dynamics (CFD) modelling is performed to investigate the hydrodynamic characteristics of drilling fluid flow around the PDC drill bit. The open-source CFD software OpenFOAM simulates the flow around the drill bit, based on the field input data. A specifically developed console application integrates the entire CFD process, including domain extraction, meshing, solving the governing equations and post-processing. The results from the OpenFOAM solver are then compared with those of the ANSYS Fluent software. The data from both software programs agree. The second part of the paper describes the parametric study of the PDC drill bit nozzle to determine the effect of parameters such as the number of nozzles, nozzle velocity, nozzle radial position and orientation on the flow field characteristics and bit washing patterns. After analyzing a series of nozzle configurations, the best configuration is identified and recommendations are made for modifying the PDC bit design.
Keywords: ANSYS Fluent, computational fluid dynamics, nozzle configuration, OpenFOAM, PDC drill bit
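The console application described above chains OpenFOAM's standard command-line utilities; a minimal Python driver along those lines might look like the following sketch, where the case directory layout and the choice of simpleFoam as solver are assumptions for illustration:

```python
import subprocess
from pathlib import Path

def run(cmd, case):
    """Run an OpenFOAM utility or solver inside the case directory, logging output."""
    log = Path(case) / f"log.{cmd[0]}"
    with open(log, "w") as f:
        subprocess.run(cmd, cwd=case, stdout=f, stderr=subprocess.STDOUT, check=True)

case = "pdcBitCase"  # hypothetical case prepared from the field input data
run(["blockMesh"], case)                    # background mesh
run(["snappyHexMesh", "-overwrite"], case)  # mesh around the extracted bit geometry
run(["simpleFoam"], case)                   # steady incompressible solver (assumed)
run(["postProcess", "-latestTime"], case)   # post-processing step (assumed)
```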
Procedia PDF Downloads 420
686 A Recommender System for Job Seekers to Show up Companies Based on Their Psychometric Preferences and Company Sentiment Scores
Authors: A. Ashraff
Abstract:
The increasing importance of the web as a medium for electronic and business transactions has served as a catalyst, or rather a driving force, for the introduction and implementation of recommender systems. Recommender systems play a major role in processing and analyzing thousands of data rows or reviews and help humans make a purchase decision on a product or service. They also have the ability to predict whether a particular user would rate a product or service, based on the user's behavioral profile. At present, recommender systems are used extensively in every domain known to us; they are said to be ubiquitous. However, in the field of recruitment, they are not being utilized extensively. Recent statistics show an increase in staff turnover, which has negatively impacted organizations as well as employees. The reasons include company culture, working flexibility (work-from-home opportunity), lack of learning advancements, and pay scale. Further investigation revealed that there is a lack of guidance or support to help a job seeker find the company that will suit him best, and though information about companies is available, job seekers cannot read all the reviews by themselves and reach an analytical decision. In this paper, we propose an approach to study the available review data on IT companies (scoring their reviews based on user review sentiments) and to gather information on job seekers, including their psychometric evaluations. The system then presents the job seeker with useful outputs on which company is most suitable for him. The theoretical approach, the algorithmic approach and the importance of such a system are discussed in this paper.
Keywords: psychometric tests, recommender systems, sentiment analysis, hybrid recommender systems
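One plausible way to combine the two signals described above is a weighted blend of mean review sentiment and psychometric fit; this scoring function is entirely hypothetical and stands in for whatever hybrid scheme the proposed system ultimately uses:

```python
def company_score(review_sentiments, psychometric_fit, weight=0.5):
    """Blend review sentiment with psychometric fit into one ranking score.

    review_sentiments: per-review sentiment scores in [-1, 1]
    psychometric_fit: similarity in [0, 1] between the job seeker's
        psychometric profile and the company's inferred profile
    weight: blending factor (an assumption; tune per application)
    """
    sentiment = (sum(review_sentiments) / len(review_sentiments) + 1) / 2  # to [0, 1]
    return weight * sentiment + (1 - weight) * psychometric_fit

# Rank companies for a job seeker by descending score, e.g.
# ranked = sorted(companies, key=lambda c: company_score(c.sentiments, fit[c]),
#                 reverse=True)
```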
Procedia PDF Downloads 106
685 Land Cover Mapping Using Sentinel-2, Landsat-8 Satellite Images, and Google Earth Engine: A Study Case of the Beterou Catchment
Authors: Ella Sèdé Maforikan
Abstract:
Accurate land cover mapping is essential for effective environmental monitoring and natural resource management. This study focuses on assessing the classification performance of two satellite datasets and evaluating the impact of different input feature combinations on classification accuracy in the Beterou catchment, situated in the northern part of Benin. Landsat-8 and Sentinel-2 images from June 1, 2020, to March 31, 2021, were utilized. Employing the Random Forest (RF) algorithm on Google Earth Engine (GEE), a supervised classification categorized the land into five classes: forest, savannas, cropland, settlement, and water bodies. GEE was chosen for its high-performance computing capabilities, which mitigate the computational burdens associated with traditional land cover classification methods. By eliminating the need for individual satellite image downloads and providing access to an extensive archive of remote sensing data, GEE facilitated efficient model training on remote sensing data. The study achieved commendable overall accuracy (OA), ranging from 84% to 85%, even without incorporating spectral indices and terrain metrics into the model. Notably, the inclusion of additional input sources, specifically terrain features like slope and elevation, enhanced classification accuracy. The highest accuracy was achieved with Sentinel-2 (OA = 91%, Kappa = 0.88), slightly surpassing Landsat-8 (OA = 90%, Kappa = 0.87). This underscores the significance of combining diverse input sources for optimal accuracy in land cover mapping. The methodology presented herein not only enables the creation of precise, expeditious land cover maps but also demonstrates the prowess of cloud computing through GEE for large-scale land cover mapping with remarkable accuracy. The study emphasizes the synergy of different input sources to achieve superior accuracy. As a future recommendation, the application of Light Detection and Ranging (LiDAR) technology is proposed to enhance vegetation type differentiation in the Beterou catchment. Additionally, a cross-comparison between Sentinel-2 and Landsat-8 for assessing long-term land cover changes is suggested.
Keywords: land cover mapping, Google Earth Engine, random forest, Beterou catchment
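In Google Earth Engine's Python API, the Random Forest workflow summarized above might look like this sketch; the geometry, training-point asset, band list, and cloud filter are placeholders (the authors' full inputs also include spectral indices and terrain features):

```python
import ee
ee.Initialize()

aoi = ee.Geometry.Rectangle([2.0, 9.0, 2.8, 9.8])  # hypothetical Beterou extent

# Cloud-filtered Sentinel-2 surface-reflectance composite over the study window.
s2 = (ee.ImageCollection("COPERNICUS/S2_SR")
      .filterBounds(aoi)
      .filterDate("2020-06-01", "2021-03-31")
      .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20))
      .median())

bands = ["B2", "B3", "B4", "B8", "B11", "B12"]
training = s2.select(bands).sampleRegions(
    collection=ee.FeatureCollection("users/example/training_points"),  # placeholder
    properties=["landcover"],
    scale=10)

clf = ee.Classifier.smileRandomForest(numberOfTrees=100).train(
    features=training, classProperty="landcover", inputProperties=bands)

classified = s2.select(bands).classify(clf)  # 5 classes: forest, savannas, ...
```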
Procedia PDF Downloads 63
684 Neural Graph Matching for Modification Similarity Applied to Electronic Document Comparison
Authors: Po-Fang Hsu, Chiching Wei
Abstract:
In this paper, we present a novel neural graph matching approach applied to document comparison. Document comparison is a common task in the legal and financial industries. In some cases, the most important differences may be the addition or omission of words, sentences, clauses, or paragraphs. However, it is a challenging task without a record or trace of the whole editing process. Under many temporal uncertainties, we explore the potential of our approach to approximate an accurate comparison and determine which element blocks are related to others by edits. In the beginning, we apply a document layout analysis that combines traditional and modern techniques to segment layouts into blocks of various types appropriately. Then we transform this issue into a problem of layout graph matching with textual awareness. Graph matching is a long-studied problem with a broad range of applications. However, unlike previous works focusing on visual images or structural layout, we also bring textual features into our model to adapt it to this domain. Specifically, based on the electronic document, we introduce an encoder to handle the visual presentation decoded from PDF. Additionally, because modifications can cause inconsistency in the document layout analysis between modified documents, and blocks can be merged and split, Sinkhorn divergence is adopted in our neural graph approach, which tries to overcome both these issues with many-to-many block matching. We demonstrate this on two categories of layouts, legal agreements and scientific articles, collected from our real-case datasets.
Keywords: document comparison, graph matching, graph neural network, modification similarity, multi-modal
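Sinkhorn normalization, which underlies the Sinkhorn divergence used for the many-to-many block matching above, alternately normalizes the rows and columns of an affinity matrix into a soft doubly-stochastic matching; a generic sketch (not the authors' exact formulation):

```python
import torch

def sinkhorn(scores, n_iters=20, eps=1e-8):
    """Turn a block-affinity matrix into a soft many-to-many matching."""
    m = torch.exp(scores)
    for _ in range(n_iters):
        m = m / (m.sum(dim=1, keepdim=True) + eps)  # normalize rows
        m = m / (m.sum(dim=0, keepdim=True) + eps)  # normalize columns
    return m

affinity = torch.randn(5, 7)  # blocks of document A vs. blocks of document B
match = sinkhorn(affinity)    # soft correspondences, rows/columns near-stochastic
```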
Procedia PDF Downloads 179
683 Being Funny is a Serious Business for Feminine Brands
Authors: Mohammed Murtuza Soofi
Abstract:
Purpose: Marketers and researchers alike have simultaneously, yet in mutually exclusive instances, promoted the use of humour by brands in their communication and the gendering of brands, as both enhance brand equity and can generate positive attitudinal responses from customers. However, the gendering of brands comes with associated gendered stereotypical expectations. The current paper consolidates the long-standing literature on gender role/stereotype theory and brand gender theories, establishing a theoretical framework for understanding how gender-based stereotypes about humour can influence consumers' attitudinal responses towards brands. Design/methodology/approach: Using parallel constraint satisfaction theory as the domain theory to explain the high-handedness of stereotypes, and gender stereotype theories (particularly around the feminine use of humour), we explain why gender-based stereotypes could constrain brand behaviors and why, in turn, feminine brands get penalised for using witty, aggressive and self-enhancing humor. Findings: Extension of gender stereotypes to anthropomorphised brands will lead consumers to judge the use of negative humour by a feminine brand as less appropriate, which will trigger a causal chain of reduced sense of communal appropriateness and brand warmth that will result in a negative attitude towards the brand. Originality/value: Brand gendering, being susceptible to gender-based stereotypes, has received very little attention in the literature, and hence the use of negative humour (stereotypically male behaviour) has never been studied in the context of gendered brands. This work also helps in understanding to what extent stereotypes will impact attitudinal responses to the brand, and when heavily gendered brands can optimise the use of humour and when they should avoid it.
Keywords: brand femininity, brand gender, gender stereotypes, humour
Procedia PDF Downloads 203
682 Developing a Model for the Relation between Heritage and Place Identity
Authors: A. Arjomand Kermani, N. Charbgoo, M. Alalhesabi
Abstract:
Given the great acceleration of change and the need for new developments in cities on the one hand, and conservation and regeneration approaches on the other, place identity and its relation to heritage context have taken on new importance. This relation is generally a mutual and complex one. The significant point in this relation is that the process of identifying something as heritage, rather than just a historical phenomenon, brings that which may be inherited into the realm of identity. In the planning and urban design domains, as well as in environmental psychology and phenomenology, place identity and its attributes and components have been studied and discussed. However, the relation between the physical environment (especially heritage) and identity has been neglected in the planning literature. This article aims to review the knowledge in this field and develop a model of the influence and relation of these two major concepts (heritage and identity). To build this conceptual model, we draw on the available literature in environmental psychology as well as planning on place identity and heritage environments, using a descriptive-analytical methodology to understand how they can inform planning strategies and governance policies. A cross-disciplinary analysis is essential to understand the nature of place identity and heritage context and to develop a more holistic model of their relationship to be employed in the planning process and decision making. Moreover, this broader and more holistic perspective would enable both social scientists and planners to learn from one another's expertise for a fuller understanding of community dynamics. The result indicates that a combination of these perspectives can provide a richer understanding, not only of how planning impacts our experience of place, but also of how place identity can impact community planning and development.
Keywords: heritage, inter-disciplinary study, place identity, planning
Procedia PDF Downloads 424
681 Challenges for the Implementation of Community Led Total Sanitation in Rural Malawi
Authors: Save Kumwenda, Khumbo Kalulu, Kondwani Chidziwisano, Limbani Kalumbi, Vincent Doyle, Bagrey Ngwira
Abstract:
Introduction: The Malawi Government, in partnership with non-governmental organizations, adopted Community Led Total Sanitation (CLTS) in 2008 as an approach to sanitation and hygiene promotion, with the aim of declaring Malawi Open Defecation Free (ODF) by 2015. While there is a significant body of research on CLTS available in the public domain, there is little research on the challenges faced in implementing CLTS in Malawi. Methods: A cross-sectional qualitative study was carried out in three districts: Ntcheu, Balaka, and Phalombe. Data were collected using Focus Group Discussions (FGDs) and Key Informant Interviews (KIIs) and analysed manually. Results: In total, 96 people took part in FGDs and 9 people in KIIs. It was shown that the choice of leaders after triggering was commonly made by chiefs, facilitators, and VHCs without following CLTS principles, as opposed to identifying individuals who showed leadership skills. Despite capacity-building initiatives involving District Coordinating Teams, a lack of resources to undertake follow-ups contributed to the failure to sustain ODF status in the community. It was also found that while most respondents appreciated the need for no subsidies, the elderly and those with disabilities felt the need for external support because they do not have money to buy strong logs and slabs for a durable toilet floor, or to hire people to build latrines for them. Conclusion: Effective implementation of CLTS requires comprehensive consideration of various issues that may affect its success.
Keywords: open defecation, community-led, sanitation, faecal matter, hygiene, Malawi
Procedia PDF Downloads 381
680 Rapid Flood Damage Assessment of Population and Crops Using Remotely Sensed Data
Authors: Urooj Saeed, Sajid Rashid Ahmad, Iqra Khalid, Sahar Mirza, Imtiaz Younas
Abstract:
Pakistan, a flood-prone country, has experienced some of its worst floods in the recent past, which have caused extensive damage to urban and rural areas through loss of lives and damage to infrastructure and agricultural fields. The country's poor flood management system has heightened the risks of damage as the increasing frequency and magnitude of floods, felt as a consequence of climate change, affect the national economy directly or indirectly. To meet the needs of flood emergencies, this paper focuses on an approach based on remotely sensed data for rapid mapping and monitoring of flood extent and its damages, so that information can be disseminated quickly, from the local to the national level. In this research study, the spatial extent of the flooding caused by the heavy rains of 2014 has been mapped using spaceborne data to assess the crop damage and affected population in sixteen districts of Punjab. For this purpose, the Moderate Resolution Imaging Spectroradiometer (MODIS) was used to mark the flood extent daily using the Normalised Difference Water Index (NDWI). The highest flood value data were integrated with LandScan 2014, a 1 km × 1 km grid-based population dataset, to calculate the affected population in the flood hazard zone. It was estimated that the floods covered an area of 16,870 square kilometers, with 3.0 million people affected. Moreover, to assess the flood damage, Object Based Image Analysis (OBIA) aided with spectral signatures was applied to a Landsat image to attain thematic layers of healthy (0.54 million acres) and damaged crops (0.43 million acres). The study finds that the population of Jhang district (28% of its 2.5 million population) was affected the most, whereas, in terms of crops, Jhang and Muzaffargarh are the 'highest damaged' ranked districts of the 2014 floods in Punjab. This study was completed within 24 hours of the peak flood time and proves to be an effective methodology for rapid assessment of damages due to flood hazards.
Keywords: flood hazard, space borne data, object based image analysis, rapid damage assessment
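The NDWI used above contrasts the green and near-infrared bands; a minimal sketch of its computation follows, where the MODIS band assignment (green = band 4, NIR = band 2) and the zero threshold are assumptions for illustration:

```python
import numpy as np

def ndwi(green, nir):
    """Normalized Difference Water Index; values near 1 indicate open water."""
    green = green.astype(np.float64)
    nir = nir.astype(np.float64)
    return (green - nir) / (green + nir + 1e-12)

# Daily flood extent: threshold the NDWI image and keep the maximum extent, e.g.
# flood_mask = ndwi(modis_band4, modis_band2) > 0.0
```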
Procedia PDF Downloads 328
679 DWT-SATS Based Detection of Image Region Cloning
Authors: Michael Zimba
Abstract:
A duplicated image region may be subjected to a number of attacks, such as noise addition, compression, reflection, rotation, and scaling, with the intention of either merely matching it to its targeted neighborhood or preventing its detection. In this paper, we present an effective and robust method of detecting duplicated regions, inclusive of those affected by the various attacks. In order to reduce the dimension of the image, the proposed algorithm first performs a discrete wavelet transform (DWT) of a suspicious image. However, unlike most existing copy-move image forgery (CMIF) detection algorithms operating in the DWT domain, which extract only the low-frequency sub-band of the DWT of the suspicious image, thereby leaving valuable information in the other three sub-bands, the proposed algorithm simultaneously extracts features from all four sub-bands. The extracted features are not only a more accurate representation of image regions but are also robust to additive noise, JPEG compression, and affine transformation. Furthermore, principal component analysis-eigenvalue decomposition (PCA-EVD) is applied to reduce the dimension of the features. The extracted features are then sorted using the more computationally efficient Radix Sort algorithm. Finally, same affine transformation selection (SATS), a duplication verification method, is applied to detect duplicated regions. The proposed algorithm is not only fast but also more robust to attacks compared to related CMIF detection algorithms. The experimental results show high detection rates.
Keywords: affine transformation, discrete wavelet transform, radix sort, SATS
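The all-sub-band feature extraction described above can be sketched with PyWavelets; the block size, step, and wavelet are illustrative assumptions:

```python
import numpy as np
import pywt

def block_features(image, block=16, step=8, wavelet="haar"):
    """Extract features from all four DWT sub-bands of each overlapping block."""
    feats, positions = [], []
    for y in range(0, image.shape[0] - block + 1, step):
        for x in range(0, image.shape[1] - block + 1, step):
            patch = image[y:y + block, x:x + block]
            cA, (cH, cV, cD) = pywt.dwt2(patch, wavelet)
            # Keep all four sub-bands, not just the low-frequency cA.
            feats.append(np.concatenate([c.ravel() for c in (cA, cH, cV, cD)]))
            positions.append((y, x))
    return np.asarray(feats), positions

# The pipeline then applies PCA-EVD for dimension reduction, sorts the
# feature vectors with Radix Sort, and verifies duplicates with SATS.
```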
Procedia PDF Downloads 230
678 Detection and Identification of Antibiotic Resistant Bacteria Using Infra-Red-Microscopy and Advanced Multivariate Analysis
Authors: Uraib Sharaha, Ahmad Salman, Eladio Rodriguez-Diaz, Elad Shufan, Klaris Riesenberg, Irving J. Bigio, Mahmoud Huleihel
Abstract:
Antimicrobial drugs have an important role in controlling illness associated with infectious diseases in animals and humans. However, the increasing resistance of bacteria to a broad spectrum of commonly used antibiotics has become a global health-care problem. Rapid determination of the antimicrobial susceptibility of a clinical isolate is often crucial for the optimal antimicrobial therapy of infected patients and in many cases can save lives. The conventional methods for susceptibility testing, like disk diffusion, are time-consuming, and other methods, including the E-test and genotyping, are relatively expensive. Fourier transform infrared (FTIR) microscopy is a rapid, safe, and low-cost method that has been widely and successfully used in different studies for the identification of various biological samples, including bacteria. The new modern infrared (IR) spectrometers with high spectral resolution enable measuring unprecedented biochemical information from cells at the molecular level. Moreover, the development of new bioinformatics analyses combined with IR spectroscopy becomes a powerful technique, which enables the detection of structural changes associated with resistance. The main goal of this study is to evaluate the potential of FTIR microscopy in tandem with machine learning algorithms for rapid and reliable identification of bacterial susceptibility to antibiotics in a time span of a few minutes. The bacterial samples, which were identified at the species level by MALDI-TOF and examined for their susceptibility by the routine assay (micro-diffusion discs), were obtained from the bacteriology laboratories in Soroka University Medical Center (SUMC). These samples were examined by FTIR microscopy and analyzed by advanced statistical methods. Our results, based on 550 E. coli samples, were promising and showed that by using the infrared spectroscopic technique together with multivariate analysis, it is possible to classify the tested bacteria into sensitive and resistant with a success rate higher than 85% for eight different antibiotics. Based on these preliminary results, it is worthwhile to continue developing the FTIR microscopy technique as a rapid and reliable method for identifying antibiotic susceptibility.
Keywords: antibiotics, E. coli, FTIR, multivariate analysis, susceptibility
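A generic sketch of the "spectra plus multivariate analysis" classification step using scikit-learn; the synthetic placeholder data and the PCA-plus-SVM pipeline are assumptions, since the authors' exact preprocessing and classifier are not specified in the abstract:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(550, 900))   # placeholder FTIR spectra (550 isolates)
y = rng.integers(0, 2, size=550)  # placeholder labels: 0 sensitive, 1 resistant

clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2%}")
```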
Procedia PDF Downloads 265
677 Identifying Enablers and Barriers of Healthcare Knowledge Transfer: A Systematic Review
Authors: Yousuf Nasser Al Khamisi
Abstract:
Purpose: This paper presents a Knowledge Transfer (KT) framework for the healthcare sector, applying a systematic literature review process to the healthcare organization domain to identify enablers and barriers of KT in healthcare. Methods: The paper conducted a systematic literature search of peer-reviewed papers that described key elements of KT, using four databases (Medline, Cinahl, Scopus, and Proquest) for a 10-year period (1/1/2008–16/10/2017). The results of the literature review were used to build a conceptual framework of KT in healthcare organizations. The author used a systematic review of the literature, as described by Barbara Kitchenham in Procedures for Performing Systematic Reviews. Findings: The paper highlighted the impacts of using the Knowledge Management (KM) concept in a healthcare organization for controlling infectious diseases in hospitals, improving family medicine performance, and enhancing quality improvement practices. Moreover, it found that good coding performance is analytically linked with a knowledge-sharing network structure rich in brokerage and hierarchy rather than in density. The unavailability or disregard of the latest evidence on more cost-effective or more efficient delivery approaches increases healthcare costs and may lead to unintended results. Originality: The search procedure produced 12,093 results, of which 3,523 were general articles about KM and KT. The titles and abstracts of these articles were screened to segregate what is related from what is not. 94 articles were identified by the researchers for full-text assessment. The total number of eligible articles after removing unrelated articles was 22.
Keywords: healthcare organisation, knowledge management, knowledge transfer, KT framework
Procedia PDF Downloads 138
676 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores
Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay
Abstract:
Automated product recognition in retail stores is an important real-world application in the domain of Computer Vision and Pattern Recognition. In this paper, we consider the problem of automatically identifying the classes of products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following the strategy of online hard negative mining for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
Keywords: retail stores, faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition
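The triplet-loss fine-tuning of the ResNet-18 encoder might look like the following PyTorch sketch; the margin, learning rate, and triplet construction are simplified assumptions (the paper mines hard negatives online rather than taking fixed triplets):

```python
import torch
import torch.nn.functional as F
from torchvision import models

encoder = models.resnet18(weights="IMAGENET1K_V1")
encoder.fc = torch.nn.Identity()  # expose the 512-d feature as the embedding
criterion = torch.nn.TripletMarginLoss(margin=1.0)
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-4)

def train_step(anchor, positive, negative):
    """One update on a (query crop, same-product, different-product) triplet."""
    embed = lambda x: F.normalize(encoder(x), dim=1)
    loss = criterion(embed(anchor), embed(positive), embed(negative))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```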
Procedia PDF Downloads 156
675 Improving Similarity Search Using Clustered Data
Authors: Deokho Kim, Wonwoo Lee, Jaewoong Lee, Teresa Ng, Gun-Ill Lee, Jiwon Jeong
Abstract:
This paper presents a method for improving object search accuracy using a deep learning model. A major limitation to providing accurate similarity with deep learning is the requirement of a huge amount of data for training pairwise similarity scores (metrics), which is impractical to collect. Thus, similarity scores are usually trained with a relatively small dataset, which comes from a different domain, causing limited accuracy in measuring similarity. For this reason, this paper proposes a deep learning model that can be trained with a significantly small amount of data: clustered data, in which each cluster contains a set of visually similar images. In order to measure similarity distance with the proposed method, visual features of two images are extracted from intermediate layers of a convolutional neural network with various pooling methods, and the network is trained with pairwise similarity scores defined as zero for images in the same cluster. The proposed method outperforms the state-of-the-art object similarity scoring techniques in evaluation for finding exact items. The proposed method achieves 86.5% accuracy, compared to 59.9% for the state-of-the-art technique. That is, an exact item can be found among four retrieved images with an accuracy of 86.5%, and the remaining retrievals are likely to be similar products. Therefore, the proposed method can greatly reduce the amount of training data, by an order of magnitude, as well as provide a reliable similarity metric.
Keywords: visual search, deep learning, convolutional neural network, machine learning
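The training signal described above (target distance zero for images in the same cluster) resembles a contrastive loss; a sketch under that assumption, with an assumed margin for cross-cluster pairs:

```python
import torch
import torch.nn.functional as F

def contrastive_loss(emb_a, emb_b, same_cluster, margin=1.0):
    """same_cluster: 1.0 if a pair comes from one cluster (pull distance to 0),
    0.0 otherwise (push apart to at least `margin`)."""
    d = F.pairwise_distance(emb_a, emb_b)
    pos = same_cluster * d.pow(2)
    neg = (1.0 - same_cluster) * F.relu(margin - d).pow(2)
    return (pos + neg).mean()
```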
Procedia PDF Downloads 215
674 Eliciting and Confirming Data, Information, Knowledge and Wisdom in a Specialist Health Care Setting - The Wicked Method
Authors: Sinead Impey, Damon Berry, Selma Furtado, Miriam Galvin, Loretto Grogan, Orla Hardiman, Lucy Hederman, Mark Heverin, Vincent Wade, Linda Douris, Declan O'Sullivan, Gaye Stephens
Abstract:
Healthcare is a knowledge-rich environment. This knowledge, while valuable, is not always accessible outside the borders of individual clinics. This research aims to address part of this problem (at a study site) by constructing a maximal data set (knowledge artefact) for motor neurone disease (MND). This data set is proposed as an initial knowledge base for a concurrent project to develop an MND patient data platform. It represents the domain knowledge at the study site for the duration of the research (12 months). A knowledge elicitation method was also developed from the lessons learned during this process: the WICKED method. WICKED is an anagram of the initials of the words 'eliciting and confirming data, information, knowledge, wisdom'. But it is also a reference to the concept of wicked problems, which are complex and challenging, as is eliciting expert knowledge. The method was evaluated at a second site, and benefits and limitations were noted. Benefits include that the method provided a systematic way to manage data, information, knowledge and wisdom (DIKW) from various sources, including healthcare specialists and existing data sets. Limitations surrounded the time required and the fact that the data set produced only represents DIKW known during the research period. Future work is underway to address these limitations.
Keywords: healthcare, knowledge acquisition, maximal data sets, action design science
Procedia PDF Downloads 360
673 A Critical Review of Risk-Based Approach for Project Management Office Development
Authors: Alin Veronika, Yusuf Latief
Abstract:
This critical review delineates the considerable deficiencies and gaps in the extant literature concerning development strategies for risk-based Project Management Offices (PMOs). Although the advantages and positive outcomes of establishing and operating PMOs are regularly articulated and acknowledged in academic discourse, the empirical evidence supporting these claims frequently lacks methodological rigor and often struggles to isolate the unique contributions and impacts of PMOs from the other organizational factors that may also play a role. This review systematically evaluates the current research on critical success factors, including strategic alignment, organizational structure, human capital, operational efficiency, technology, and the overarching influence of organizational culture, thereby identifying notable limitations within this research domain and proposing targeted areas for further scholarly investigation. Furthermore, the analysis accentuates the need for more sophisticated and nuanced risk assessment and mitigation frameworks designed specifically for the operational characteristics of PMOs, while advocating an elevated focus on the profound influence of organizational culture and its subcultures on the overall effectiveness and success of PMOs.
Keywords: organizational culture, project management office, risk management, risk-based PMO development
Procedia PDF Downloads 12
672 Folding Pathway and Thermodynamic Stability of Monomeric GroEL
Authors: Sarita Puri, Tapan K. Chaudhuri
Abstract:
Chaperonin GroEL is a tetradecameric Escherichia coli protein with identical subunits of 57 kDa. The elucidation of thermodynamic parameters related to the stability of native GroEL is not feasible, as it undergoes irreversible unfolding because of its large size (800 kDa) and multimeric nature. Nevertheless, it is important to determine the thermodynamic stability parameters for the highly stable GroEL protein, as it helps in the folding and holding of many substrate proteins during many cellular stresses. Properly folded monomers work as building blocks for the formation of native tetradecameric GroEL. The spontaneous refolding behavior of monomeric GroEL makes it suitable for protein-denaturant interaction and thermodynamic stability studies. The urea-mediated unfolding is a three-state process, which means there is the formation of one intermediate state along with the native and unfolded states. The heat-mediated denaturation is a two-state process. The unfolding process is reversible, as observed by the spontaneous refolding of the denatured protein in both the urea- and heat-mediated refolding processes. Analysis of the folding/unfolding data provides a measure of various thermodynamic stability parameters for monomeric GroEL. The proposed mechanism of unfolding of monomeric GroEL is a three-state process which involves the formation of one stable intermediate having a folded apical domain and unfolded equatorial and intermediate domains. Research in progress aims to demonstrate the importance of specific residues in the stability and oligomerization of the GroEL protein. Several mutant versions of GroEL are under investigation to resolve the above-mentioned issue.
Keywords: equilibrium unfolding, monomeric GroEl, spontaneous refolding, thermodynamic stability
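For reversible chemical denaturation of this kind, stability parameters are commonly obtained by the linear extrapolation method; the following standard relation is generic background, not necessarily the authors' exact analysis:

```latex
% Free energy of unfolding at denaturant concentration [D]:
\Delta G_{\mathrm{unf}}([D]) = \Delta G_{\mathrm{unf}}^{\mathrm{H_2O}} - m\,[D]
% \Delta G_{unf}^{H2O}: stability extrapolated to water;
% m: dependence on denaturant, related to the change in solvent-exposed area.
% For a three-state process N <-> I <-> U, each transition contributes its
% own \Delta G and m value.
```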
Procedia PDF Downloads 282
671 New Targets Promoting Oncolytic Virotherapy
Authors: Felicia Segeth, Florian G. Klein, Lea Berger, Andreas Kolk, Per S. Holm
Abstract:
The entry of oncolytic viruses (OVs) into clinical application opens the way for groundbreaking changes in current and future treatment regimens. However, despite their potent anti-cancer activity in vitro, clinical studies have revealed limitations of OVs as monotherapy. The same applies to CDK4/6 inhibitors (CDK4/6i) targeting the cell cycle, as well as bromodomain and extra-terminal domain inhibitors (BETi) targeting gene expression. In this study, the anti-tumoral effect of XVir-N-31, a YB-1-dependent oncolytic adenovirus, was evaluated in combination with Ribociclib, a CDK4/6i, and JQ1, a BETi. The head and neck squamous cell carcinoma (HNSCC) cell lines FaDu, SAS, and Cal-33 were used. DNA replication and gene expression of XVir-N-31 were measured by RT-qPCR, protein expression by western blotting, and cell lysis by SRB assays. Treatment with CDK4/6i and BETi increased viral gene expression, viral DNA replication, and viral particle formation. The data show that the combination of the oncolytic adenovirus XVir-N-31 with CDK4/6i and BETi acts highly synergistically in cancer cell lysis. Furthermore, additional molecular analyses demonstrate that the positive transcription elongation factor P-TEFb plays a decisive role in this regard, indicating an influence of the combination therapy on gene transcription control. The combination of CDK4/6i and BETi with XVir-N-31 is an attractive strategy to achieve substantial cancer cell killing and is highly suitable for clinical testing.
Keywords: adenovirus, BET, CDK4/6, HNSCC, P-TEFb, YB-1
Procedia PDF Downloads 118
670 Development and Control of Deep Seated Gravitational Slope Deformation: The Case of Colzate-Vertova Landslide, Bergamo, Northern Italy
Authors: Paola Comella, Vincenzo Francani, Paola Gattinoni
Abstract:
This paper presents the Colzate-Vertova landslide, a Deep Seated Gravitational Slope Deformation (DSGSD) located in the Seriana Valley, Northern Italy. The paper aims at describing the development as well as evaluating the factors that influence the evolution of the landslide. After defining the conceptual model of the landslide, numerical simulations were developed using a finite element numerical model, first with a two-dimensional domain, and later with a three-dimensional one. The results of the 2-D model showed a displacement field typical of a sackung, as a consequence of the erosion along the Seriana Valley. The analysis also showed that the groundwater flow could locally affect the slope stability, bringing about a reduction in the safety factor, but without reaching failure conditions. The sensitivity analysis carried out on the strength parameters pointed out that slope failure could be reached only for a significant reduction of the geotechnical characteristics. Such a result does not fit the real conditions observed on site, where a number of small failures often develop all along the hillslope. The 3-D model gave a more comprehensive analysis of the evolution of the DSGSD, also considering the border effects. The results showed that the convex profile of the slope favors the development of displacements along the lateral valley, with a relevant reduction in the safety factor, justifying the existing landslides.
Keywords: deep seated gravitational slope deformation, Italy, landslide, numerical modeling
Procedia PDF Downloads 365
669 Normalized Enterprises Architectures: Portugal's Public Procurement System Application
Authors: Tiago Sampaio, André Vasconcelos, Bruno Fragoso
Abstract:
The Normalized Systems Theory, which is designed to be applied to software architectures, provides a set of theorems, elements and rules with the purpose of enabling evolution in information systems, as well as ensuring that they are ready for change. In order to make that possible, this work's solution is to apply the Normalized Systems Theory to the domain of enterprise architectures, using ArchiMate. This application is achieved through the adaptation of the elements of this theory, making them artifacts of the modeling language. The theorems are applied through the identification of the viewpoints to be used in the architectures, as well as the transformation of the theory's encapsulation rules into architectural rules. This way, it is possible to create normalized enterprise architectures, thus fulfilling the needs and requirements of the business. This solution was demonstrated using the Portuguese Public Procurement System. The Portuguese government aims to make this system as fair as possible, allowing every organization to have the same business opportunities. The aim is for every economic operator to have access to all public tenders, which are published on any of the 6 existing platforms, independently of where they are registered. In order to make this possible, we applied our solution to the construction of two different architectures, which are capable of fulfilling the requirements of the Portuguese government. One of those architectures, TO-BE A, has a Message Broker that performs the communication between the platforms. The other, TO-BE B, represents the scenario in which the platforms communicate with each other directly. Apart from these two architectures, we also represent the AS-IS architecture, which demonstrates the current behavior of the Public Procurement System. Our evaluation is based on a comparison between the AS-IS and the TO-BE architectures regarding the fulfillment of the rules and theorems of the Normalized Systems Theory and some quality metrics.
Keywords: archimate, architecture, broker, enterprise, evolvable systems, interoperability, normalized architectures, normalized systems, normalized systems theory, platforms
Procedia PDF Downloads 357
668 Multi-Stream Graph Attention Network for Recommendation with Knowledge Graph
Abstract:
In recent years, graph neural networks have been widely used in knowledge graph recommendation. The existing recommendation methods based on graph neural networks extract information from the knowledge graph through entities and relations, which may not be an efficient way of extracting information. In order to better propagate useful entity information for the current recommendation task in the knowledge graph, we propose an end-to-end neural network model based on a multi-stream graph attention mechanism (MSGAT), which can effectively integrate the knowledge graph into the recommendation system by evaluating the importance of entities from the perspectives of both users and items. Specifically, we use the attention mechanism from the user's perspective to distil the neighborhood node information of the predicted item in the knowledge graph, to enhance the user's information on items, and to generate the feature representation of the predicted item. Since a user's historically clicked items can reflect the user's interest distribution, we propose a multi-stream attention mechanism that, based on the user's preference for entities and relationships and the similarity between the items to be predicted and entities, aggregates the neighborhood entity information of the user's historically clicked items in the knowledge graph and generates the user's feature representation. We evaluate our model on three real recommendation datasets: MovieLens-1M (ML-1M), LFM-1B 2015 (LFM-1B), and Amazon-Book (AZ-book). Experimental results show that, compared with state-of-the-art models, our proposed model can better capture the entity information in the knowledge graph, which proves the validity and accuracy of the model.
Keywords: graph attention network, knowledge graph, recommendation, information propagation
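A minimal sketch of the attention-weighted neighbor aggregation at the core of such a model, in PyTorch; this is a generic dot-product graph-attention step, not the authors' full MSGAT:

```python
import torch
import torch.nn.functional as F

def attend_neighbors(user_emb, neighbor_embs):
    """Aggregate an item's knowledge-graph neighbors, weighted by their
    relevance to the user (dot-product scoring is an assumption)."""
    scores = neighbor_embs @ user_emb   # (n_neighbors,)
    weights = F.softmax(scores, dim=0)
    return weights @ neighbor_embs      # user-specific item context vector

user = torch.randn(64)
neighbors = torch.randn(10, 64)  # embeddings of the item's neighbor entities
item_context = attend_neighbors(user, neighbors)
```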
Procedia PDF Downloads 116