Search results for: conceptual and funnel methods
14819 A Diagnostic Accuracy Study: Comparison of Two Different Molecular-Based Tests (Genotype HelicoDR and Seeplex Clar-H. pylori ACE Detection), in the Diagnosis of Helicobacter pylori Infections
Authors: Recep Kesli, Huseyin Bilgin, Yasar Unlu, Gokhan Gungor
Abstract:
Aim: The aim of this study was to compare the diagnostic values of two different molecular-based tests (GenoType® HelicoDR and Seeplex® H. pylori-ClaR-ACE Detection) in detecting the presence of H. pylori in gastric biopsy specimens. A further aim was to determine the resistance ratios of H. pylori strains isolated from gastric biopsy cultures against clarithromycin and quinolones, using both genotypic (GenoType® HelicoDR, Seeplex® H. pylori-ClaR-ACE Detection) and phenotypic (gradient strip, E-test) methods. Material and methods: A total of 266 patients admitted to the Konya Education and Research Hospital Department of Gastroenterology with dyspeptic complaints between January 2011 and June 2013 were included in the study. Microbiological and histopathological examinations of biopsy specimens taken from the antrum and corpus regions were performed. The presence of H. pylori in all biopsy samples was investigated by five different diagnostic methods: culture (C) (Portagerm pylori-PORT PYL, Pylori agar-PYL, GENbox microaer, bioMerieux, France), histology (H) (Giemsa, Hematoxylin and Eosin staining), rapid urease test (RUT) (CLOtest, Kimberly-Clark, USA), and two molecular tests: GenoType® HelicoDR (Hain, Germany), based on a DNA strip assay, and Seeplex® H. pylori-ClaR-ACE Detection (Seegene, South Korea), based on multiplex PCR. Antimicrobial resistance of the H. pylori isolates to clarithromycin and levofloxacin was determined by the GenoType® HelicoDR, Seeplex® H. pylori-ClaR-ACE Detection, and gradient strip (E-test, bioMerieux, France) methods. Culture positivity alone, or positivity of both histology and RUT together, were accepted as the gold standards for H. pylori positivity. Sensitivity and specificity rates of the two molecular methods used in the study were calculated against these two gold standards. Results: A total of 266 patients aged 16-83 years, of whom 144 (54.1 %) were female and 122 (45.9 %) were male, were included in the study. 144 patients were culture positive, and 157 were positive by both H and RUT. 179 patients were positive by both GenoType® HelicoDR and Seeplex® H. pylori-ClaR-ACE Detection. Sensitivity and specificity rates of the five methods studied were as follows: C, 80.9 % and 84.4 %; H + RUT, 88.2 % and 75.4 %; GenoType® HelicoDR, 100 % and 71.3 %; and Seeplex® H. pylori-ClaR-ACE Detection, 100 % and 71.3 %. A strong correlation was found between C and H+RUT, C and GenoType® HelicoDR, and C and Seeplex® H. pylori-ClaR-ACE Detection (r: 0.644 and p: 0.000; r: 0.757 and p: 0.000; r: 0.757 and p: 0.000, respectively). Of the 144 isolated H. pylori strains, 24 (16.6 %) were resistant to clarithromycin and 18 (12.5 %) to levofloxacin. Genotypic clarithromycin resistance was detected in only 15 cases with GenoType® HelicoDR and 6 cases with Seeplex® H. pylori-ClaR-ACE Detection. Conclusion: GenoType® HelicoDR and Seeplex® H. pylori-ClaR-ACE Detection were found to be the most sensitive diagnostic methods compared with the other methods investigated (C, H, and RUT). Keywords: Helicobacter pylori, GenoType® HelicoDR, Seeplex® H. pylori-ClaR-ACE Detection, antimicrobial resistance
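A brief illustration of how the reported diagnostic accuracy figures are obtained: the sketch below computes sensitivity and specificity from a 2x2 contingency table against a gold standard. The counts are back-calculated from the totals quoted in the abstract (266 patients, 144 culture-positive, 179 molecular-positive) purely for illustration and are not the study's raw data.

```python
# Minimal sketch: sensitivity and specificity of an index test against a gold standard.
# The example counts below are back-calculated from the abstract's totals for illustration only.

def sensitivity_specificity(tp: int, fp: int, tn: int, fn: int) -> tuple[float, float]:
    """Return (sensitivity, specificity) as percentages."""
    sensitivity = 100.0 * tp / (tp + fn)  # proportion of gold-standard positives detected
    specificity = 100.0 * tn / (tn + fp)  # proportion of gold-standard negatives correctly excluded
    return sensitivity, specificity

if __name__ == "__main__":
    # Hypothetical 2x2 table for a molecular test versus culture positivity
    sens, spec = sensitivity_specificity(tp=144, fp=35, tn=87, fn=0)
    print(f"Sensitivity: {sens:.1f} %, Specificity: {spec:.1f} %")
```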
Procedia PDF Downloads 168
14818 Conceptual Model of a Residential Waste Collection System Using ARENA Software
Authors: Bruce G. Wilson
Abstract:
The collection of municipal solid waste at the curbside is a complex operation that is repeated daily under varying circumstances around the world. There have been several attempts to develop Monte Carlo simulation models of the waste collection process dating back almost 50 years. Despite this long history, the use of simulation modeling as a planning or optimization tool for waste collection is still extremely limited in practice. Historically, simulation modeling of waste collection systems has been hampered by the limitations of computer hardware and software and by the availability of representative input data. This paper outlines the development of a Monte Carlo simulation model that overcomes many of the limitations contained in previous models. The model uses a general purpose simulation software program that is easily capable of modeling an entire waste collection network. The model treats the stops on a waste collection route as a queue of work to be processed by a collection vehicle (or server). Input data can be collected from a variety of sources including municipal geographic information systems, global positioning system recorders on collection vehicles, and weigh scales at transfer stations or treatment facilities. The result is a flexible model that is sufficiently robust that it can model the collection activities in a large municipality, while providing the flexibility to adapt to changing conditions on the collection route.Keywords: modeling, queues, residential waste collection, Monte Carlo simulation
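As a hedged illustration of the queue-of-stops idea described above, the sketch below runs a minimal Monte Carlo simulation of a collection vehicle serving a route; all distributions, parameter values and the simple capacity rule are assumptions for illustration, not the ARENA model itself.

```python
# Minimal sketch of a Monte Carlo queue model of a residential collection route.
# All parameter values (stop count, service-time and set-out distributions, truck
# capacity) are illustrative assumptions, not calibrated inputs.
import random

def simulate_route(n_stops=500, mean_service_s=18.0, mean_setout_kg=12.0,
                   truck_capacity_kg=9000.0, unload_time_s=900.0, seed=1):
    random.seed(seed)
    clock = 0.0          # elapsed route time in seconds
    load = 0.0           # current load on the vehicle (the server)
    trips_to_transfer = 0
    for _ in range(n_stops):           # stops form the queue of work
        service = random.expovariate(1.0 / mean_service_s)
        setout = random.expovariate(1.0 / mean_setout_kg)
        if load + setout > truck_capacity_kg:
            clock += unload_time_s     # travel to the transfer station and unload
            trips_to_transfer += 1
            load = 0.0
        clock += service
        load += setout
    return {"route_hours": clock / 3600.0, "transfer_trips": trips_to_transfer + 1}

print(simulate_route())
```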
Procedia PDF Downloads 400
14817 Augmented Reality in Teaching Children with Autism
Authors: Azadeh Afrasyabi, Ali Khaleghi, Aliakbar Alijarahi
Abstract:
Training at an early age is very important because of the tremendous changes that occur in adolescence, including the formation of character, physical changes and other factors. One of the most sensitive groups in this field is children with disabilities, special children who have trouble communicating with their environment. One of the emerging technologies in the field of education that can be used profitably is augmented reality, in which the combination of real-world and virtual images in real time produces new concepts that can facilitate learning. The purpose of this paper is to propose an effective training method for special and disabled children based on augmented reality. In particular, the efficiency of augmented reality in teaching children with autism is considered, and various aspects of this condition and the different learning methods in this area are examined. Keywords: technology in education, augmented reality, special education, teaching methods
Procedia PDF Downloads 371
14816 Indoor and Outdoor Forest Farming for Year-Round Food and Medicine Production, Carbon Sequestration, Soil-Building, and Climate Change Mitigation
Authors: Jerome Osentowski
Abstract:
The objective at Central Rocky Mountain Permaculture Institute has been to put in practice a sustainable way of life while growing food, medicine, and providing education. This has been done by applying methods of farming such as agroforestry, forest farming, and perennial polycultures. These methods have been found to be regenerative to the environment through carbon sequestration, soil-building, climate change mitigation, and the provision of food security. After 30 years of implementing carbon farming methods, the results are agro-diversity, self-sustaining systems, and a consistent provision of food and medicine. These results are exhibited through polyculture plantings in an outdoor forest garden spanning roughly an acre containing about 200 varieties of fruits, nuts, nitrogen-fixing trees, and medicinal herbs, and two indoor forest garden greenhouses (one Mediterranean and one Tropical) containing about 50 varieties of tropical fruits, beans, herbaceous plants and more. While the climate zone outside the greenhouse is 6, the tropical forest garden greenhouse retains an indoor climate zone of 11 with near-net-zero energy consumption through the use of a climate battery, allowing the greenhouse to serve as a year-round food producer. The effort to source food from the forest gardens is minimal compared to annual crop production. The findings at Central Rocky Mountain Permaculture Institute conclude that agroecological methods are not only beneficial but necessary in order to revive and regenerate the environment and food security.Keywords: agroecology, agroforestry, carbon farming, carbon sequestration, climate battery, food security, forest farming, forest garden, greenhouse, near-net-zero, perennial polycultures
Procedia PDF Downloads 442
14815 Experimental Chevreul’s Salt Production Methods on Copper Recovery
Authors: Turan Çalban, Oral Laçin, Abdüsselam Kurtbaş
Abstract:
The experimental methods for producing Chevreul’s salt, an intermediate-stage product in copper recovery, were investigated by reviewing the articles written on this topic. Chevreul’s salt, Cu2SO3.CuSO3.2H2O, a mixed-valence copper sulphite compound, has been obtained using different methods and reagents. Chevreul’s salt has an intense brick-red color. It is a highly stable and expensive salt, and its production plays a key role in hydrometallurgy. In recent years, research on this compound has intensified. Silva et al. reported that this salt is thermally stable up to 200 °C. Çolak et al. precipitated Chevreul’s salt using ammonia and sulphur dioxide. Çalban et al. obtained it at optimum conditions by passing SO2 through leach solutions with NH3-(NH4)2SO4. Yeşiryurt and Çalban investigated the optimum precipitation conditions of Chevreul’s salt from synthetic CuSO4 solutions containing Na2SO3. Çalban et al. achieved the precipitation of Chevreul’s salt at optimum conditions by passing SO2 through synthetic CuSO4 solutions. Çalban et al. also examined the precipitation conditions of Chevreul’s salt from synthetic aqueous CuSO4 solutions using (NH4)2SO3. In light of these studies, it can be said that Chevreul’s salt can be produced practically from both copper-bearing leach solutions and synthetic CuSO4 solutions. Keywords: Chevreul’s salt, ammonia, copper sulphite, sodium sulfite, optimum conditions
Procedia PDF Downloads 268
14814 The Analysis of Secondary Case Studies as a Starting Point for Grounded Theory Studies: An Example from the Enterprise Software Industry
Authors: Abilio Avila, Orestis Terzidis
Abstract:
A fundamental principle of Grounded Theory (GT) is to prevent the formation of preconceived theories. This implies the need to start a research study with an open mind and to avoid being absorbed by the existing literature. However, starting a new study without an understanding of the research domain and its context can be extremely challenging. This paper presents a research approach that simultaneously supports a researcher in identifying and focusing on critical areas of a research project while preventing the formation of concepts prejudiced by the current body of literature. The approach comprises four stages: selection of secondary case studies, analysis of secondary case studies, development of an initial conceptual framework, and development of an initial interview guide. The analysis of secondary case studies as a starting point for a research project allows a researcher to create a first understanding of a research area based on real-world cases without being influenced by the existing body of theory. It enables a researcher to develop, through a structured course of action, a firm guide that establishes a solid starting point for further investigations. Thus, the described approach may have significant implications for GT researchers who aim to start a study within a given research area. Keywords: grounded theory, interview guide, qualitative research, secondary case studies, secondary data analysis
Procedia PDF Downloads 266
14813 Speckle Noise Reduction Using Anisotropic Filter Based on Wavelets
Authors: Kritika Bansal, Akwinder Kaur, Shruti Gujral
Abstract:
In this paper, the denoising problem is addressed using a new hybrid technique that combines different denoising methods. Wavelet thresholding and the anisotropic diffusion filter are the two filters in our hybrid technique. Wavelet thresholding removes noise by suppressing high-frequency components, with limited edge preservation, whereas the anisotropic diffusion filter is based on a partial differential equation (PDE) and removes the speckle noise. The PDE approach preserves edges and provides better smoothing. Our new method therefore combines these two filtering methods and yields better results in terms of peak signal-to-noise ratio (PSNR), coefficient of correlation (COC) and equivalent number of looks (ENL). Keywords: denoising, anisotropic diffusion filter, multiplicative noise, speckle, wavelets
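The sketch below illustrates one plausible form of such a hybrid: universal soft thresholding of wavelet detail coefficients followed by Perona-Malik anisotropic diffusion. The ordering of the two stages, the wavelet family and all parameter values are assumptions for illustration, not the authors' exact pipeline.

```python
# Minimal sketch of a hybrid wavelet-thresholding / anisotropic-diffusion denoiser.
import numpy as np
import pywt

def wavelet_soft_threshold(img, wavelet="db4", level=2, sigma=None):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    if sigma is None:
        # robust noise estimate from the finest diagonal detail band
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    uthresh = sigma * np.sqrt(2.0 * np.log(img.size))
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(c, uthresh, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(new_coeffs, wavelet)[: img.shape[0], : img.shape[1]]

def anisotropic_diffusion(img, n_iter=15, kappa=30.0, gamma=0.15):
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # nearest-neighbour differences (N, S, E, W)
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # Perona-Malik conduction coefficients reduce smoothing across edges
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

# Hybrid pipeline: wavelet thresholding first, then edge-preserving diffusion
noisy = np.random.rand(128, 128)          # stand-in for a speckled image
denoised = anisotropic_diffusion(wavelet_soft_threshold(noisy))
```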
Procedia PDF Downloads 512
14812 Influence of Processing Regime and Contaminants on the Properties of Postconsumer Thermoplastics
Authors: Fares Alsewailem
Abstract:
Material recycling of thermoplastic waste offers a practical solution for municipal solid waste reduction. Post-consumer plastics such as polyethylene (PE), polyethylene terephthalate (PET), and polystyrene (PS) may be separated from each other by physical methods such as density difference and hence processed as single plastics. However, one should be cautious about the presence of contaminants in the waste stream in the form of paper, glue, etc., since these articles, even in trace amounts, may deteriorate the properties of the recycled plastics, especially the mechanical properties. Furthermore, melt processing methods used to recycle thermoplastics, such as extrusion and compression molding, may induce degradation of some recycled plastics such as PET and PS. In this research, it is shown that care should be taken in two respects when processing recycled plastics by melt processing: first, contaminants should be minimized as far as possible, and second, the number of melt processing steps should also be kept to a minimum. Keywords: recycling, PET, PS, HDPE, mechanical
Procedia PDF Downloads 284
14811 Modelling Mode Choice Behaviour Using Cloud Theory
Authors: Leah Wright, Trevor Townsend
Abstract:
Mode choice models are crucial instruments in the analysis of travel behaviour. These models show the relationship between an individual’s choice of transportation mode for a given O-D pair and the individual’s socioeconomic characteristics such as household size and income level, age and/or gender, and the features of the transportation system. The most popular functional forms of these models are based on Utility-Based Choice Theory, which addresses the uncertainty in the decision-making process with the use of an error term. However, with the development of artificial intelligence, many researchers have started to take a different approach to travel demand modelling. In recent times, researchers have looked at using neural networks, fuzzy logic and rough set theory to develop improved mode choice formulas. The concept of cloud theory has recently been introduced to model decision-making under uncertainty. Unlike the previously mentioned theories, cloud theory recognises a relationship between randomness and fuzziness, two of the most common types of uncertainty. This research aims to investigate the use of cloud theory in mode choice models. This paper highlights the conceptual framework of the mode choice model using cloud theory. Merging decision-making under uncertainty and mode choice models is state of the art. The cloud theory model is expected to address the issues and concerns with the nested logit and improve the design of mode choice models and their use in travel demand.Keywords: Cloud theory, decision-making, mode choice models, travel behaviour, uncertainty
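For readers unfamiliar with cloud theory, the sketch below shows the generic forward normal cloud generator that links randomness and fuzziness through the (Ex, En, He) triple; its use here as a stand-in "perceived utility" for two modes is purely an illustrative assumption, not the mode choice formulation proposed in the paper.

```python
# Minimal sketch of the forward normal cloud generator that underpins cloud theory.
# (Ex, En, He) = expectation, entropy, hyper-entropy; the utility interpretation
# below is an assumption for illustration only.
import numpy as np

def forward_cloud(ex: float, en: float, he: float, n_drops: int = 1000, seed: int = 0):
    rng = np.random.default_rng(seed)
    en_prime = rng.normal(en, he, n_drops)               # randomness of the entropy itself
    x = rng.normal(ex, np.abs(en_prime))                 # cloud drops around the expectation
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))  # certainty degree of each drop
    return x, mu

# Example: fuzzy-random "perceived utility" clouds for two modes; a higher
# membership-weighted average would suggest the preferred mode for this traveller.
car_x, car_mu = forward_cloud(ex=0.7, en=0.10, he=0.02)
bus_x, bus_mu = forward_cloud(ex=0.6, en=0.15, he=0.03)
print("car score:", np.average(car_x, weights=car_mu),
      "bus score:", np.average(bus_x, weights=bus_mu))
```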
Procedia PDF Downloads 388
14810 Assessing the Competence of Oral Surgery Trainees: A Systematic Review
Authors: Chana Pavneet
Abstract:
Background: In recent years in dentistry, a greater emphasis has been placed on competency-based education (CBE) programmes. Undergraduate and postgraduate curricula have been reformed to reflect these changes, and adopting a CBE approach has been shown to be beneficial to trainees and places an emphasis on continuous lifelong learning. The literature is vast; however, very little work has been done specifically on the assessment of competence in dentistry, and even less in oral surgery. The majority of the literature tends towards opinion pieces. Some small-scale studies have been undertaken in this area researching assessment tools which can be used to assess competence in oral surgery. However, there is a lack of general consensus on the preferable assessment methods. The aim of this review is to identify the assessment methods available and their usefulness. Methods: Electronic databases (Medline, Embase, and the Cochrane Database of Systematic Reviews) were searched. PRISMA guidelines were followed to identify relevant papers. Abstracts of studies were reviewed, and if they met the inclusion criteria, they were included in the review. Papers were reviewed against the critical appraisal skills programme (CASP) checklist and the medical education research quality instrument (MERQSI) to assess their quality and identify any bias in a systematic manner. The validity and reliability of each assessment method or tool were assessed. Results: A number of assessment methods were identified, including self-assessment, peer assessment, and direct observation of skills by someone senior. Senior assessment tended to be the preferred method, followed by self-assessment and, finally, peer assessment. The level of training was shown to affect the preferred assessment method, with one study finding peer assessment more useful for postgraduate trainees than for undergraduate trainees. Numerous tools for assessment were identified, including a checklist scale and a global rating scale. Both had their strengths and weaknesses, but the evidence was more favourable for global rating scales in terms of reliability, applicability to more clinical situations, and ease of use for examiners. Studies also looked into trainees’ opinions on assessment tools. Logbooks were not found to be significant in measuring the competence of trainees. Conclusion: There is limited literature exploring the methods and tools which assess the competence of oral surgery trainees. Current evidence shows that the most favourable assessment method and tool may differ depending on the stage of training. More research is required in this area to streamline assessment methods and tools. Keywords: competence, oral surgery, assessment, trainees, education
Procedia PDF Downloads 134
14809 Evaluating the ‘Assembled Educator’ of a Specialized Postgraduate Engineering Course Using Activity Theory and Genre Ecologies
Authors: Simon Winberg
Abstract:
The landscape of professional postgraduate education is changing: the focus of these programmes is moving from preparing candidates for a life in academia towards a focus of training in expert knowledge and skills to support industry. This is especially pronounced in engineering disciplines where increasingly more complex products are drawing on a depth of knowledge from multiple fields. This connects strongly with the broader notion of Industry 4.0 – where technology and society are being brought together to achieve more powerful and desirable products, but products whose inner workings also are more complex than before. The changes in what we do, and how we do it, has a profound impact on what industry would like universities to provide. One such change is the increased demand for taught doctoral and Masters programmes. These programmes aim to provide skills and training for professionals, to expand their knowledge of state-of-the-art tools and technologies. This paper investigates one such course, namely a Software Defined Radio (SDR) Master’s degree course. The teaching support for this course had to be drawn from an existing pool of academics, none of who were specialists in this field. The paper focuses on the kind of educator, a ‘hybrid academic’, assembled from available academic staff and bolstered by research. The conceptual framework for this paper combines Activity Theory and Genre Ecology. Activity Theory is used to reason about learning and interactions during the course, and Genre Ecology is used to model building and sharing of technical knowledge related to using tools and artifacts. Data were obtained from meetings with students and lecturers, logs, project reports, and course evaluations. The findings show how the course, which was initially academically-oriented, metamorphosed into a tool-dominant peer-learning structure, largely supported by the sharing of technical tool-based knowledge. While the academic staff could address gaps in the participants’ fundamental knowledge of radio systems, the participants brought with them extensive specialized knowledge and tool experience which they shared with the class. This created a complicated dynamic in the class, which centered largely on engagements with technology artifacts, such as simulators, from which knowledge was built. The course was characterized by a richness of ‘epistemic objects’, which is to say objects that had knowledge-generating qualities. A significant portion of the course curriculum had to be adapted, and the learning methods changed to accommodate the dynamic interactions that occurred during classes. This paper explains the SDR Masters course in terms of conflicts and innovations in its activity system, as well as the continually hybridizing genre ecology to show how the structuring and resource-dependence of the course transformed from its initial ‘traditional’ academic structure to a more entangled arrangement over time. It is hoped that insights from this paper would benefit other educators involved in the design and teaching of similar types of specialized professional postgraduate taught programmes.Keywords: professional postgraduate education, taught masters, engineering education, software defined radio
Procedia PDF Downloads 92
14808 Effect of Different Processing Methods on the Proximate, Functional, Sensory, and Nutritional Properties of Weaning Foods Formulated from Maize (Zea mays) and Soybean (Glycine max) Flour Blends
Authors: C. O. Agu, C. C. Okafor
Abstract:
Maize and soybean flours were produced using different methods of processing, which included fermentation (FWF), roasting (RWF) and malting (MWF). Products from the different methods were mixed in the ratio 60:40 maize/soybean, respectively. These composites, mixed with other ingredients such as sugar, vegetable oil, vanilla flavour and vitamin mix, were analyzed for proximate composition and physical/functional, sensory and nutritional properties. The results for the protein content ranged between 6.25% and 16.65%, with sample RWF having the highest value. Crude fibre values ranged from 3.72 to 10.0%, carbohydrate from 58.98% to 64.2%, and ash from 1.27 to 2.45%. Physical and functional properties such as bulk density, wettability and gelation capacity had values between 0.74 and 0.76 g/ml, 20.33 and 46.33 min, and 0.73 and 0.93 g/ml, respectively. For sensory quality, colour, flavour, taste, texture and general acceptability were determined. In terms of colour and flavour there was no significant difference (P < 0.05), while the values for taste ranged between 4.89 and 7.11, texture 5.50 to 8.38, and general acceptability 6.09 to 7.89. Nutritionally, there is no significant difference (P < 0.05) between sample RWF and the control in all parameters considered. Samples FWF and MWF showed significantly (P < 0.05) lower values in all parameters determined. In light of the above findings, the roasting method is highly recommended in the production of weaning foods. Keywords: fermentation, malting, ratio, roasting, wettability
Procedia PDF Downloads 304
14807 Efficient Chiller Plant Control Using Modern Reinforcement Learning
Authors: Jingwei Du
Abstract:
The need to optimize air conditioning systems in existing buildings calls for control methods designed with energy efficiency as a primary goal. The majority of current control methods boil down to two categories: empirical and model-based. To be effective, the former heavily relies on engineering expertise, and the latter requires extensive historical data. Reinforcement Learning (RL), on the other hand, is a model-free approach that explores the environment to obtain an optimal control strategy, often referred to as a “policy”. This research adopts Proximal Policy Optimization (PPO) to improve chiller plant control and to enable the RL agent to collaborate with experienced engineers. It exploits the fact that while the industry lacks historical data, abundant operational data is available and allows the agent to learn and evolve safely under human supervision. Thanks to the development of language models, renewed interest in RL has led to modern, online, policy-based RL algorithms such as PPO. This research took inspiration from “alignment”, a process that utilizes human feedback to fine-tune the pretrained model in cases of unsafe content. The methodology can be summarized in three steps. First, an initial policy model is generated based on minimal prior knowledge. Next, the prepared PPO agent is deployed so that feedback from both the critic model and human experts can be collected for future fine-tuning. Finally, the agent learns and adapts itself to the specific chiller plant, updates the policy model and is ready for the next iteration. Besides the proposed approach, this study also used traditional RL methods to optimize the same simulated chiller plants for comparison, and it turns out that the proposed method is both safe and effective and needs little to no historical data to start up. Keywords: chiller plant, control methods, energy efficiency, proximal policy optimization, reinforcement learning
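As a hedged illustration of the learning rule behind PPO, the sketch below computes the clipped surrogate objective on a batch of log-probabilities and advantages; the clip ratio and tensor shapes are illustrative assumptions, and no claim is made about the chiller-plant agent's actual architecture or hyperparameters.

```python
# Minimal sketch of the PPO clipped surrogate objective.
import torch

def ppo_loss(new_log_probs, old_log_probs, advantages, clip_eps=0.2):
    ratio = torch.exp(new_log_probs - old_log_probs)          # pi_new / pi_old
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    return -torch.mean(torch.min(unclipped, clipped))         # maximise the surrogate

# Example update step on a placeholder batch collected under human supervision
new_lp = torch.randn(64, requires_grad=True)
old_lp = new_lp.detach() + 0.05 * torch.randn(64)
adv = torch.randn(64)
loss = ppo_loss(new_lp, old_lp, adv)
loss.backward()   # gradients would drive the policy update in a full agent
```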
Procedia PDF Downloads 29
14806 Soil Degradation Mapping Using Geographic Information System, Remote Sensing and Laboratory Analysis in the Oum Er Rbia High Basin, Middle Atlas, Morocco
Authors: Aafaf El Jazouli, Ahmed Barakat, Rida Khellouk
Abstract:
Mapping of soil degradation is derived from field observations, laboratory measurements, and remote sensing data, integrated with quantitative methods, to map the spatial characteristics of soil properties at different spatial and temporal scales and provide up-to-date information on the field. Since soil salinity, texture and organic matter play a vital role in assessing topsoil characteristics and soil quality, remote sensing can be considered an effective method for studying these properties. The main objective of this research is to assess soil degradation by combining remote sensing data and laboratory analysis. In order to achieve this goal, soil samples were taken at 50 locations in the upper basin of Oum Er Rbia in the Middle Atlas in Morocco. These samples were dried, sieved to 2 mm and analyzed in the laboratory. Landsat 8 OLI imagery was analyzed using physical or empirical methods to derive soil properties. In addition, remote sensing can serve as a supporting data source. Deterministic (spline and inverse distance weighting) and probabilistic (ordinary kriging and universal kriging) interpolation methods were used to produce maps of each grain size class and soil property using GIS software. As a result, a correlation was found between soil texture and soil organic matter content. This approach, developed in ongoing research, will improve the prospects for the use of remote sensing data for mapping soil degradation in arid and semi-arid environments. Keywords: soil degradation, GIS, interpolation methods (spline, IDW, kriging), Landsat 8 OLI, Oum Er Rbia high basin
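As an illustration of one of the deterministic interpolators mentioned above, the sketch below implements basic inverse distance weighting over 50 hypothetical sampling locations; the coordinates, the interpolated property and the power parameter are placeholders, not the study's data.

```python
# Minimal sketch of inverse distance weighting (IDW) interpolation.
import numpy as np

def idw(sample_xy: np.ndarray, sample_vals: np.ndarray,
        query_xy: np.ndarray, power: float = 2.0) -> np.ndarray:
    # pairwise distances between query points and soil-sample locations
    d = np.linalg.norm(query_xy[:, None, :] - sample_xy[None, :, :], axis=-1)
    d = np.maximum(d, 1e-12)              # avoid division by zero at sample points
    w = 1.0 / d ** power                  # closer samples receive larger weights
    return (w * sample_vals[None, :]).sum(axis=1) / w.sum(axis=1)

# 50 hypothetical sampling locations with, e.g., organic-matter percentages
rng = np.random.default_rng(0)
xy = rng.uniform(0, 1000, size=(50, 2))
om = rng.uniform(0.5, 4.0, size=50)
grid = np.stack(np.meshgrid(np.linspace(0, 1000, 20),
                            np.linspace(0, 1000, 20)), axis=-1).reshape(-1, 2)
om_map = idw(xy, om, grid)               # interpolated organic-matter surface
```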
Procedia PDF Downloads 165
14805 Neural Networks with Different Initialization Methods for Depression Detection
Authors: Tianle Yang
Abstract:
As a common mental disorder, depression is a leading cause of various diseases worldwide. Early detection and treatment of depression can dramatically promote remission and prevent relapse. However, conventional methods of depression diagnosis require considerable human effort and impose an economic burden, while still being prone to misdiagnosis. On the other hand, recent studies report that physical characteristics are major contributors to the diagnosis of depression, which inspires us to mine the underlying relationship with neural networks instead of relying on clinical experience. In this paper, neural networks are constructed to predict depression from physical characteristics. Two initialization methods are examined: Xavier and Kaiming initialization. Experimental results show that a 3-layer neural network with Kaiming initialization achieves 83% accuracy. Keywords: depression, neural network, Xavier initialization, Kaiming initialization
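The sketch below shows the two initialization schemes being compared, applied to a small 3-layer feed-forward network in PyTorch; the layer widths and input feature count are illustrative assumptions, not the architecture used in the study.

```python
# Minimal sketch of Xavier vs. Kaiming initialization for a 3-layer classifier.
import torch
import torch.nn as nn

def make_net(init: str, n_features: int = 20) -> nn.Sequential:
    net = nn.Sequential(
        nn.Linear(n_features, 64), nn.ReLU(),
        nn.Linear(64, 32), nn.ReLU(),
        nn.Linear(32, 2),                      # depressed / not depressed
    )
    for m in net.modules():
        if isinstance(m, nn.Linear):
            if init == "xavier":
                nn.init.xavier_uniform_(m.weight)
            elif init == "kaiming":
                nn.init.kaiming_uniform_(m.weight, nonlinearity="relu")
            nn.init.zeros_(m.bias)
    return net

xavier_net = make_net("xavier")
kaiming_net = make_net("kaiming")   # reported above as the better-performing choice
```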
Procedia PDF Downloads 128
14804 A Method to Saturation Modeling of Synchronous Machines in d-q Axes
Authors: Mohamed Arbi Khlifi, Badr M. Alshammari
Abstract:
This paper discusses general methods of representing saturation in the steady-state, two-axis (d & q) frame models of synchronous machines. In particular, the important role of the magnetic coupling between the d-q axes (the cross-magnetizing phenomenon) is demonstrated. For that purpose, distinct methods of saturation modeling of the damper synchronous machine with cross-saturation are identified, and detailed model synthesis in the d-q axes is carried out. A number of models are given in the final developed form. The procedure and the novel models are verified by a critical application to prove the validity of the method, and the equivalence between all developed models is reported. Advantages of some of the models over the existing ones and their applicability are discussed. Keywords: cross-magnetizing, models synthesis, synchronous machine, saturated modeling, state-space vectors
Procedia PDF Downloads 454
14803 Gene Prediction in DNA Sequences Using an Ensemble Algorithm Based on Goertzel Algorithm and Anti-Notch Filter
Authors: Hamidreza Saberkari, Mousa Shamsi, Hossein Ahmadi, Saeed Vaali, , MohammadHossein Sedaaghi
Abstract:
In recent years, using signal processing tools for the accurate identification of protein coding regions has become a challenge in bioinformatics. Most genomic signal processing methods are based on the period-3 characteristic of the nucleotides in DNA strands; consequently, spectral analysis is applied to the numerical sequences of DNA to find the location of periodic components. In this paper, a novel ensemble algorithm for gene selection in DNA sequences is presented, based on the combination of the Goertzel algorithm and an anti-notch filter (ANF). The proposed algorithm has many advantages when compared to other conventional methods. Firstly, it identifies protein coding regions more accurately because the Goertzel algorithm is tuned to the desired frequency. Secondly, a faster detection time is achieved. The proposed algorithm is applied to several genes, including genes available in the BG570 and HMR195 databases, and the results are compared to other methods based on nucleotide-level evaluation criteria. Implementation results show the excellent performance of the proposed algorithm in identifying protein coding regions, specifically in the identification of small-scale gene regions. Keywords: protein coding regions, period-3, anti-notch filter, Goertzel algorithm
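As a hedged illustration of the core spectral step, the sketch below runs a Goertzel detector tuned to the period-3 frequency (2*pi/3) over sliding windows of binary indicator sequences; the window length and step are assumptions, and the anti-notch filtering stage of the ensemble is omitted.

```python
# Minimal sketch of a Goertzel detector tuned to the period-3 frequency of DNA.
import numpy as np

def goertzel_power(x: np.ndarray, omega: float) -> float:
    coeff = 2.0 * np.cos(omega)
    s_prev, s_prev2 = 0.0, 0.0
    for sample in x:
        s = sample + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

def period3_profile(dna: str, window: int = 351, step: int = 3) -> np.ndarray:
    omega = 2.0 * np.pi / 3.0
    # one binary indicator sequence per nucleotide, spectral power summed over all four
    indicators = {b: np.array([1.0 if c == b else 0.0 for c in dna]) for b in "ACGT"}
    scores = []
    for start in range(0, len(dna) - window + 1, step):
        scores.append(sum(goertzel_power(seq[start:start + window], omega)
                          for seq in indicators.values()))
    return np.array(scores)   # peaks suggest protein-coding (period-3) regions

print(period3_profile("ATG" * 200 + "ATATATCGCG" * 30)[:5])
```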
Procedia PDF Downloads 387
14802 Frames as Interests and Goals: The Case of MedTech Entrepreneurs' Capital Raising Strategies in Australia
Authors: Joelle Hawa, Michael Gilding
Abstract:
The role of interest as a driver of action has been the subject of ongoing debate in the sociological sciences. This paper presents evidence of how economic actors frame their environment in terms of interests and goals in order to take action. It introduces the concept of the 'dynamic actor compass', a cognitive tool that is socially contingent and allows economic actors to navigate their environment, evaluate the level of alignment of interests and goals with other players, and decide whether or not they are willing to rely on, collaborate with, or partner with others in the field. The paper builds on Kaplan’s model of framing contests and integrates Max Weber’s interests and ideas construct as well as Beckert’s concept of fictional expectations. The author illustrates this conceptual framework in the case of MedTech entrepreneurs’ capital raising activities in Australia. The study adopts a grounded theory methodology, running in-depth interviews with 24 MedTech entrepreneurs in order to examine their decision-making processes and the actions they take to finance their innovation trajectory. The findings show that participants take into account material and ideal interests and goals that they impose on, adapt to, or negotiate with other actors in their environment. These interactions affect the way MedTech entrepreneurs perceive other funders in the field, influencing their capital raising strategies. Keywords: expectations, financing innovation, frames, goals, interest-oriented action, managerial cognition
Procedia PDF Downloads 141
14801 Presenting a Model of Empowering New Knowledge-Based Companies in the Iran Insurance Industry
Authors: Pedram Saadati, Zahra Nazari
Abstract:
In the last decade, the role and importance of knowledge-based technological businesses in the insurance industry have greatly increased, and due to the weakness of previous studies in Iran, the current research deals with the design of an InsurTech empowerment model. In order to obtain the conceptual model of the research, a hybrid framework has been used. The statistical population of the research consisted of experts in the qualitative part and InsurTech activists in the quantitative part. The data collection tools were in-depth and semi-structured interviews and a structured self-interaction matrix in the qualitative part, and a researcher-made questionnaire in the quantitative part. In the qualitative part, 55 indicators, 20 components and 8 concepts (dimensions) were obtained by the content analysis method; the relationships of the concepts with each other and the levels of the components were then investigated. In the quantitative part, the information was analyzed using descriptive-analytical methods, namely path analysis and confirmatory factor analysis. The proposed model consists of eight dimensions: supporter capability, supervision of the insurance innovation ecosystem, and managerial, financial, technological, marketing, opportunity-identification and innovative InsurTech capabilities. The results of the statistical tests identifying the relationships of the concepts with each other are examined in detail, and suggestions are presented in the conclusion section. Keywords: InsurTech, knowledge-based, empowerment model, factor analysis, insurance
Procedia PDF Downloads 46
14800 Sentiment Classification Using Enhanced Contextual Valence Shifters
Authors: Vo Ngoc Phu, Phan Thi Tuoi
Abstract:
We have explored different methods of improving the accuracy of sentiment classification. The sentiment orientation of a document can be positive (+), negative (-), or neutral (0). We combine five dictionaries from [2, 3, 4, 5, 6] into a new one with 21,137 entries. The new dictionary has many verbs, adverbs, phrases and idioms that are not in the five previous ones. The paper shows that our proposed method, based on the combination of the Term-Counting method and the Enhanced Contextual Valence Shifters method, has improved the accuracy of sentiment classification. The combined method has an accuracy of 68.984% on the testing dataset and 69.224% on the training dataset. All of these methods are implemented to classify reviews based on our new dictionary and the Internet Movie dataset. Keywords: sentiment classification, sentiment orientation, valence shifters, contextual valence shifters, term counting
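A minimal sketch of term counting combined with contextual valence shifters is given below; the tiny lexicon, the shifter window of two preceding words and the scoring weights are placeholders for illustration only, not the 21,137-entry dictionary or the exact rules used in the paper.

```python
# Minimal sketch of term counting with contextual valence shifters
# (negators flip polarity, intensifiers scale it).
POLARITY = {"good": 1, "great": 2, "bad": -1, "awful": -2, "boring": -1}
NEGATORS = {"not", "never", "no"}
INTENSIFIERS = {"very": 1.5, "extremely": 2.0, "slightly": 0.5}

def classify(review: str) -> str:
    words = review.lower().split()
    score = 0.0
    for i, w in enumerate(words):
        if w not in POLARITY:
            continue
        value = float(POLARITY[w])
        window = words[max(0, i - 2): i]          # shifters in the two preceding words
        for shifter in window:
            if shifter in NEGATORS:
                value = -value                    # contextual polarity reversal
            elif shifter in INTENSIFIERS:
                value *= INTENSIFIERS[shifter]
        score += value
    return "+" if score > 0 else "-" if score < 0 else "0"

print(classify("the movie was not good and extremely boring"))   # "-"
```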
Procedia PDF Downloads 504
14799 Computable Difference Matrix for Synonyms in the Holy Quran
Authors: Mohamed Ali Al Shaari, Khalid M. El Fitori
Abstract:
In the field of Quran studies known as Ghareeb Al Quran (the study of the meanings of strange words and structures in the Holy Quran), it is difficult to distinguish some pragmatic meanings from conceptual meanings. One who wants to study this subject may need to look for a common usage between two or more words in order to understand the general meaning, and sometimes may need to look for the common differences between them, even if they are synonyms (word sisters). Some distinguished scholars of Arabic linguistics believe that there are no synonymous words; rather, they believe in varieties of meaning and multi-context usage. Based on this viewpoint, our method was designed to look for the synonyms of a word, and then for the differences that distinguish the word from its synonyms. There are many available books that use such a method, e.g., synonym books, dictionaries, glossaries, and some books on the interpretation of the strange vocabulary of the Holy Quran, but it is difficult to look up words in these written works. For that reason, we propose a logical entity, which we call the Differences Matrix (DM). The DM groups synonymous words to extract the relations between them and to identify the general meaning that defines the skeleton of all the word's synonyms; this meaning is expressed by one of its sister words. In the Differences Matrix, we used the sisters (words) as titles for rows and columns, and in the obtained cells we tried to define the row title (word) by using the column title (its sister), so that the relations between sisters appear; the expected result is well-defined groups of sisters for each word. We represented the obtained results formally and used the defined groups as a basis for building the ontology of the Holy Quran synonyms. Keywords: Quran, synonyms, differences matrix, ontology
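The sketch below gives one possible data-structure reading of the Differences Matrix, with sister words labelling rows and columns and each cell holding the distinguishing note; the English example words and the grouping rule are illustrative assumptions, not the formal representation used for the Quranic ontology.

```python
# Minimal sketch of a Differences Matrix (DM) as a dictionary of dictionaries.
from collections import defaultdict

class DifferencesMatrix:
    def __init__(self, sisters):
        self.sisters = list(sisters)
        self.cells = defaultdict(dict)          # cells[row][col] = distinguishing note

    def define(self, row_word, col_word, difference):
        """Record how the row word is distinguished when defined through the column word."""
        self.cells[row_word][col_word] = difference

    def groups(self):
        """Group sisters that share the same set of distinguishing relations."""
        by_signature = defaultdict(list)
        for w in self.sisters:
            signature = frozenset(self.cells[w].items())
            by_signature[signature].append(w)
        return list(by_signature.values())

dm = DifferencesMatrix(["fear", "dread", "terror"])
dm.define("dread", "fear", "anticipatory, directed at a future event")
dm.define("terror", "fear", "extreme and overwhelming in degree")
print(dm.groups())
```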
Procedia PDF Downloads 419
14798 Quintic Spline Solution of Fourth-Order Parabolic Equations Arising in Beam Theory
Authors: Reza Mohammadi, Mahdieh Sahebi
Abstract:
We develop a method based on polynomial quintic splines for the numerical solution of the fourth-order non-homogeneous parabolic partial differential equation with variable coefficients. By using polynomial quintic splines at off-step points in space and finite differences in the time direction, we obtain two three-level implicit methods. A stability analysis of the presented method has been carried out. We solve four test problems numerically to validate the derived method. Numerical comparison with other methods shows the superiority of the presented scheme. Keywords: fourth-order parabolic equation, variable coefficient, polynomial quintic spline, off-step points
Procedia PDF Downloads 352
14797 Applying the Extreme-Based Teaching Model in Post-Secondary Online Classroom Setting: A Field Experiment
Authors: Leon Pan
Abstract:
The first programming course within post-secondary education has long been recognized as a challenging endeavor for both educators and students alike. Historically, these courses have exhibited high failure rates and a notable number of dropouts. Instructors often lament students' lack of effort in their coursework, and students often express frustration that the teaching methods employed are not effective. Drawing inspiration from the successful principles of Extreme Programming, this study introduces an approach—the Extremes-based teaching model — aimed at enhancing the teaching of introductory programming courses. To empirically determine the effectiveness of the model, a comparison was made between a section taught using the extreme-based model and another utilizing traditional teaching methods. Notably, the extreme-based teaching class required students to work collaboratively on projects while also demanding continuous assessment and performance enhancement within groups. This paper details the application of the extreme-based model within the post-secondary online classroom context and presents the compelling results that emphasize its effectiveness in advancing the teaching and learning experiences. The extreme-based model led to a significant increase of 13.46 points in the weighted total average and a commendable 10% reduction in the failure rate.Keywords: extreme-based teaching model, innovative pedagogical methods, project-based learning, team-based learning
Procedia PDF Downloads 59
14796 Towards a Robust Patch Based Multi-View Stereo Technique for Textureless and Occluded 3D Reconstruction
Authors: Ben Haines, Li Bai
Abstract:
Patch-based reconstruction methods have been, and still are, among the top-performing approaches to 3D reconstruction to date. Their local approach to refining the position and orientation of a patch, free of global minimisation and independent of surface smoothness, makes patch-based methods extremely powerful in recovering fine-grained detail of an object's surface. However, patch-based approaches still fail to faithfully reconstruct textureless or highly occluded surface regions; thus, though performing well under lab conditions, they deteriorate in industrial or real-world situations. They are also computationally expensive. Current patch-based methods generate point clouds with holes in textureless or occluded regions that require expensive energy minimisation techniques to fill and interpolate into a high-fidelity reconstruction. Such shortcomings hinder the adoption of these methods for industrial applications, where object surfaces are often highly textureless and the speed of reconstruction is an important factor. This paper presents ongoing work towards a multi-resolution approach to address these problems, utilizing particle swarm optimisation to reconstruct high-fidelity geometry and increasing robustness to textureless features through an adapted approach to the normalised cross correlation. The work also aims to speed up the reconstruction using advances in GPU technologies and remove the need for costly initialization and expansion. Through the combination of these enhancements, it is the intention of this work to create denser patch clouds even in textureless regions within a reasonable time. Initial results show the potential of such an approach to construct denser point clouds with a comparable accuracy to that of the current top-performing algorithms. Keywords: 3D reconstruction, multiview stereo, particle swarm optimisation, photo consistency
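As a hedged illustration of the photo-consistency measure mentioned above, the sketch below computes a normalised cross correlation score between a reference and a source patch, with a simple guard for near-textureless patches; the guard is an assumption standing in for the paper's adapted NCC.

```python
# Minimal sketch of a normalised cross correlation (NCC) photo-consistency score.
import numpy as np

def ncc(patch_ref: np.ndarray, patch_src: np.ndarray, eps: float = 1e-6) -> float:
    a = patch_ref.astype(float).ravel()
    b = patch_src.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom < eps:                      # near-textureless patch: score is unreliable
        return 0.0
    return float((a * b).sum() / denom)  # 1.0 = photo-consistent, -1.0 = inverted

ref = np.random.rand(7, 7)
src = ref * 1.3 + 0.05                  # same texture under a gain/offset change
print(ncc(ref, src))                    # close to 1.0: patch accepted as consistent
```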
Procedia PDF Downloads 203
14795 Passive Solar Water Concepts for Human Comfort
Authors: Eyibo Ebengeobong Eddie
Abstract:
Taking advantage of the sun's position when designing buildings to ensure human comfort has always been an important aspect of architectural design. Using simple and inexpensive methods and systems for harnessing solar energy for heating and cooling has always been a great advantage to the users and occupants of a building. Over the years, techniques and methods have been created, and more are being discovered, to help reduce the energy demands of any building. Architects have made effective use of a building's orientation, building materials and elements to achieve lower energy demand. This paper discusses the various techniques used in solar heating and passive cooling of buildings, and how water-based techniques and concepts can be used to achieve thermal comfort. Keywords: comfort, passive, solar, water
Procedia PDF Downloads 460
14794 Dosimetric Comparison of Conventional Optimization Methods with Inverse Planning Simulated Annealing Technique
Authors: Shraddha Srivastava, N. K. Painuly, S. P. Mishra, Navin Singh, Muhsin Punchankandy, Kirti Srivastava, M. L. B. Bhatt
Abstract:
Various optimization methods used in interstitial brachytherapy are based on the alteration of dwell positions and dwell weights to produce a dose distribution based on the implant geometry. Since these optimization schemes are not anatomy based, they could lead to deviations from the desired plan. This study was therefore carried out to compare the anatomy-based Inverse Planning Simulated Annealing (IPSA) optimization technique with graphical and geometrical optimization methods in interstitial high-dose-rate brachytherapy planning of cervical carcinoma. Six patients with 12 CT data sets of MUPIT implants in HDR brachytherapy of cervical cancer were prospectively studied. The HR-CTV and organs at risk (OARs) were contoured in the Oncentra treatment planning system (TPS) using the GYN GEC-ESTRO guidelines on cervical carcinoma. Three sets of plans were generated for each fraction using the IPSA, graphical optimization (GrOPT) and geometrical optimization (GOPT) methods. All patients were treated to a dose of 20 Gy in 2 fractions. The main objective was to cover at least 95% of the HR-CTV with 100% of the prescribed dose (V100 ≥ 95% of HR-CTV). IPSA-, GrOPT- and GOPT-based plans were compared in terms of target coverage, OAR doses, homogeneity index (HI) and conformity index (COIN) using dose-volume histograms (DVHs). Target volume coverage (mean V100) was found to be 93.98 ± 0.87%, 91.34 ± 1.02% and 85.05 ± 2.84% for IPSA, GrOPT and GOPT plans, respectively. Mean D90 (minimum dose received by 90% of the HR-CTV) values for IPSA, GrOPT and GOPT plans were 10.19 ± 1.07 Gy, 10.17 ± 0.12 Gy and 7.99 ± 1.0 Gy, respectively, while D100 (minimum dose received by 100% of the HR-CTV volume) for IPSA, GrOPT and GOPT plans was 6.55 ± 0.85 Gy, 6.55 ± 0.65 Gy and 4.73 ± 0.14 Gy, respectively. IPSA plans resulted in lower doses to the bladder (D₂ Keywords: cervical cancer, HDR brachytherapy, IPSA, MUPIT
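As an illustration of how the DVH indices compared above can be computed, the sketch below derives V100, D90, D100 and one common form of homogeneity index from per-voxel HR-CTV doses; the random dose values and the specific HI definition are illustrative assumptions, not the study's data or the TPS formulas.

```python
# Minimal sketch of DVH-derived plan indices from per-voxel target doses.
import numpy as np

def dvh_metrics(doses_gy: np.ndarray, prescription_gy: float) -> dict:
    v100_frac = np.mean(doses_gy >= prescription_gy)          # fraction of volume at 100% dose
    v150_frac = np.mean(doses_gy >= 1.5 * prescription_gy)    # fraction at 150% dose
    d90 = np.percentile(doses_gy, 10)    # min dose to the hottest 90% of the volume
    d100 = doses_gy.min()                # min dose to the whole volume
    hi = (v100_frac - v150_frac) / max(v100_frac, 1e-9)       # one common HI form
    return {"V100_%": 100.0 * v100_frac, "D90_Gy": d90, "D100_Gy": d100, "HI": hi}

rng = np.random.default_rng(0)
hrctv_doses = rng.normal(loc=11.0, scale=1.5, size=20000)     # hypothetical per-voxel doses (Gy)
print(dvh_metrics(hrctv_doses, prescription_gy=10.0))
```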
Procedia PDF Downloads 187
14793 Optimization of a Method of Total RNA Extraction from Mentha piperita
Authors: Soheila Afkar
Abstract:
Mentha piperita is a medicinal plant that contains large amounts of secondary metabolites that have an adverse effect on RNA extraction. Since high-quality RNA is the first requirement for real-time PCR, in this study the optimization of total RNA isolation from leaf tissues of Mentha piperita was evaluated. From this point of view, we evaluated two different total RNA extraction methods on leaves of Mentha piperita to find the one that yields the highest quality. The methods tested were RNX-plus and modified RNX-plus (numbers 1-5). RNA quality was analyzed on a 1.5% agarose gel. The RNA integrity was also assessed by visualization of ribosomal RNA bands on 1.5% agarose gels. In the modified RNX-plus method (number 2), the integrity of the 28S and 18S rRNA was highly satisfactory when analyzed on a denaturing agarose gel, so this method is suitable for RNA isolation from Mentha piperita. Keywords: Mentha piperita, polyphenol, polysaccharide, RNA extraction
Procedia PDF Downloads 190
14792 Modern Methods of Technology and Organization of Production of Construction Works during the Implementation of Construction 3D Printers
Authors: Azizakhanim Maharramli
Abstract:
The gradual transition from entrenched traditional technology and organization of construction production to innovative additive construction technology inevitably meets technological, technical, organizational, labour and, finally, social difficulties, including the widespread perception that the labour force will be subjected to a strong wave of reduction. The chosen nodal method will lead to the elimination of the above difficulties by combining some of the usual methods of construction with additive technology. The nodal method of additive technology will create favourable conditions for an optimal distribution of labour across facilities due to the consistent performance of homogeneous work and the introduction of additive technology alongside traditional technology in construction production. Keywords: parallel method, sequential method, stream method, combined method, nodal method
Procedia PDF Downloads 94
14791 Assessing Lithium Recovery from Secondary Sources
Authors: Carolina A. Santos, Alexandra B. Ribeiro
Abstract:
Climate change and environmental degradation are threats to humanity. Europe has been addressing these problems, namely through the Green Deal, with the use of batteries in the mobility and energy fields. However, these require the use of critical raw materials, like lithium, whose demand is estimated to grow 60-fold in the next 30 years. Thus, it is fundamental to promote a circular economy with lithium recovery from secondary resources. These are key topics today, which will be even more relevant in the future, so a new way to approach them is needed and must be encouraged. Therefore, one of our main goals is to analyse two methods of lithium retrieval from secondary sources, bioleaching and electrodialysis, and to assess them with regard to their sustainability. The latest results show good removal efficiency with both methods, even though there are some matrix interferences. Hence, further investment and research are needed in order to make this process sustainable and our society more circular. Keywords: lithium, sustainable mining, social license to operate, bioleaching, electrodialysis
Procedia PDF Downloads 130
14790 Reduction of Peak Input Currents during Charge Pump Boosting in Monolithically Integrated High-Voltage Generators
Authors: Jan Doutreloigne
Abstract:
This paper describes two methods for the reduction of the peak input current during the boosting of Dickson charge pumps. Both methods are implemented in the fully integrated Dickson charge pumps of a high-voltage display driver chip for smart-card applications. Experimental results reveal good correspondence with Spice simulations and show a reduction of the peak input current by a factor of 6 during boosting. Keywords: bi-stable display driver, Dickson charge pump, high-voltage generator, peak current reduction, sub-pump boosting, variable frequency boosting
Procedia PDF Downloads 456