Search results for: Lyapunov methods
14549 Assessing the Competence of Oral Surgery Trainees: A Systematic Review
Authors: Chana Pavneet
Abstract:
Background: In recent years in dentistry, a greater emphasis has been placed on competency-based education (CBE) programmes. Undergraduate and postgraduate curricula have been reformed to reflect these changes, and adopting a CBE approach has been shown to be beneficial to trainees and places an emphasis on continuous lifelong learning. The literature is vast; however, very little work has been done specifically on the assessment of competence in dentistry and even less so in oral surgery. The majority of the literature consists of opinion pieces. Some small-scale studies have been undertaken in this area researching assessment tools which can be used to assess competence in oral surgery. However, there is a lack of general consensus on the preferable assessment methods. The aim of this review is to identify the assessment methods available and their usefulness. Methods: Electronic databases (Medline, Embase, and the Cochrane Database of Systematic Reviews) were searched. PRISMA guidelines were followed to identify relevant papers. Abstracts of studies were reviewed, and if they met the inclusion criteria, they were included in the review. Papers were reviewed against the critical appraisal skills programme (CASP) checklist and the medical education research quality instrument (MERSQI) to assess their quality and identify any bias in a systematic manner. The validity and reliability of each assessment method or tool were assessed. Results: A number of assessment methods were identified, including self-assessment, peer assessment, and direct observation of skills by someone senior. Senior assessment tended to be the preferred method, followed by self-assessment and, finally, peer assessment. The level of training was shown to affect the preferred assessment method, with one study finding peer assessment more useful for postgraduate trainees than for undergraduate trainees. Numerous tools for assessment were identified, including a checklist scale and a global rating scale. Both had their strengths and weaknesses, but the evidence was more favourable for global rating scales in terms of reliability, applicability to a wider range of clinical situations, and ease of use for examiners. Studies also looked into trainees' opinions on assessment tools. Logbooks were not found to be significant in measuring the competence of trainees. Conclusion: There is limited literature exploring the methods and tools which assess the competence of oral surgery trainees. Current evidence shows that the most favourable assessment method and tool may differ depending on the stage of training. More research is required in this area to streamline assessment methods and tools.
Keywords: competence, oral surgery, assessment, trainees, education
Procedia PDF Downloads 133
14548 Effect of Different Processing Methods on the Proximate, Functional, Sensory, and Nutritional Properties of Weaning Foods Formulated from Maize (Zea mays) and Soybean (Glycine max) Flour Blends
Authors: C. O. Agu, C. C. Okafor
Abstract:
Maize and soybean flours were produced using different methods of processing, which included fermentation (FWF), roasting (RWF) and malting (MWF). Products from the different methods were mixed in the ratio 60:40 maize/soybean, respectively. These composites, mixed with other ingredients such as sugar, vegetable oil, vanilla flavour and vitamin mix, were analyzed for proximate composition, physical/functional, sensory and nutritional properties. The results for the protein content ranged between 6.25% and 16.65%, with sample RWF having the highest value. Crude fibre values ranged from 3.72 to 10.0%, carbohydrate from 58.98% to 64.2%, and ash from 1.27 to 2.45%. Physical and functional properties such as bulk density, wettability and gelation capacity had values between 0.74 and 0.76 g/ml, 20.33 and 46.33 min, and 0.73 and 0.93 g/ml, respectively. For sensory quality, colour, flavour, taste, texture and general acceptability were determined. In terms of colour and flavour there was no significant difference (P < 0.05), while the values for taste ranged between 4.89 and 7.11, texture between 5.50 and 8.38, and general acceptability between 6.09 and 7.89. Nutritionally, there was no significant difference (P < 0.05) between sample RWF and the control in all parameters considered. Samples FWF and MWF showed significantly (P < 0.05) lower values in all parameters determined. In the light of these findings, the roasting method is highly recommended in the production of weaning foods.
Keywords: fermentation, malting, ratio, roasting, wettability
Procedia PDF Downloads 303
14547 Efficient Chiller Plant Control Using Modern Reinforcement Learning
Authors: Jingwei Du
Abstract:
The need to optimize air conditioning systems in existing buildings calls for control methods designed with energy efficiency as a primary goal. The majority of current control methods boil down to two categories: empirical and model-based. To be effective, the former relies heavily on engineering expertise and the latter requires extensive historical data. Reinforcement Learning (RL), on the other hand, is a model-free approach that explores the environment to obtain an optimal control strategy, often referred to as a “policy”. This research adopts Proximal Policy Optimization (PPO) to improve chiller plant control and enable the RL agent to collaborate with experienced engineers. It exploits the fact that while the industry lacks historical data, abundant operational data is available and allows the agent to learn and evolve safely under human supervision. Thanks to the development of language models, renewed interest in RL has led to modern, online, policy-based RL algorithms such as PPO. This research took inspiration from “alignment”, a process that utilizes human feedback to fine-tune the pretrained model in case of unsafe content. The methodology can be summarized in three steps. First, an initial policy model is generated based on minimal prior knowledge. Next, the prepared PPO agent is deployed so feedback from both the critic model and human experts can be collected for future fine-tuning. Finally, the agent learns and adapts itself to the specific chiller plant, updates the policy model and is ready for the next iteration. Besides the proposed approach, this study also used traditional RL methods to optimize the same simulated chiller plants for comparison, and it turns out that the proposed method is both safe and effective and needs little to no historical data to start up.
Keywords: chiller plant, control methods, energy efficiency, proximal policy optimization, reinforcement learning
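For reference, a minimal sketch of the PPO clipped surrogate loss is given below. It only illustrates the core policy update the abstract refers to; the chiller-plant states and actions, the critic model and the human-feedback loop are not shown, and all variable names and batch values are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of the PPO clipped surrogate loss (illustrative only; not the
# paper's chiller-plant agent, reward definition or safety supervision scheme).
import torch

def ppo_clip_loss(new_logp, old_logp, advantages, clip_eps=0.2):
    """Clipped surrogate objective used by PPO."""
    ratio = torch.exp(new_logp - old_logp)              # pi_new(a|s) / pi_old(a|s)
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1 - clip_eps, 1 + clip_eps) * advantages
    return -torch.mean(torch.min(unclipped, clipped))   # minimise the negative objective

# Toy usage: random numbers stand in for one batch of collected experience.
torch.manual_seed(0)
new_logp = torch.randn(64, requires_grad=True)
old_logp = new_logp.detach() + 0.1 * torch.randn(64)
advantages = torch.randn(64)
loss = ppo_clip_loss(new_logp, old_logp, advantages)
loss.backward()                                          # gradients would drive the policy update
print(float(loss))
```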
Procedia PDF Downloads 25
14546 Soil Degradation Mapping Using Geographic Information System, Remote Sensing and Laboratory Analysis in the Oum Er Rbia High Basin, Middle Atlas, Morocco
Authors: Aafaf El Jazouli, Ahmed Barakat, Rida Khellouk
Abstract:
Mapping of soil degradation draws on field observations, laboratory measurements, and remote sensing data, integrated with quantitative methods, to map the spatial characteristics of soil properties at different spatial and temporal scales and to provide up-to-date information on the field. Since soil salinity, texture and organic matter play a vital role in assessing topsoil characteristics and soil quality, remote sensing can be considered an effective method for studying these properties. The main objective of this research is to assess soil degradation by combining remote sensing data and laboratory analysis. In order to achieve this goal, soil samples were taken at 50 locations in the upper basin of Oum Er Rbia in the Middle Atlas in Morocco. These samples were dried, sieved to 2 mm and analyzed in the laboratory. Landsat 8 OLI imagery was analyzed using physical or empirical methods to derive soil properties. In addition, remote sensing can serve as a supporting data source. Deterministic (spline and inverse distance weighting) and probabilistic (ordinary kriging and universal kriging) interpolation methods were used to produce maps of each grain size class and soil property using GIS software. As a result, a correlation was found between soil texture and soil organic matter content. This approach, developed in ongoing research, will improve the prospects for the use of remote sensing data for mapping soil degradation in arid and semi-arid environments.
Keywords: soil degradation, GIS, interpolation methods (spline, IDW, kriging), Landsat 8 OLI, Oum Er Rbia high basin
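As an illustration of the simplest of the deterministic interpolation methods listed above, a minimal inverse distance weighting (IDW) sketch follows. The coordinates and organic-matter values are invented for the example and are not the study's measurements; kriging would replace the distance weights with a fitted variogram model.

```python
# Minimal sketch of inverse distance weighting (IDW) interpolation on point samples.
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Interpolate values at query points as distance-weighted means of known samples."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)  # (q, n) distances
    w = 1.0 / np.maximum(d, eps) ** power                                     # inverse-distance weights
    return (w @ values) / w.sum(axis=1)

xy_known = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # hypothetical sample locations
om = np.array([2.1, 1.4, 3.0, 2.6])                                    # hypothetical organic matter (%)
grid = np.array([[0.5, 0.5], [0.2, 0.8]])                              # prediction points
print(idw(xy_known, om, grid))
```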
Procedia PDF Downloads 162
14545 Neural Networks with Different Initialization Methods for Depression Detection
Authors: Tianle Yang
Abstract:
As a common mental disorder, depression is a leading cause of various diseases worldwide. Early detection and treatment of depression can dramatically promote remission and prevent relapse. However, conventional ways of diagnosing depression require considerable human effort and cause an economic burden, while still being prone to misdiagnosis. On the other hand, recent studies report that physical characteristics are major contributors to the diagnosis of depression, which inspires us to mine the internal relationships with neural networks instead of relying on clinical experience. In this paper, neural networks are constructed to predict depression from physical characteristics. Two initialization methods are examined: Xavier and Kaiming initialization. Experimental results show that a 3-layer neural network with Kaiming initialization achieves 83% accuracy.
Keywords: depression, neural network, Xavier initialization, Kaiming initialization
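For readers unfamiliar with the two schemes being compared, the sketch below shows their standard weight-scaling rules. The layer sizes are illustrative assumptions; the paper's actual 3-layer architecture and its input features are not reproduced here.

```python
# Minimal sketch of Xavier (Glorot) and Kaiming (He) weight initialization.
import numpy as np

rng = np.random.default_rng(0)

def xavier_uniform(fan_in, fan_out):
    """Xavier uniform: variance scaled by both fan-in and fan-out (suited to tanh/sigmoid)."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def kaiming_normal(fan_in, fan_out):
    """Kaiming normal: variance scaled by fan-in only (suited to ReLU activations)."""
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W1 = kaiming_normal(16, 32)   # hypothetical hidden layer: 16 input features -> 32 units
W2 = xavier_uniform(32, 1)    # hypothetical output layer
print(W1.std(), W2.std())     # the two rules give visibly different weight scales
```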
Procedia PDF Downloads 127
14544 A Method to Saturation Modeling of Synchronous Machines in d-q Axes
Authors: Mohamed Arbi Khlifi, Badr M. Alshammari
Abstract:
This paper discusses general methods of modeling saturation in steady-state, two-axis (d and q) frame models of synchronous machines. In particular, the important role of the magnetic coupling between the d-q axes (the cross-magnetizing phenomenon) is demonstrated. For that purpose, distinct methods of saturation modeling of the damper synchronous machine with cross-saturation are identified, and the models are synthesized in detail in the d-q axes. A number of models are given in their final developed form. The procedure and the novel models are verified by a critical application to prove the validity of the method, and the equivalence between all developed models is reported. Advantages of some of the models over the existing ones and their applicability are discussed.
Keywords: cross-magnetizing, models synthesis, synchronous machine, saturated modeling, state-space vectors
Procedia PDF Downloads 452
14543 Gene Prediction in DNA Sequences Using an Ensemble Algorithm Based on Goertzel Algorithm and Anti-Notch Filter
Authors: Hamidreza Saberkari, Mousa Shamsi, Hossein Ahmadi, Saeed Vaali, MohammadHossein Sedaaghi
Abstract:
In recent years, using signal processing tools for the accurate identification of protein coding regions has become a challenge in bioinformatics. Most genomic signal processing methods are based on the period-3 characteristic of the nucleotides in DNA strands; consequently, spectral analysis is applied to the numerical sequences of DNA to find the location of periodic components. In this paper, a novel ensemble algorithm for gene selection in DNA sequences is presented which is based on the combination of the Goertzel algorithm and an anti-notch filter (ANF). The proposed algorithm has many advantages compared to other conventional methods. Firstly, it identifies protein coding regions more accurately because the Goertzel algorithm is tuned to the desired frequency. Secondly, a faster detection time is achieved. The proposed algorithm is applied to several genes, including genes available in the BG570 and HMR195 databases, and the results are compared to other methods based on nucleotide-level evaluation criteria. Implementation results show the excellent performance of the proposed algorithm in identifying protein coding regions, specifically in the identification of small-scale gene areas.
Keywords: protein coding regions, period-3, anti-notch filter, Goertzel algorithm
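A minimal sketch of the Goertzel recursion evaluated at the period-3 frequency (2π/3) is shown below. The binary base-indicator mapping and the sliding-window length are illustrative assumptions, not the paper's exact settings, and the anti-notch filter stage of the ensemble is not shown.

```python
# Minimal sketch: Goertzel power at the period-3 frequency along a DNA sequence.
import numpy as np

def goertzel_power(x, omega):
    """Spectral power of sequence x at angular frequency omega via the Goertzel recursion."""
    coeff = 2.0 * np.cos(omega)
    s_prev, s_prev2 = 0.0, 0.0
    for sample in x:
        s = sample + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

def period3_profile(dna, window=120):
    """Slide a window along a binary indicator of one base and score its period-3 power."""
    x = np.array([1.0 if b == 'G' else 0.0 for b in dna])   # simple indicator sequence (assumption)
    omega = 2.0 * np.pi / 3.0                               # period-3 component
    return np.array([goertzel_power(x[i:i + window], omega)
                     for i in range(len(x) - window + 1)])

dna = ''.join(np.random.default_rng(1).choice(list('ACGT'), size=600))  # random toy sequence
print(period3_profile(dna)[:5])   # peaks in this profile would flag candidate coding regions
```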
Procedia PDF Downloads 385
14542 Sentiment Classification Using Enhanced Contextual Valence Shifters
Authors: Vo Ngoc Phu, Phan Thi Tuoi
Abstract:
We have explored different methods of improving the accuracy of sentiment classification. The sentiment orientation of a document can be positive (+), negative (-), or neutral (0). We combine five dictionaries from [2, 3, 4, 5, 6] into a new one with 21,137 entries. The new dictionary has many verbs, adverbs, phrases and idioms that are not in the five original dictionaries. The paper shows that our proposed method, based on the combination of the Term-Counting method and the Enhanced Contextual Valence Shifters method, has improved the accuracy of sentiment classification. The combined method has an accuracy of 68.984% on the testing dataset and 69.224% on the training dataset. All of these methods are implemented to classify the reviews based on our new dictionary and the Internet Movie dataset.
Keywords: sentiment classification, sentiment orientation, valence shifters, contextual valence shifters, term counting
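The sketch below illustrates term counting combined with a simple contextual valence shifter (negation flipping), in the spirit of the combination described above. The tiny word lists and the single shifter rule are illustrative assumptions; the paper's 21,137-entry dictionary and its full set of shifters are not reproduced here.

```python
# Minimal sketch: term counting with a negation-based valence shifter.

POSITIVE = {"good", "great", "excellent", "enjoyable"}   # toy dictionary (assumption)
NEGATIVE = {"bad", "boring", "awful", "poor"}
NEGATORS = {"not", "never", "no", "hardly"}              # contextual valence shifters

def classify(review: str) -> str:
    score = 0
    tokens = review.lower().split()
    for i, tok in enumerate(tokens):
        polarity = 1 if tok in POSITIVE else -1 if tok in NEGATIVE else 0
        if polarity and i > 0 and tokens[i - 1] in NEGATORS:
            polarity = -polarity            # shifter: a preceding negator flips the term's valence
        score += polarity
    return "+" if score > 0 else "-" if score < 0 else "0"

print(classify("the plot was not boring and the acting was great"))  # expected: +
```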
Procedia PDF Downloads 502
14541 Quintic Spline Solution of Fourth-Order Parabolic Equations Arising in Beam Theory
Authors: Reza Mohammadi, Mahdieh Sahebi
Abstract:
We develop a method based on polynomial quintic splines for the numerical solution of a fourth-order non-homogeneous parabolic partial differential equation with variable coefficient. By using a polynomial quintic spline at off-step points in the space direction and finite differences in the time direction, we obtain two three-level implicit methods. Stability analysis of the presented method has been carried out. We solve four test problems numerically to validate the derived method. Numerical comparison with other methods shows the superiority of the presented scheme.
Keywords: fourth-order parabolic equation, variable coefficient, polynomial quintic spline, off-step points
Procedia PDF Downloads 352
14540 Applying the Extreme-Based Teaching Model in Post-Secondary Online Classroom Setting: A Field Experiment
Authors: Leon Pan
Abstract:
The first programming course within post-secondary education has long been recognized as a challenging endeavor for both educators and students alike. Historically, these courses have exhibited high failure rates and a notable number of dropouts. Instructors often lament students' lack of effort in their coursework, and students often express frustration that the teaching methods employed are not effective. Drawing inspiration from the successful principles of Extreme Programming, this study introduces an approach, the Extreme-based teaching model, aimed at enhancing the teaching of introductory programming courses. To empirically determine the effectiveness of the model, a comparison was made between a section taught using the Extreme-based model and another utilizing traditional teaching methods. Notably, the Extreme-based teaching class required students to work collaboratively on projects while also demanding continuous assessment and performance enhancement within groups. This paper details the application of the Extreme-based model within the post-secondary online classroom context and presents compelling results that emphasize its effectiveness in advancing the teaching and learning experience. The Extreme-based model led to a significant increase of 13.46 points in the weighted total average and a commendable 10% reduction in the failure rate.
Keywords: extreme-based teaching model, innovative pedagogical methods, project-based learning, team-based learning
Procedia PDF Downloads 58
14539 Towards a Robust Patch Based Multi-View Stereo Technique for Textureless and Occluded 3D Reconstruction
Authors: Ben Haines, Li Bai
Abstract:
Patch-based reconstruction methods have been, and still are, among the top-performing approaches to 3D reconstruction to date. Their local approach to refining the position and orientation of a patch, free of global minimisation and independent of surface smoothness, makes patch-based methods extremely powerful in recovering fine-grained detail of an object's surface. However, patch-based approaches still fail to faithfully reconstruct textureless or highly occluded surface regions; thus, though performing well under lab conditions, they deteriorate in industrial or real-world situations. They are also computationally expensive. Current patch-based methods generate point clouds with holes in textureless or occluded regions that require expensive energy minimisation techniques to fill and interpolate into a high-fidelity reconstruction. Such shortcomings hinder the adaptation of the methods for industrial applications, where object surfaces are often highly textureless and the speed of reconstruction is an important factor. This paper presents ongoing work towards a multi-resolution approach to address these problems, utilizing particle swarm optimisation to reconstruct high-fidelity geometry and increasing robustness to textureless features through an adapted approach to normalised cross correlation. The work also aims to speed up the reconstruction using advances in GPU technologies and to remove the need for costly initialization and expansion. Through the combination of these enhancements, it is the intention of this work to create denser patch clouds, even in textureless regions, within a reasonable time. Initial results show the potential of such an approach to construct denser point clouds with an accuracy comparable to that of the current top-performing algorithms.
Keywords: 3D reconstruction, multiview stereo, particle swarm optimisation, photo consistency
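For concreteness, the sketch below computes the standard normalised cross correlation (NCC) photo-consistency score that the abstract proposes to adapt. The two patches are random stand-ins for a reference patch and its reprojection into a second view; the paper's adapted NCC variant and the particle-swarm refinement loop are not shown.

```python
# Minimal sketch of zero-mean normalised cross correlation between two image patches.
import numpy as np

def ncc(patch_a, patch_b, eps=1e-8):
    """Zero-mean NCC between two equally sized patches, in [-1, 1]."""
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b) + eps  # eps guards flat (textureless) patches
    return float(a @ b / denom)

rng = np.random.default_rng(0)
ref = rng.random((11, 11))                  # reference patch sampled in one view
warped = ref + 0.05 * rng.random((11, 11))  # same patch reprojected into another view, with noise
print(ncc(ref, warped))                     # close to 1.0 -> photo-consistent patch hypothesis
```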
Procedia PDF Downloads 202
14538 Passive Solar Water Concepts for Human Comfort
Authors: Eyibo Ebengeobong Eddie
Abstract:
Taking advantage of the sun's position when designing buildings to ensure human comfort has always been an important aspect of architectural design. Using simple, inexpensive methods and systems for gaining solar energy, heating and cooling has always been a great advantage to the users and occupants of a building. Over the years, techniques and methods have been created, and more are being discovered, to help reduce the energy demands of any building. Architects have made effective use of a building's orientation, building materials and elements to achieve lower energy demand. This paper discusses the various techniques used in the solar heating and passive cooling of buildings, and the passive solar water techniques and concepts used to achieve thermal comfort.
Keywords: comfort, passive, solar, water
Procedia PDF Downloads 458
14537 Dosimetric Comparison of Conventional Optimization Methods with Inverse Planning Simulated Annealing Technique
Authors: Shraddha Srivastava, N. K. Painuly, S. P. Mishra, Navin Singh, Muhsin Punchankandy, Kirti Srivastava, M. L. B. Bhatt
Abstract:
Various optimization methods used in interstitial brachytherapy are based on dwell position and dwell weight alteration to produce a dose distribution based on the implant geometry. Since these optimization schemes are not anatomy based, they could lead to deviations from the desired plan. This study was henceforth carried out to compare the anatomy-based Inverse Planning Simulated Annealing (IPSA) optimization technique with graphical and geometrical optimization methods in interstitial high dose rate brachytherapy planning of cervical carcinoma. Six patients with 12 CT data sets of MUPIT implants in HDR brachytherapy of cervical cancer were prospectively studied. HR-CTV and organs at risk (OARs) were contoured in the Oncentra treatment planning system (TPS) using GYN GEC-ESTRO guidelines on cervical carcinoma. Three sets of plans were generated for each fraction using IPSA, graphical optimization (GrOPT) and geometrical optimization (GOPT) methods. All patients were treated to a dose of 20 Gy in 2 fractions. The main objective was to cover at least 95% of HR-CTV with 100% of the prescribed dose (V100 ≥ 95% of HR-CTV). IPSA, GrOPT, and GOPT based plans were compared in terms of target coverage, OAR doses, homogeneity index (HI) and conformity index (COIN) using dose-volume histograms (DVH). Target volume coverage (mean V100) was found to be 93.98 ± 0.87%, 91.34 ± 1.02% and 85.05 ± 2.84% for IPSA, GrOPT and GOPT plans respectively. Mean D90 (minimum dose received by 90% of HR-CTV) values for IPSA, GrOPT and GOPT plans were 10.19 ± 1.07 Gy, 10.17 ± 0.12 Gy and 7.99 ± 1.0 Gy respectively, while D100 (minimum dose received by 100% volume of HR-CTV) for IPSA, GrOPT and GOPT plans was 6.55 ± 0.85 Gy, 6.55 ± 0.65 Gy and 4.73 ± 0.14 Gy respectively. IPSA plans resulted in lower doses to the bladder (D₂
Keywords: cervical cancer, HDR brachytherapy, IPSA, MUPIT
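To make the reported coverage indices concrete, the sketch below computes V100 and D90 from a per-voxel dose array for the target structure. The dose values and prescription are invented for illustration; TPS-specific DVH binning, voxel volume weighting, and the HI and COIN indices are not shown.

```python
# Minimal sketch of two DVH metrics used above: V100 and D90.
import numpy as np

def v100(doses_gy, prescription_gy):
    """Percentage of the structure's voxels receiving at least 100% of the prescription."""
    return 100.0 * np.mean(doses_gy >= prescription_gy)

def d90(doses_gy):
    """Minimum dose covering the hottest 90% of the volume (the 10th-percentile dose)."""
    return float(np.percentile(doses_gy, 10))

rng = np.random.default_rng(0)
hr_ctv_doses = rng.normal(loc=11.0, scale=1.2, size=5000)  # hypothetical per-voxel doses (Gy)
print(f"V100 = {v100(hr_ctv_doses, 10.0):.1f}%  D90 = {d90(hr_ctv_doses):.2f} Gy")
```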
Procedia PDF Downloads 185
14536 Optimization of a Method of Total RNA Extraction from Mentha piperita
Authors: Soheila Afkar
Abstract:
Mentha piperita is a medicinal plant that contains large amounts of secondary metabolites that have an adverse effect on RNA extraction. Since high-quality RNA is the first requirement for real-time PCR, in this study the optimization of total RNA isolation from leaf tissues of Mentha piperita was evaluated. From this point of view, we compared two different total RNA extraction methods on leaves of Mentha piperita to find the one that yields the highest quality. The methods tested were RNX-plus and modified RNX-plus (numbers 1-5). RNA quality was analyzed on a 1.5% agarose gel. The RNA integrity was also assessed by visualization of the ribosomal RNA bands on 1.5% agarose gels. In the modified RNX-plus method (number 2), the integrity of the 28S and 18S rRNA was highly satisfactory when analyzed on a denaturing agarose gel, so this method is suitable for RNA isolation from Mentha piperita.
Keywords: Mentha piperita, polyphenol, polysaccharide, RNA extraction
Procedia PDF Downloads 189
14535 Modern Methods of Technology and Organization of Production of Construction Works during the Implementation of Construction 3D Printers
Authors: Azizakhanim Maharramli
Abstract:
The gradual transition from entrenched traditional technology and organization of construction production to innovative additive construction technology inevitably meets technological, technical, organizational, labour and, finally, social difficulties. The nodal method chosen here is therefore intended to eliminate the above difficulties by combining some of the usual methods of construction and countering the widespread perception that the labour force will be subjected to a sharp reduction. The nodal method of additive technology will create favourable conditions for an optimal distribution of labour across facilities, owing to the consistent performance of homogeneous work and the introduction of both additive technology and traditional technology into construction production.
Keywords: parallel method, sequential method, stream method, combined method, nodal method
Procedia PDF Downloads 92
14534 Assessing Lithium Recovery from Secondary Sources
Authors: Carolina A. Santos, Alexandra B. Ribeiro
Abstract:
Climate change and environmental degradation are threats to humanity. Europe has been addressing these problems, namely through the Green Deal, with the use of batteries in the mobility and energy fields. However, these require the use of critical raw materials, like lithium, whose demand is estimated to grow 60 times in the next 30 years. Thus, it is fundamental to promote a circular economy with lithium recovery from secondary resources. These are nowadays key topics, which will be even more relevant in the future, so a new way to approach them is needed and must be encouraged. Therefore, one of our main goals is to analyse two methods of lithium retrieval from secondary sources, bioleaching and electrodialysis, and to assess them regarding their sustainability. The latest results show good removal efficiency with both methods, even though there are some matrix interferences. Hence, further investment and research are needed in order to make this process sustainable and our society more circular.
Keywords: lithium, sustainable mining, social license to operate, bioleaching, electrodialysis
Procedia PDF Downloads 128
14533 Reduction of Peak Input Currents during Charge Pump Boosting in Monolithically Integrated High-Voltage Generators
Authors: Jan Doutreloigne
Abstract:
This paper describes two methods for the reduction of the peak input current during the boosting of Dickson charge pumps. Both methods are implemented in the fully integrated Dickson charge pumps of a high-voltage display driver chip for smart-card applications. Experimental results reveal good correspondence with Spice simulations and show a reduction of the peak input current by a factor of 6 during boosting.
Keywords: bi-stable display driver, Dickson charge pump, high-voltage generator, peak current reduction, sub-pump boosting, variable frequency boosting
Procedia PDF Downloads 455
14532 Analysis of Expert Information in Linguistic Terms
Authors: O. Poleshchuk, E. Komarov
Abstract:
In this paper, semantic spaces with the properties of completeness and orthogonality (complete orthogonal semantic spaces) were chosen as models of expert evaluations. As theoretical and practical studies have shown, all the properties of complete orthogonal semantic spaces correspond to the thinking activity of experts, which is why these semantic spaces were chosen for modeling. Two methods of constructing such spaces are proposed. Models of comparative and fuzzy cluster analysis of expert evaluations were developed. The practical application of the developed methods has demonstrated their viability and validity.
Keywords: expert evaluation, comparative analysis, fuzzy cluster analysis, theoretical and practical studies
Procedia PDF Downloads 529
14531 GCM Based Fuzzy Clustering to Identify Homogeneous Climatic Regions of North-East India
Authors: Arup K. Sarma, Jayshree Hazarika
Abstract:
The North-eastern part of India, which receives heavier rainfall than other parts of the subcontinent, is of great concern nowadays with regard to climate change. High-intensity rainfall of short duration and longer dry spells, occurring due to the impact of climate change, affect river morphology too. In the present study, an attempt is made to delineate the North-Eastern region of India into homogeneous clusters based on the fuzzy clustering concept and to compare the resulting clusters obtained by using conventional and non-conventional methods of clustering. The concept of clustering is adopted in view of the fact that the impact of climate change can be studied in a homogeneous region without much variation, which can be helpful in studies related to water resources planning and management. Ten IMD (Indian Meteorological Department) stations, situated in various regions of the North-east, have been selected for making the clusters. The results of the Fuzzy C-Means (FCM) analysis show different clustering patterns for different conditions. From the analysis and comparison it can be concluded that the non-conventional method of using GCM data gives somewhat better results than the others. However, further analysis can be done by taking daily data instead of monthly means to reduce the effect of standardization.
Keywords: climate change, conventional and non-conventional methods of clustering, FCM analysis, homogeneous regions
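The sketch below shows the core Fuzzy C-Means update used in such delineation studies: alternating between membership and centre updates until the partition stabilises. The synthetic 2-D "station" feature vectors are invented for illustration; the study's actual IMD station statistics and the GCM-based variant are not reproduced here.

```python
# Minimal sketch of the Fuzzy C-Means (FCM) algorithm on synthetic station features.
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
    """Return fuzzy memberships U (n x c) and cluster centres (c x d)."""
    local_rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = local_rng.dirichlet(np.ones(c), size=n)          # random initial memberships, rows sum to 1
    for _ in range(iters):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]      # membership-weighted means
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-10
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)          # standard FCM membership update
    return U, centres

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(mu, 0.3, size=(10, 2)) for mu in (0.0, 3.0, 6.0)])  # 30 toy "stations"
U, centres = fuzzy_c_means(X, c=3)
print(np.round(centres, 2))
print(U.argmax(axis=1))   # hard region assignment derived from the fuzzy memberships
```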
Procedia PDF Downloads 385
14530 Pre-Malignant Breast Lesions, Methods of Treatment and Outcome
Authors: Ahmed Mostafa, Mohamed Mahmoud, Nesreen H. Hafez, Mohamed Fahim
Abstract:
This retrospective study includes 60 patients with pre-invasive breast cancer. Aim of the study: Evaluation of premalignant lesions of the breast (DCIS), different treatment methods and outcome. Patients and methods: 60 patients with DCIS were studied in the period between 2005 and 2012. For 38 patients (63.3%) the primary surgical method was wide local excision (WLE), while the other 22 patients (36.7%) had mastectomy. Fourteen of the cases who underwent local excision received radiotherapy, while no adjuvant radiotherapy was given to those who underwent mastectomy. In cases of hormone receptor-positive DCIS lesions, hormonal treatment (tamoxifen) was given after local control. Results: There was no difference in overall survival between mastectomy and breast-conserving therapy (wide local excision and adjuvant radiotherapy); however, the local recurrence rate is higher with breast-conserving therapy, and there is no role for axillary evacuation in DCIS. The use of hormonal therapy decreases the incidence of local recurrence by about 98%. Conclusion: The main management of DCIS is local treatment (wide local excision and radiotherapy), with hormonal treatment in cases of hormone receptor-positive lesions.
Keywords: ductal carcinoma in situ, surgical treatment, radiotherapy, breast conserving therapy, hormonal treatment
Procedia PDF Downloads 319
14529 Discourse Analysis: Where Cognition Meets Communication
Authors: Iryna Biskub
Abstract:
The interdisciplinary approach to modern linguistic studies is exemplified by the merging of various research methods, which sometimes causes complications related to the verification of the research results. This methodological confusion can be resolved by creating new techniques of linguistic analysis that combine several scientific paradigms. Modern linguistics has developed productive and efficient methods for the investigation of cognitive and communicative phenomena, of which language is the central issue. In the field of discourse studies, one of the best examples of research methods is the method of Critical Discourse Analysis (CDA). CDA can be viewed both as a method of investigation and as a critical multidisciplinary perspective. In CDA the position of the scholar is crucial from the point of view of exemplifying his or her social and political convictions. The generally accepted approach to obtaining scientifically reliable results is to use a special well-defined scientific method for researching special types of language phenomena: cognitive methods applied to the exploration of cognitive aspects of language, whereas communicative methods are thought to be relevant only for the investigation of the communicative nature of language. In recent decades, discourse as a sociocultural phenomenon has been the focus of careful linguistic research. The very concept of discourse represents an integral unity of the cognitive and communicative aspects of human verbal activity. Since a human being is never able to discriminate between the cognitive and communicative planes of discourse communication, it does not make much sense to apply cognitive and communicative methods of research in isolation. It is possible to modify the classical CDA procedure by mapping human cognitive procedures onto the strategic communicative planning of discourse communication. The analysis of the electronic petition 'Block Donald J Trump from UK entry. The signatories believe Donald J Trump should be banned from UK entry' (584,459 signatures) and the parliamentary debates on it has demonstrated the ability to map cognitive and communicative levels in the following way: the strategy of discourse modeling (communicative level) overlaps with the extraction of semantic macrostructures (cognitive level); the strategy of discourse management overlaps with the analysis of local meanings in discourse communication; and the strategy of cognitive monitoring of the discourse overlaps with the formation of attitudes and ideologies at the cognitive level. Thus, the experimental data have shown that it is possible to develop a new complex methodology of discourse analysis, where cognition would meet communication, both metaphorically and literally. The same approach may prove productive for the creation of computational models of human-computer interaction, where the automatic generation of a particular type of discourse could be based on the rules of strategic planning involving the cognitive models of CDA.
Keywords: cognition, communication, discourse, strategy
Procedia PDF Downloads 252
14528 Alternating Expectation-Maximization Algorithm for a Bilinear Model in Isoform Quantification from RNA-Seq Data
Authors: Wenjiang Deng, Tian Mou, Yudi Pawitan, Trung Nghia Vu
Abstract:
Estimation of isoform-level gene expression from RNA-seq data depends on simplifying assumptions, such as uniform read distribution, that are easily violated in real data. Such violations typically lead to biased estimates. Most existing methods provide a bias correction step(s) which is based on biological considerations, such as GC content, and applied in single samples separately. The main problem is that not all biases are known. For example, new technologies such as single-cell RNA-seq (scRNA-seq) may introduce new sources of bias not seen in bulk-cell data. This study introduces a method called XAEM based on a more flexible and robust statistical model. Existing methods are essentially based on a linear model Xβ, where the design matrix X is known and derived based on the simplifying assumptions. In contrast, XAEM considers Xβ as a bilinear model with both X and β unknown. Joint estimation of X and β is made possible by simultaneous analysis of multi-sample RNA-seq data. Compared to existing methods, XAEM automatically performs empirical correction of potentially unknown biases. XAEM implements an alternating expectation-maximization (AEM) algorithm, alternating between the estimation of X and β. For speed, XAEM utilizes quasi-mapping for read alignment, thus leading to a fast algorithm. Overall, XAEM performs favorably compared to other recent advanced methods. For simulated datasets, XAEM obtains higher accuracy for multiple-isoform genes, particularly for paralogs. In a differential-expression analysis of a real scRNA-seq dataset, XAEM achieves substantially greater rediscovery rates in an independent validation set.
Keywords: alternating EM algorithm, bias correction, bilinear model, gene expression, RNA-seq
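To convey the shape of the alternating scheme, the sketch below fits a bilinear model Y ≈ XB with both factors unknown by alternating least squares on synthetic multi-sample data. This is only a schematic analogue for illustration: it is not the XAEM algorithm, which works on read-count likelihoods with quasi-mapping-derived design matrices, and all dimensions and values here are invented.

```python
# Schematic alternating estimation for a bilinear model Y ~ X @ B (both unknown).
import numpy as np

rng = np.random.default_rng(0)
true_X = np.abs(rng.normal(size=(50, 3)))        # hypothetical "design" factor (50 features x 3 isoforms)
true_B = np.abs(rng.normal(size=(3, 20)))        # hypothetical expression factor (3 isoforms x 20 samples)
Y = true_X @ true_B + 0.01 * rng.normal(size=(50, 20))

B = np.abs(rng.normal(size=(3, 20)))             # random initialisation
for _ in range(200):
    X = np.clip(np.linalg.lstsq(B.T, Y.T, rcond=None)[0].T, 0, None)   # update X with B fixed
    X /= X.sum(axis=0, keepdims=True) + 1e-12                          # normalise to fix scale ambiguity
    B = np.clip(np.linalg.lstsq(X, Y, rcond=None)[0], 0, None)         # update B with X fixed

print(np.linalg.norm(Y - X @ B) / np.linalg.norm(Y))                   # small relative residual
```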
Procedia PDF Downloads 140
14527 Settlement Analysis of Axially Loaded Bored Piles: A Case History
Authors: M. Mert, M. T. Ozkan
Abstract:
Pile load tests should be applied to check bearing capacity calculations and to determine the settlement of the pile corresponding to the test load. Strain gauges can be installed in the pile in order to determine the shaft resistance of the pile for every soil layer respectively. Detailed results can be obtained by means of strain gauges placed at certain levels in the test piles. In the scope of this study, pile load test data obtained from two different projects are examined. Instrumented static pile load tests were applied to a total of 7 bored test piles of different diameters (80 cm, 150 cm, and 200 cm) and different lengths (between 30-76 m) at two different project sites. The settlement analysis of the test piles is done using several load transfer methods and the finite element method. Plaxis 3D, a three-dimensional finite element program, is also used for the settlement analysis of the test piles. In this study, firstly, the bearing capacity of the test piles is determined and compared with the strain gauge data, which is required for the settlement analysis. Then, the settlement values of the test piles are estimated by using load transfer methods developed in recent years and the finite element method. The aim of this study is to show the similarities and differences between the results obtained from settlement analysis methods and instrumented pile load tests.
Keywords: failure, finite element method, monitoring and instrumentation, pile, settlement
Procedia PDF Downloads 167
14526 Preparation and Characterization of α-Alumina with Low Sodium Oxide
Authors: Gyung Soo Jeon, Hong Bae Kim, Chi Jung Oh
Abstract:
In order to prepare α-alumina with a low content of sodium oxide from aluminum trihydroxide as a reactant, three kinds of methods were employed: mixing the aluminum trihydroxide with Chamotte (an aggregate composed of silica and alumina), ammonium chloride, or aluminum fluoride, respectively, and firing at temperatures under 1600°C. The sodium oxide in the α-alumina produced by the above methods was analyzed by XRF, the particle size distribution was determined by a particle size analyzer, the specific surface area of the α-alumina was measured by the BET method, and the phase of the α-alumina produced was confirmed by XRD. Acknowledgement: This research was supported by the Development Program of Technical Innovation funded by the Korea Technology and Information Promotion Agency for SMEs (KTIP-2016-S2401821).
Keywords: α-alumina, sodium oxide, aluminum trihydroxide, Chamotte, ammonium chloride, aluminum fluoride
Procedia PDF Downloads 313
14525 Resilient Machine Learning in the Nuclear Industry: Crack Detection as a Case Study
Authors: Anita Khadka, Gregory Epiphaniou, Carsten Maple
Abstract:
There is a dramatic surge in the adoption of machine learning (ML) techniques in many areas, including the nuclear industry (such as fault diagnosis and fuel management in nuclear power plants), autonomous systems (including self-driving vehicles), space systems (space debris recovery, for example), medical surgery, network intrusion detection, and malware detection, to name a few. With the application of learning methods in such diverse domains, artificial intelligence (AI) has become a part of everyday modern human life. To date, the predominant focus has been on developing the underpinning ML algorithms that can improve accuracy, while factors such as the resiliency and robustness of algorithms have been largely overlooked. If an adversarial attack is able to compromise the learning method or data, the consequences can be fatal, especially but not exclusively in safety-critical applications. In this paper, we present an in-depth analysis of five adversarial attacks and three defence methods on a crack detection ML model. Our analysis shows that it can be dangerous to adopt machine learning techniques in security-critical areas such as the nuclear industry without rigorous testing, since they may be vulnerable to adversarial attacks. While common defence methods can effectively defend against different attacks, none of the three considered can provide protection against all five adversarial attacks analysed.
Keywords: adversarial machine learning, attacks, defences, nuclear industry, crack detection
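For orientation, the sketch below implements the fast gradient sign method (FGSM), one widely studied adversarial attack of the general kind analysed above. The tiny stand-in network and random "crack image" are placeholders; the paper's crack-detection model and its specific attack and defence pairs are not reproduced here.

```python
# Minimal sketch of an FGSM adversarial perturbation against a placeholder classifier.
import torch
import torch.nn as nn

model = nn.Sequential(                      # hypothetical stand-in classifier: crack / no crack
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 2),
)
loss_fn = nn.CrossEntropyLoss()

x = torch.rand(1, 1, 64, 64, requires_grad=True)   # placeholder input image
y = torch.tensor([1])                              # placeholder "crack present" label

loss = loss_fn(model(x), y)
loss.backward()                                    # gradient of the loss w.r.t. the input pixels
epsilon = 0.03                                     # perturbation budget
x_adv = (x + epsilon * x.grad.sign()).clamp(0, 1).detach()  # FGSM step, kept in valid pixel range

with torch.no_grad():
    print(model(x).argmax(1).item(), model(x_adv).argmax(1).item())  # the prediction may flip
```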
Procedia PDF Downloads 157
14524 Application of Artificial Intelligence in EOR
Authors: Masoumeh Mofarrah, Amir NahanMoghadam
Abstract:
Higher oil prices and increasing oil demand are the main reasons for the great attention paid to Enhanced Oil Recovery (EOR). Comprehensive research has been carried out to develop, appraise, and improve EOR methods and their application. Recently, Artificial Intelligence (AI) has gained popularity in the petroleum industry, as it can help petroleum engineers solve some fundamental petroleum engineering problems such as reservoir simulation, EOR project risk analysis, well log interpretation and well test model selection. This study presents a historical overview of the most popular AI tools, including neural networks, genetic algorithms, fuzzy logic, and expert systems, in the petroleum industry and discusses two case studies to represent the application of two of the mentioned AI methods for selecting an appropriate EOR method based on reservoir characterization in a feasible and effective way.
Keywords: artificial intelligence, EOR, neural networks, expert systems
Procedia PDF Downloads 487
14523 Investigation on Mechanical Properties of a Composite Material of Olive Flour Wood with a Polymer Matrix
Authors: Slim Souissi, Mohamed Ben Amar, Nesrine Bouhamed, Pierre Marechal
Abstract:
The development of bio-composites from biodegradable materials and natural fibers has attracted growing interest in the science of composite materials. The present work was conducted as part of a cooperation project between Sfax University and Le Havre University. This work consists of developing and monitoring the properties of a composite material of olive flour wood with a polymer matrix (urea formaldehyde). For this, ultrasonic non-destructive and destructive characterization methods were used to optimize the mechanical and acoustic properties of the studied material based on the elaboration parameters.
Keywords: bio-composite, olive flour wood, polymer matrix, ultrasonic methods, mechanical properties
Procedia PDF Downloads 491
14522 Synthesis and Characterization of Hydroxyapatite from Biowaste for Potential Medical Application
Authors: M. D. H. Beg, John O. Akindoyo, Suriati Ghazali, Nitthiyah Jeyaratnam
Abstract:
Over time, several approaches have been undertaken to mitigate the challenges associated with bone regeneration. These include, but are not limited to, xenografts, allografts and autografts, as well as artificial substitutions like bioceramics, synthetic cements and metals. The former three techniques often come along with peculiar limitations and problems such as morbidity, availability, disease transmission, collateral site damage or absolute rejection by the body, as the case may be. Synthetic routes remain the only feasible alternative option for the treatment of bone defects. Hydroxyapatite (HA) is very compatible and suitable for this application. However, most of the common methods for HA synthesis are either expensive, complicated or environmentally unfriendly. Interestingly, extraction of HA from bio-waste is perceived not only to be cost effective, but also environmentally friendly. In this research, HA was synthesized from a bio-waste, namely bovine bones, through three different methods: hydrothermal chemical processing, ultrasound-assisted synthesis and ordinary calcination. Structure and property analysis of the HA was carried out through different characterization techniques such as TGA, FTIR, and XRD. All the methods applied were able to produce HA with compositional properties similar to the biomaterials found in human calcified tissues. The calcination process was, however, observed to be more efficient as it eliminated all the organic components from the produced HA. The HA synthesized is unique for its minimal cost and environmental friendliness. It is also perceived to be suitable for tissue and bone engineering applications.
Keywords: hydroxyapatite, bone, calcination, biowaste
Procedia PDF Downloads 247
14521 Adopted Method of Information System Strategy for Knowledge Management System: A Literature Review
Authors: Elin Cahyaningsih, Dana Indra Sensuse, Wahyu Catur Wibowo, Sofiyanti Indriasari
Abstract:
The bureaucracy reform program drives the Indonesian government to change its management and supporting units in order to enhance organizational performance. Information technology, as one of the supporting units, became part of the strategic plan that the organization tried to improve, because IT can automate and speed up processes and make the business process life cycle more effective and efficient. A knowledge management system is a technology application for supporting knowledge management implementation in government, whose requirements are based on the problems and potential functionality of each knowledge management process. Defining knowledge management that is suitable for each organization is difficult, which is why the knowledge management system strategy should be made as an alignment of the knowledge management processes in the organization. A knowledge management system is a type of information system development from a people perspective, because this system has a high dependency on human interaction and participation. The strategic plan for developing a knowledge management system can be determined using several information system strategy methods. This research was conducted to define the types of strategic methods for information systems, the stages of activity in each method, and the methods' strengths and weaknesses. The authors used literature review methods to identify and classify strategic methods of information systems in order to differentiate method types and categorize common activities, strengths and weaknesses. The result of this research is the determination and comparison of six strategic information system methods: Balanced Scorecard, Five Forces (Porter), SWOT analysis, Value Chain Analysis, Risk Analysis and Gap Analysis. Balanced Scorecard and Risk Analysis are believed to be the most commonly used strategic methods and to have the greatest strengths.
Keywords: knowledge management system, balanced scorecard, five force, risk analysis, gap analysis, value chain analysis, SWOT analysis
Procedia PDF Downloads 476
14520 A Survey of Discrete Facility Location Problems
Authors: Z. Ulukan, E. Demircioğlu
Abstract:
Facility location is a complex real-world problem which needs a strategic management decision. This paper provides a general review of studies, efforts and developments in facility location problems, which are classical optimization problems having widespread applications in various areas such as transportation, distribution, production, supply chain decisions and telecommunication. Our goal is not to review all variants of the different studies in FLPs or to describe very detailed computational techniques and solution approaches, but rather to provide a broad overview of the major location problems that have been studied, indicating how they are formulated and what researchers have proposed to tackle the problem. A brief, elucidative table based on a grouping according to “General Problem Type” and “Methods Proposed” used in the studies is also presented at the end of the work.
Keywords: discrete location problems, exact methods, heuristic algorithms, single source capacitated facility location problems
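As an illustration of how such problems are typically formulated, the classic uncapacitated facility location problem can be written as the following integer program. This is a standard textbook form given for orientation only, not a formulation taken from the surveyed papers.

```latex
\begin{aligned}
\min_{x,\,y}\quad & \sum_{j \in J} f_j\, y_j \;+\; \sum_{i \in I}\sum_{j \in J} c_{ij}\, x_{ij} \\
\text{s.t.}\quad  & \sum_{j \in J} x_{ij} = 1 \quad && \forall\, i \in I
                    && \text{(each customer is assigned exactly once)} \\
                  & x_{ij} \le y_j \quad && \forall\, i \in I,\ j \in J
                    && \text{(assign only to opened facilities)} \\
                  & x_{ij},\, y_j \in \{0,1\} \quad && \forall\, i \in I,\ j \in J
\end{aligned}
```

Here $f_j$ is the fixed cost of opening facility $j$, $c_{ij}$ the cost of serving customer $i$ from facility $j$, $y_j$ indicates whether facility $j$ is opened, and $x_{ij}$ whether customer $i$ is assigned to it; capacitated variants add a constraint limiting the total demand assigned to each open facility.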
Procedia PDF Downloads 470