Search results for: grey code
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1599

1419 Frequency of Nosocomial Infections in a Tertiary Hospital in Isfahan, Iran

Authors: Zahra Tolou-Ghamari

Abstract:

Objective: Health care-associated infections with multiresistant pathogens are rising globally. It is well known that nosocomial infections increase hospital stay, morbidity, mortality, and disability. Therefore, the aim of this study was to define the occurrence of nosocomial infections in a tertiary hospital in Isfahan, Iran. Materials and Methods: The data were extracted from the official database of hospital nosocomial infection records, which comprised 9152 rows. For each patient, the reported infections were coded by number, e.g., UTI-SUTI (code 55), VAE-PVAP (code 56), BSI-LCBI (code 19), SSI-DIP (code 14), and so on. Continuous variables were summarised as mean ± standard deviation and categorical variables as frequencies. Results: The study population was 5542 patients, comprising males (n=3282) and females (n=2260). With a minimum of 15 and a maximum of 99, the mean age in 5313 patients was 58.5 ± 19.1 years. The highest proportion of reported nosocomial infections (77%) was associated with ages 30-80 years. The sites of nosocomial infection (87% of cases) were: VAE-PVAP, 27.3%; VAE-IVAC, 7.7%; UTI-SUTI, 29.5%; BSI-LCBI, 12.9%; and SSI-DIP, 9.5%; other individual infections accounted for 13%, with the main pathogens being Klebsiella pneumoniae, Acinetobacter baumannii, and Staphylococcus. Conclusions: An efficient surveillance system, an antibiotic-use control policy addressing both monotherapy and polypharmacy, and advanced infection control programmes at the regional and national levels in Iran are recommended.
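
As a small illustration of the summary statistics described above, a minimal sketch (with a made-up column layout; the actual register schema is not given in the abstract):

```python
import pandas as pd

# Hypothetical extract of the hospital infection register (columns are assumed).
records = pd.DataFrame({
    "age": [64, 72, 45, 58, 81],
    "sex": ["M", "F", "M", "M", "F"],
    "infection_code": [55, 56, 19, 55, 14],  # e.g. 55 = UTI-SUTI, 56 = VAE-PVAP
})

# Continuous variable: mean ± standard deviation.
print(f"age: {records['age'].mean():.1f} ± {records['age'].std():.1f} years")

# Categorical variables: frequency tables.
print(records["sex"].value_counts())
print(records["infection_code"].value_counts(normalize=True) * 100)  # percent
```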

Keywords: infection, nosocomial, ventilator, blood stream, Isfahan, Iran

Procedia PDF Downloads 77
1418 Assessment of Ultra-High Cycle Fatigue Behavior of EN-GJL-250 Cast Iron Using Ultrasonic Fatigue Testing Machine

Authors: Saeedeh Bakhtiari, Johannes Depessemier, Stijn Hertelé, Wim De Waele

Abstract:

High cycle fatigue, comprising up to 10⁷ load cycles, has been the subject of many studies, and the behavior of many materials has been recorded adequately in this regime. However, many applications involve larger numbers of load cycles during the lifetime of machine components. In this ultra-high cycle regime, other failure mechanisms come into play, and the concept of a fatigue endurance limit (assumed for materials such as steel) is often an oversimplification of reality. When machine component design demands a high geometrical complexity, cast iron grades become interesting candidate materials. Grey cast iron is known for its low cost, high compressive strength, and good damping properties. However, the ultra-high cycle fatigue behavior of cast iron is poorly documented. The current work focuses on the ultra-high cycle fatigue behavior of EN-GJL-250 (GG25) grey cast iron by developing an ultrasonic (20 kHz) fatigue testing system. Moreover, the testing machine is instrumented to measure the temperature and the displacement of the specimen, and to control the temperature. The high resonance frequency made it possible to assess the behavior of the cast iron of interest for ultra-high numbers of cycles within a matter of days, and to repeat the tests to quantify the natural scatter in fatigue resistance.
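
For orientation (not a figure from the paper), a quick calculation of how fast cycles accumulate at the 20 kHz test frequency:

```python
# Time to accumulate a given number of cycles at a 20 kHz test frequency.
freq_hz = 20_000
for n_cycles in (1e7, 1e9, 1e10):
    seconds = n_cycles / freq_hz
    print(f"{n_cycles:.0e} cycles -> {seconds / 3600:.1f} h ({seconds / 86400:.2f} days)")
# 1e9 cycles take about 14 h; 1e10 cycles take about 5.8 days.
```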

Keywords: GG25, cast iron, ultra-high cycle fatigue, ultrasonic test

Procedia PDF Downloads 174
1417 Mobile Platform’s Attitude Determination Based on Smoothed GPS Code Data and Carrier-Phase Measurements

Authors: Mohamed Ramdani, Hassen Abdellaoui, Abdenour Boudrassen

Abstract:

Mobile platform attitude estimation approaches are mainly based on combined positioning techniques and dedicated algorithms that aim to reach a fast and accurate solution. In this work, we describe the design and the implementation of an attitude determination (AD) process using only measurements from GPS sensors. The approach is based on GPS code data smoothed with a Hatch filter and raw carrier-phase measurements, integrated into an attitude algorithm based on vector measurements using the least-squares (LSQ) estimation method. A GPS dataset from a static experiment is used to investigate the effectiveness of the presented approach and, consequently, to check the accuracy of the attitude estimation algorithm. Attitude results from a GPS multi-antenna setup over short baselines are introduced and analyzed. The 3D accuracy of the attitude parameters estimated from the smoothed measurements is of the order of 0.27°.
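
As an illustration of the code-smoothing step, a minimal Hatch (carrier-smoothed code) filter sketch; variable names and the smoothing window are illustrative assumptions, not values from the paper:

```python
# Minimal Hatch (carrier-smoothed code) filter sketch; pseudorange and carrier
# phase are both assumed to be expressed in metres and free of cycle slips.
def hatch_filter(pseudorange, carrier_phase, window=100):
    smoothed = [pseudorange[0]]
    for k in range(1, len(pseudorange)):
        n = min(k + 1, window)                      # growing weight, capped at the window
        predicted = smoothed[-1] + (carrier_phase[k] - carrier_phase[k - 1])
        smoothed.append(pseudorange[k] / n + predicted * (n - 1) / n)
    return smoothed
```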

Keywords: attitude determination, GPS code data smoothing, hatch filter, carrier-phase measurements, least-squares attitude estimation

Procedia PDF Downloads 153
1416 Effect of Modeling of Hydraulic Form Loss Coefficient to Break on Emergency Core Coolant Bypass

Authors: Young S. Bang, Dong H. Yoon, Seung H. Yoo

Abstract:

Emergency Core Coolant Bypass (ECC bypass) has been regarded as an important phenomenon affecting the peak cladding temperature in large-break loss-of-coolant accidents (LBLOCA) in nuclear power plants (NPP). A modeling scheme to address the ECC bypass phenomena and an LBLOCA calculation using that scheme are discussed in the present paper. The hydraulic form loss coefficient (HFLC) from the reactor vessel downcomer to the broken cold leg is predicted by a computational fluid dynamics (CFD) code for a range of void fractions of the flow incoming from the downcomer. The maximum, mean, and minimum values of the HFLC are derived from the CFD results and are incorporated into the LBLOCA calculation using a system thermal-hydraulic code, MARS-KS. The HFLC to the break and its range are proposed as a relevant parameter for addressing the ECC bypass phenomena.
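
For reference, a form loss coefficient is conventionally obtained from a pressure drop as K = 2Δp/(ρv²); a minimal post-processing sketch (all CFD numbers below are made up, purely to illustrate taking the maximum, mean, and minimum over a void-fraction sweep):

```python
# Illustrative post-processing of CFD results: form loss coefficient K = 2*dp/(rho*v^2)
# evaluated for several downcomer void fractions (all numbers are invented).
cases = [
    {"void_fraction": 0.1, "dp_pa": 5.2e3, "rho": 740.0, "v": 3.1},
    {"void_fraction": 0.5, "dp_pa": 3.4e3, "rho": 410.0, "v": 4.6},
    {"void_fraction": 0.9, "dp_pa": 1.1e3, "rho": 90.0, "v": 8.2},
]
k_values = [2.0 * c["dp_pa"] / (c["rho"] * c["v"] ** 2) for c in cases]
print(f"K max/mean/min = {max(k_values):.2f} / {sum(k_values)/len(k_values):.2f} / {min(k_values):.2f}")
```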

Keywords: CFD analysis, ECC bypass, hydraulic form loss coefficient, system thermal-hydraulic code

Procedia PDF Downloads 229
1415 Establish a Company in Turkey for Foreigners

Authors: Mucahit Unal, Ibrahim Arslan

Abstract:

The new Turkish Commercial Code (TCC) No. 6102 was published in the Official Gazette on February 14, 2011. As stated in the new Turkish Commercial Code No. 6102 and Law No. 6103 on the Validity and Application of the Turkish Commercial Code, the TCC came into effect on July 1, 2012. The basic purpose of the TCC is to establish corporate governance coherent with international standards; to provide transparency in company management; to align the Turkish Commercial Code rules with European Union legislation; and to simplify establishing a company for foreign investors so as to attract investment to the Turkish market. In this context, according to the TCC, joint stock companies and limited liability companies can be established with only one single shareholder; that single shareholder can be a foreigner; all board of directors members can be foreigners; and all shareholders and board members can be non-resident foreigners. Additionally, the TCC does not require physical participation in general shareholders' and board members' meetings. The TCC allows general shareholders' and board members' meetings to be held in electronic form, and resolutions of these meetings may also be approved via electronic signatures. Through this amendment, foreign investors no longer have to deal with red tape. This amendment also means the TCC prevents foreign companies from incurring unnecessary travel expenses. In accordance with all these amendments to the TCC, investing in the Turkish market is easy, simple, and transparent for foreign investors, and investors can establish a company in Turkey irrespective of nationality or place of residence. This article aims to analyze ‘Establish a Company in Turkey for Foreigners’ and to inform investors about investing in (especially establishing a company in) the Turkish market.

Keywords: establish a company, foreigner investors, invest in Turkish market, Turkish commercial code

Procedia PDF Downloads 263
1414 Investigating the Use of English-Arabic Codeswitching in EFL Classroom Oral Discourse. Case Study: Middle School Pupils of Ain Fekroun, Wilaya of Oum El Bouaghi, Algeria

Authors: Fadila Hadjeris

Abstract:

The study aims at investigating the functions of English-Arabic code switching in English as a foreign language (EFL) classroom oral discourse and the extent to which they can contribute to the flow of classroom interaction. It also seeks to understand the views, beliefs, and perceptions of teachers and learners towards this practice. We hypothesized that code switching is a communicative strategy that facilitates classroom interaction and that, for this reason, both teachers and learners support its use. The study draws on a key body of literature in bilingualism, second language acquisition, and classroom discourse in an attempt to provide a framework for considering the research questions. It employs a combination of qualitative and quantitative research methods, which include classroom observations and questionnaires. The analysis of the recordings shows that teachers' code switching to Arabic is not only used for academic and classroom management reasons; rather, the data display instances in which code switching is used for social reasons. The analysis of the questionnaires indicates that teachers and pupils have different attitudes towards this phenomenon. Teachers reported their deliberate switching during EFL teaching, yet the majority were against this practice. According to them, the use of the mother tongue has detrimental effects on the acquisition and practice of the target language. In contrast, pupils showed a preference for their teachers' code switching because it enhances and facilitates their understanding. These findings support the view that the shift to pupils' mother tongue is a strategy which aids and facilitates the teaching and learning of the target language. This, in turn, informs the recommendations suggested to teachers and course designers.

Keywords: bilingualism, codeswitching, classroom interaction, classroom discourse, EFL learning/ teaching, SLA

Procedia PDF Downloads 476
1413 Bit Error Rate (BER) Performance of Coherent Homodyne BPSK-OCDMA Network for Multimedia Applications

Authors: Morsy Ahmed Morsy Ismail

Abstract:

In this paper, the structure of a coherent homodyne receiver for a Binary Phase Shift Keying (BPSK) Optical Code Division Multiple Access (OCDMA) network is introduced, based on the Multi-Length Weighted Modified Prime Code (ML-WMPC), for multimedia applications. The Bit Error Rate (BER) of this homodyne detection is evaluated as a function of the number of active users and the signal-to-noise ratio for different code lengths according to the multimedia application, such as audio, voice, and video. In addition, a Mach-Zehnder interferometer is used as an external phase modulator in the homodyne detection. Furthermore, the Multiple Access Interference (MAI) and the receiver noise in a shot-noise-limited regime are taken into consideration in the BER calculations.
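
To make the BER-versus-users trade-off concrete, a generic sketch using the common Gaussian approximation for MAI in a BPSK CDMA system (a textbook-style approximation, not the paper's ML-WMPC analysis; the code length and Eb/N0 values are arbitrary):

```python
import math

def q_function(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def bpsk_cdma_ber(ebno_db, n_users, code_length):
    """Generic Gaussian approximation: MAI from (n_users - 1) interferers is
    treated as extra noise with variance ~ (n_users - 1) / (3 * code_length)."""
    ebno = 10 ** (ebno_db / 10)
    mai_var = (n_users - 1) / (3 * code_length)
    effective_snr = 1.0 / (1.0 / (2.0 * ebno) + mai_var)
    return q_function(math.sqrt(effective_snr))

for users in (5, 15, 30):
    print(users, f"{bpsk_cdma_ber(ebno_db=10, n_users=users, code_length=127):.2e}")
```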

Keywords: OCDMA networks, bit error rate, multiple access interference, binary phase-shift keying, multimedia

Procedia PDF Downloads 174
1412 Performance Comparison of Non-Binary RA and QC-LDPC Codes

Authors: Ni Wenli, He Jing

Abstract:

Repeat-Accumulate (RA) codes are a subclass of LDPC codes with fast encoder structures. In this paper, we consider a non-binary extension of binary LDPC codes over GF(q) and construct a non-binary RA code and a non-binary QC-LDPC code over GF(2^4): the non-binary RA codes are constructed with a linear encoding method, and the non-binary QC-LDPC codes with an algebraic construction method. The BER performance of the RA and QC-LDPC codes over GF(q) is then compared under BP decoding by simulation over the additive white Gaussian noise (AWGN) channel.
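
For readers unfamiliar with non-binary symbols, a minimal sketch of arithmetic in GF(2^4); the primitive polynomial x^4 + x + 1 is a common choice but an assumption here, since the paper does not state which one it uses:

```python
# Minimal GF(2^4) arithmetic sketch using the primitive polynomial x^4 + x + 1
# (an assumed, commonly used choice). Symbols are 4-bit values 0..15.
PRIM_POLY = 0b10011  # x^4 + x + 1

def gf16_mul(a, b):
    result = 0
    while b:
        if b & 1:
            result ^= a
        b >>= 1
        a <<= 1
        if a & 0b10000:          # degree reached 4: reduce modulo the primitive polynomial
            a ^= PRIM_POLY
    return result

# Addition in GF(2^4) is plain XOR.
print(gf16_mul(0b0110, 0b1011))  # example product of two field elements
```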

Keywords: non-binary RA codes, QC-LDPC codes, performance comparison, BP algorithm

Procedia PDF Downloads 375
1411 Tertiary Level Teachers' Beliefs about Codeswitching

Authors: Hoa Pham

Abstract:

Code switching, which can be described as the use of students’ first language in second language classrooms, has long been a controversial topic in the area of language teaching and second language acquisition. While this has been widely investigated across different contexts, little empirical research has been undertaken in Vietnam. The findings of this study contribute to our understanding of bilingual discourse and code switching practices in content and language integrated classrooms, which has significant implications for language teaching and learning in general and in particular for language pedagogy at tertiary level in Vietnam. This study examines the accounts the teachers articulated for their code switching practices in content-based Business English in Vietnam. Data were collected from five teachers through the use of stimulated recall interviews facilitated by the video data to garner the teachers' cognitive reflection, and allowed them to vocalise the motivations behind their code switching behaviour in particular contexts. The literature has recommended that when participants are provided with a large amount of stimuli or cues, they will experience an original situation again in their imagination with great accuracy. This technique can also provide a valuable "insider" perspective on the phenomenon under investigation which complements the researcher’s "outsider" observation. This can create a relaxed atmosphere during the interview process, which in turn promotes the collection of rich and diverse data. Also, participants can be empowered by this technique as they can raise their own concerns and discuss instances which they find important or interesting. The data generated through this study were analysed using a constant comparative approach. The study found that the teachers indicated their support for the use of code switching in their pedagogical practices. Particularly, as a pedagogical resource, the teachers saw code switching to the L1 playing a key role in facilitating the students' comprehension of both content knowledge and the target language. They believed the use of the L1 accommodates the students' current language competence and content knowledge. They also expressed positive opinions about the role that code switching plays in stimulating students' schematic language and content knowledge, encouraging retention and interest in learning and promoting a positive affective environment in the classroom. The teachers perceived that their use of code switching to the L1 helps them meet the students' language needs and prepares them for their study in subsequent courses and addresses functional needs so that students can cope with English language use outside the classroom. Several factors shaped the teachers' perceptions of their code switching practices, including their accumulated teaching experience, their previous experience as language learners, their theoretical understanding of language teaching and learning, and their knowledge of the teaching context. Code switching was a typical phenomenon in the observed classes and was supported by the teachers in certain contexts. This study reinforces the call in the literature to recognise this practice as a useful instructional resource.

Keywords: codeswitching, language teaching, teacher beliefs, tertiary level

Procedia PDF Downloads 449
1410 GPU Based High Speed Error Protection for Watermarked Medical Image Transmission

Authors: Md Shohidul Islam, Jongmyon Kim, Ui-pil Chong

Abstract:

Medical images are an integral part of e-health care and e-diagnosis systems. Medical image watermarking is widely used to protect patients' information from malicious alteration and manipulation. The watermarked medical images are transmitted over the internet among patients and primary and referred physicians. The images are highly prone to corruption in the wireless transmission medium due to various noise sources, deflection, and refraction. Distortion in the received images leads to faulty watermark detection and inappropriate disease diagnosis. To address this issue, this paper incorporates an error correction code (ECC), the (8, 4) Hamming code, into an existing watermarking system. In addition, we implement the computationally complex ECC on a graphics processing unit (GPU) to accelerate it and support real-time requirements. Experimental results show that the GPU achieves considerable speedup over the sequential CPU implementation, while maintaining 100% ECC efficiency.
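
As a reference point for the coding scheme, a minimal sketch of an (8, 4) extended Hamming code (a (7, 4) codeword plus an overall parity bit); the bit ordering is one common convention, not necessarily the one used in the paper's GPU kernels:

```python
# Sketch of an (8,4) extended Hamming encoder and single-error syndrome.
def hamming84_encode(d):              # d = [d1, d2, d3, d4], each 0 or 1
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    codeword = [p1, p2, d[0], p3, d[1], d[2], d[3]]
    codeword.append(sum(codeword) % 2)  # overall parity -> detects double errors
    return codeword

def hamming84_syndrome(c):
    """Syndrome (1..7) gives the position of a single flipped bit in c[0:7]."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    return s1 + 2 * s2 + 4 * s3

cw = hamming84_encode([1, 0, 1, 1])
cw[4] ^= 1                             # inject a single-bit error
print("error at position", hamming84_syndrome(cw))   # -> 5
```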

Keywords: medical image watermarking, e-health system, error correction, Hamming code, GPU

Procedia PDF Downloads 289
1409 Qualitative Evidence of the Markedness of Code Switching during Commercial Bank Service Encounters in Ìbàdàn Metropolis

Authors: A. Robbin

Abstract:

In a multilingual setting like Nigeria, the success of service encounters is enhanced by the use of a language that meets the linguistic and persuasive demands of the interlocutors. This study examined motivations for code switching as a negotiation strategy in bank-hall desk service encounters in Ìbàdàn metropolis, using Myers-Scotton's work on markedness in language use. The data consisted of transcribed audio recordings of bank-hall service encounters and direct observation of bank interactions in two purposively sampled commercial banks in Ìbàdàn metropolis. The data were subjected to descriptive linguistic analysis using Myers-Scotton's Markedness Model. Findings reveal that code switching is frequently employed during different stages of the service encounter (greeting, transaction, and closing) to fulfil relational, bargaining, and referential functions. Bank staff and customers code switch to make unmarked, marked, and explanatory choices: as a strategy to identify with a customer's cultural affiliation, close the status gap, and appeal to an aggrieved customer, or as an explanatory choice with non-literate customers for ease of communication. Bankers select English to maintain customers' perceptions of prestige, which is retained or diverged from depending on their linguistic preference or ability. Yoruba is seen as an efficient negotiation strategy by both bankers and their customers, who make choices within the conversation to achieve the desired conversational and functional aims.

Keywords: banking, bilingualism, code-switching, markedness, service encounter

Procedia PDF Downloads 205
1408 Myanmar Character Recognition Using Eight Direction Chain Code Frequency Features

Authors: Kyi Pyar Zaw, Zin Mar Kyu

Abstract:

Character recognition is the process of converting a text image file into an editable and searchable text file. Feature extraction is the heart of any character recognition system, and the character recognition rate may be low or high depending on the extracted features. In this paper, 25 features per character are used for recognition. Character recognition basically involves three steps: character segmentation, feature extraction, and classification. In the segmentation step, a horizontal cropping method is used for line segmentation and a vertical cropping method for character segmentation. In the feature extraction step, features are extracted in two ways. First, 8 features are extracted from the entire input character using eight-direction chain code frequency extraction. Second, the input character is divided into 16 blocks; for each block, 8 feature values are obtained through the eight-direction chain code frequency extraction method, and the sum of these 8 values is defined as a single feature for that block, yielding 16 further features. The number-of-holes feature is used to cluster similar characters. With these features, almost all common Myanmar characters in various font sizes can be recognized. All 25 features are used in both the training and the testing parts. In the classification step, characters are classified by matching all features of the input character with the already trained character features.
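
A minimal sketch of the eight-direction chain code frequency idea for a single glyph (contour extraction and the 16-block split are omitted; the tiny square contour is purely illustrative):

```python
# Eight-direction chain code frequency features for one glyph: directions 0..7
# between successive contour pixels, normalised into a histogram.
DIRECTIONS = {(1, 0): 0, (1, -1): 1, (0, -1): 2, (-1, -1): 3,
              (-1, 0): 4, (-1, 1): 5, (0, 1): 6, (1, 1): 7}

def chain_code_histogram(contour):
    """contour: ordered list of (x, y) boundary pixels of the character."""
    counts = [0] * 8
    for (x0, y0), (x1, y1) in zip(contour, contour[1:]):
        counts[DIRECTIONS[(x1 - x0, y1 - y0)]] += 1
    total = sum(counts) or 1
    return [c / total for c in counts]          # 8 frequency features

# Tiny illustrative contour (a small square traced clockwise).
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1), (0, 0)]
print(chain_code_histogram(square))
```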

Keywords: chain code frequency, character recognition, feature extraction, features matching, segmentation

Procedia PDF Downloads 319
1407 The Connection between the Schwartz Theory of Basic Values and Ethical Principles in Clinical Psychology

Authors: Matej Stritesky

Abstract:

The research deals with the connection between the Schwartz Theory of Basic Values and the ethical principles in psychology on which the meta-code of ethics of the European Federation of Psychological Associations is based. The research focuses on ethically problematic situations in clinical psychology in the Czech Republic. Based on an analysis of papers that identified ethically problematic situations faced by clinical psychologists, a questionnaire of ethically problematic situations in clinical psychology (EPSCP) was created for the purposes of the research. The questionnaire was created to represent situations that correspond to the four principles on which the meta-code of ethics of the European Federation of Psychological Associations is based. The EPSCP questionnaire consists of descriptions of 32 situations that respondents evaluate on a scale from 1 (the psychologist's behaviour is ethically perfectly fine) to 10 (the psychologist's behaviour is ethically completely unacceptable). The EPSCP questionnaire, together with Schwartz's PVQ questionnaire, will be presented to 60 psychology students. The relationship between the principles in clinical psychology and the values on Schwartz's value continuum will be described using multidimensional scaling. A positive correlation is assumed between the higher-order value of openness to change and problematic ethical situations related to the principle of integrity; a positive correlation between the higher-order value of self-transcendence and the principle of respect and responsibility; a positive correlation between the higher-order value of conservation and the principle of competence; and a negative correlation between the higher-order value of ego strengthening and sensitivity to ethically problematic situations. The research also includes an experimental part. The first half of the students are presented with the code of ethics of the Czech Association of Clinical Psychologists before completing the questionnaires, and the second half are presented with the code of ethics after completing the questionnaires. In addition to reading the code of ethics, students describe the three rules of the code of ethics that they consider most important and state why they chose these rules. The output of the experimental part will be to determine whether presentation of the code of ethics leads to greater sensitivity to ethically problematic situations.

Keywords: clinical psychology, ethically problematic situations in clinical psychology, ethical principles in psychology, Schwartz theory of basic values

Procedia PDF Downloads 112
1406 Grey Wolf Optimization Technique for Predictive Analysis of Products in E-Commerce: An Adaptive Approach

Authors: Shital Suresh Borse, Vijayalaxmi Kadroli

Abstract:

E-commerce industries nowadays implement the latest AI and ML techniques to improve their performance and prediction accuracy, which helps them gain a large profit from the online market. Ant Colony Optimization, genetic algorithms, Particle Swarm Optimization, neural networks, and GWO help many e-commerce industries upgrade their predictive performance. These algorithms provide optimum results in various applications, such as stock price prediction, prediction of drug-target interactions, and user ratings of similar products on e-commerce sites. In this study, customer reviews play an important role in the prediction analysis, since people show much interest in buying services and products suggested by other customers, which ultimately increases net profit. In this work, a convolutional neural network (CNN) is proposed, which is further used to optimize the prediction accuracy of an e-commerce website. In this method, the CNN is used to optimize the hyperparameters of the GWO algorithm using an appropriate coding scheme. The model's results are verified by comparing them to PSO results whose hyperparameters have been optimized by the CNN on Amazon's customer review dataset. The experimental outcome shows that the proposed system using the GWO algorithm achieves superior performance in terms of accuracy, precision, recall, etc., in the prediction analysis compared to existing systems.
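
For context, a minimal sketch of the core Grey Wolf Optimizer update (shown here on a generic sphere objective; in the paper's setting the decision variables would instead encode model hyperparameters):

```python
import random

# Minimal Grey Wolf Optimizer sketch for a generic continuous objective.
def sphere(x):
    return sum(v * v for v in x)

def gwo(objective, dim=5, n_wolves=20, iters=100, lo=-5.0, hi=5.0):
    wolves = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_wolves)]
    for t in range(iters):
        wolves.sort(key=objective)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]   # three leaders
        a = 2.0 * (1.0 - t / iters)                            # decreases from 2 to 0
        for i in range(n_wolves):
            new_pos = []
            for d in range(dim):
                candidates = []
                for leader in (alpha, beta, delta):
                    r1, r2 = random.random(), random.random()
                    A, C = 2 * a * r1 - a, 2 * r2
                    D = abs(C * leader[d] - wolves[i][d])
                    candidates.append(leader[d] - A * D)
                new_pos.append(min(hi, max(lo, sum(candidates) / 3.0)))
            wolves[i] = new_pos
    wolves.sort(key=objective)
    return wolves[0]

best = gwo(sphere)
print("best value:", sphere(best))
```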

Keywords: prediction analysis, e-commerce, machine learning, grey wolf optimization, particle swarm optimization, CNN

Procedia PDF Downloads 112
1405 Detecting Heartbeat Architectural Tactic in Source Code Using Program Analysis

Authors: Ananta Kumar Das, Sujit Kumar Chakrabarti

Abstract:

Architectural tactics such as heartbeat, ping-echo, encapsulate, and encrypt-data are techniques used to achieve quality attributes of a system. Detecting architectural tactics has several benefits: it can aid system comprehension (e.g., of legacy systems) and the estimation of quality attributes such as safety, security, maintainability, etc. Architectural tactics are typically spread over the source code and are implicit; for large codebases, manual detection is often not feasible. Therefore, there is a need for automated methods of detecting architectural tactics. This paper presents a formalization of the heartbeat architectural tactic and a program-analytic approach to detect this tactic in source code. The proposed method is evaluated on a set of Java applications. The outcome of the experiment strongly suggests that the method compares well with a manual approach in terms of sensitivity and specificity, and far surpasses a manual exercise in terms of scalability.
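
As a toy analogue of the idea (the paper formalizes the tactic for Java and uses alias analysis; the sketch below works on Python ASTs, and the indicator names are assumptions), flag a function as heartbeat-like if a loop in it both sends a message and sleeps:

```python
import ast

SEND_NAMES = {"send", "ping", "emit_heartbeat"}      # assumed indicator names
SLEEP_NAMES = {"sleep"}

def calls_in(node):
    return {n.func.attr if isinstance(n.func, ast.Attribute) else getattr(n.func, "id", "")
            for n in ast.walk(node) if isinstance(n, ast.Call)}

def looks_like_heartbeat(source):
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.While, ast.For)):
            names = calls_in(node)
            if names & SEND_NAMES and names & SLEEP_NAMES:
                return True
    return False

sample = """
import time
def monitor(ch):
    while True:
        ch.send(b"alive")
        time.sleep(5)
"""
print(looks_like_heartbeat(sample))   # True
```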

Keywords: software architecture, architectural tactics, detecting architectural tactics, program analysis, AST, alias analysis

Procedia PDF Downloads 158
1404 Prompt Design for Code Generation in Data Analysis Using Large Language Models

Authors: Lu Song Ma Li Zhi

Abstract:

With the rapid advancement of artificial intelligence technology, large language models (LLMs) have become a milestone in the field of natural language processing, demonstrating remarkable capabilities in semantic understanding, intelligent question answering, and text generation. These models are gradually penetrating various industries, particularly showcasing significant application potential in the data analysis domain. However, retraining or fine-tuning these models requires substantial computational resources and ample downstream task datasets, which poses a significant challenge for many enterprises and research institutions. Without modifying the internal parameters of the large models, prompt engineering techniques can rapidly adapt these models to new domains. This paper proposes a prompt design strategy aimed at leveraging the capabilities of large language models to automate the generation of data analysis code. By carefully designing prompts, data analysis requirements can be described in natural language, which the large language model can then understand and convert into executable data analysis code, thereby greatly enhancing the efficiency and convenience of data analysis. This strategy not only lowers the threshold for using large models but also significantly improves the accuracy and efficiency of data analysis. Our approach includes requirements for the precision of natural language descriptions, coverage of diverse data analysis needs, and mechanisms for immediate feedback and adjustment. Experimental results show that with this prompt design strategy, large language models perform exceptionally well in multiple data analysis tasks, generating high-quality code and significantly shortening the data analysis cycle. This method provides an efficient and convenient tool for the data analysis field and demonstrates the enormous potential of large language models in practical applications.
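
To make the strategy tangible, a minimal prompt-template sketch for LLM-driven data-analysis code generation; the structure, field names, and example dataset are assumptions for illustration, not the paper's exact design:

```python
# Illustrative prompt template: a natural-language analysis request plus a schema
# description, formatted into a single prompt string for an LLM.
PROMPT_TEMPLATE = """You are a data analysis assistant. Write Python (pandas) code only.

Dataset description:
{schema}

Analysis request (natural language):
{request}

Constraints:
- Use only columns listed in the dataset description.
- Return a single runnable code block, no explanations.
"""

def build_prompt(schema: str, request: str) -> str:
    return PROMPT_TEMPLATE.format(schema=schema, request=request)

prompt = build_prompt(
    schema="orders.csv: order_id (int), region (str), amount (float), date (YYYY-MM-DD)",
    request="Plot monthly total sales per region for 2023.",
)
print(prompt)   # this string would then be sent to the chosen LLM API
```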

Keywords: large language models, prompt design, data analysis, code generation

Procedia PDF Downloads 37
1403 Simulation of Reflectometry in Alborz Tokamak

Authors: S. Kohestani, R. Amrollahi, P. Daryabor

Abstract:

Microwave diagnostics such as reflectometry are receiving growing attention in magnetic confinement fusion research. In order to obtain a better understanding of plasma confinement physics, more detailed measurements of the density profile and its fluctuations might be required. A 2D full-wave simulation of ordinary mode propagation has been written in an effort to model effects seen in reflectometry experiments. The code uses the finite-difference time-domain method with a perfectly matched layer absorption boundary to solve Maxwell's equations. The code has been used to simulate the reflectometer measurement in the Alborz Tokamak.
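
To illustrate the numerical core, a bare-bones 1D FDTD (Yee) update in vacuum; the plasma current response, the 2D geometry, and the perfectly matched layer of the actual code are all omitted, and the grid and source parameters are arbitrary:

```python
import numpy as np

# Bare-bones 1-D FDTD (Yee) update for an O-mode-like wave in vacuum.
c0, nx, nt = 3e8, 400, 800
dx = 1e-3
dt = 0.5 * dx / c0                       # Courant-stable time step
ez = np.zeros(nx)
hy = np.zeros(nx - 1)

for n in range(nt):
    hy += dt / (4e-7 * np.pi * dx) * (ez[1:] - ez[:-1])          # update H from curl E
    ez[1:-1] += dt / (8.854e-12 * dx) * (hy[1:] - hy[:-1])       # update E from curl H
    ez[50] += np.sin(2 * np.pi * 30e9 * n * dt)                  # 30 GHz soft source

print("peak |Ez| after propagation:", np.abs(ez).max())
```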

Keywords: reflectometry, simulation, ordinary mode, tokamak

Procedia PDF Downloads 420
1402 Redefining Infrastructure as Code Orchestration Using AI

Authors: Georges Bou Ghantous

Abstract:

This research delves into the transformative impact of Artificial Intelligence (AI) on Infrastructure as Code (IaaC) practices, specifically focusing on the redefinition of infrastructure orchestration. By harnessing AI technologies such as machine learning algorithms and predictive analytics, organizations can achieve unprecedented levels of efficiency and optimization in managing their infrastructure resources. AI-driven IaaC introduces proactive decision-making through predictive insights, enabling organizations to anticipate and address potential issues before they arise. Dynamic resource scaling, facilitated by AI, ensures that infrastructure resources can seamlessly adapt to fluctuating workloads and changing business requirements. Through case studies and best practices, this paper sheds light on the tangible benefits and challenges associated with AI-driven IaaC transformation, providing valuable insights for organizations navigating the evolving landscape of digital infrastructure management.

Keywords: artificial intelligence, infrastructure as code, efficiency optimization, predictive insights, dynamic resource scaling, proactive decision-making

Procedia PDF Downloads 32
1401 Feature Engineering Based Detection of Buffer Overflow Vulnerability in Source Code Using Deep Neural Networks

Authors: Mst Shapna Akter, Hossain Shahriar

Abstract:

One of the most important challenges in the field of software code audit is the presence of vulnerabilities in software source code. Every year, more and more software flaws are found, either internally in proprietary code or revealed publicly. These flaws are highly likely to be exploited and can lead to system compromise, data leakage, or denial of service. C and C++ open-source code is now available for creating a large-scale, machine-learning system for function-level vulnerability identification. We assembled a sizable dataset of millions of open-source functions that point to potential exploits. We developed an efficient and scalable vulnerability detection method based on deep neural network models that learn features extracted from the source code. The source code is first converted into a minimal intermediate representation to remove pointless components and shorten dependencies. Moreover, we retain the semantic and syntactic information using state-of-the-art word embedding algorithms such as GloVe and fastText. The embedded vectors are subsequently fed into deep learning models such as LSTM, BiLSTM, LSTM-Autoencoder, word2vec, BERT, and GPT-2 to classify the possible vulnerabilities. Furthermore, we propose a neural network model which can overcome issues associated with traditional neural networks. Evaluation metrics such as F1 score, precision, recall, accuracy, and total execution time have been used to measure the performance. We made a comparative analysis between results derived from features containing a minimal text representation and those containing semantic and syntactic information. We found that all of the deep learning models provide comparatively higher accuracy when we use semantic and syntactic information as the features, but require higher execution time, as the word embedding algorithm adds some complexity to the overall system.
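
A minimal sketch of the classification stage as described (token-embedded code fed to an LSTM with a binary vulnerable/not-vulnerable head); the vocabulary size, dimensions, and tokenization are placeholders, not the paper's actual configuration:

```python
import torch
import torch.nn as nn

# Minimal LSTM classifier over token IDs of a source-code function.
class VulnLSTM(nn.Module):
    def __init__(self, vocab_size=5000, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 2)

    def forward(self, token_ids):                # token_ids: (batch, seq_len)
        x = self.embed(token_ids)
        _, (h_n, _) = self.lstm(x)               # h_n: (1, batch, hidden_dim)
        return self.head(h_n[-1])                # logits: (batch, 2)

model = VulnLSTM()
fake_batch = torch.randint(0, 5000, (8, 120))    # 8 tokenised functions, 120 tokens each
logits = model(fake_batch)
print(logits.shape)                              # torch.Size([8, 2])
```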

Keywords: cyber security, vulnerability detection, neural networks, feature extraction

Procedia PDF Downloads 88
1400 Characterization of Alloyed Grey Cast Iron Quenched and Tempered for a Smooth Roll Application

Authors: Mohamed Habireche, Nacer E. Bacha, Mohamed Djeghdjough

Abstract:

In the brick industry, smooth double-roll crushers are used for medium and fine crushing of soft to medium-hard material. Due to the opposite inward rotation of the rolls, the feed material is nipped between the rolls and crushed by compression. The rolls are subject to intense wear, known as three-body abrasion, due to the action of abrasive products. The production downtime affecting productivity stems from two sources: the bi-monthly rectification of the roll crushers and their replacement when they are completely worn out. Choosing the right material for the roll crushers should result in longer machine cycles and reduced repair and maintenance costs. All roll crushers are imported from outside Algeria, which sometimes results in very long delivery times that handicap the brickyards, in particular in meeting delivery deadlines and honouring customer orders. The aim of this work is to investigate the effect of alloying additions on the microstructure and wear behavior of grey lamellar cast iron for smooth roll crushers in the brick industry. The base grey iron was melted in a low-frequency induction furnace at a temperature of 1500 °C, in which return cast iron scrap, new cast iron ingot, and steel scrap were added to the melt to generate the desired composition. The chemical analysis of the bar samples was carried out using an Emission Spectrometer Systems PV 8050 Series (Philips), except for carbon, for which an Elementrac CS-i carbon/sulphur analyser was used. Unetched microstructures were used to evaluate the graphite flake morphology using the image comparison measurement method. At least five different fields were selected for quantitative estimation of phase constituents. The samples were observed under X100 magnification with a Zeiss Axiover T40 MAT optical microscope equipped with a digital camera. An SEM equipped with EDS was used to characterize the phases present in the microstructure. The hardness (750 kg load, 5 mm diameter ball) was measured with a Brinell testing machine for both heat-treated and as-solidified test pieces. The test bars were used for tensile strength and metallographic evaluations. Mechanical properties were evaluated using tensile specimens made as per the ASTM E8 standard; from each rod, a test piece was made for the tensile test, and two specimens were tested for each alloy. The results showed that, among the quenched and tempered alloys, the alloyed grey cast iron (containing 0.62% Mn, 0.68% Cr, and 1.09% Cu) had the best wear resistance at 400 °C, due to fine carbides in the tempered matrix. In the quenched and tempered condition, increasing the Cu content in the cast irons improved wear resistance moderately. The combined addition of Cu and Cr increases hardness and wear resistance for a quenched and tempered hypoeutectic grey cast iron.

Keywords: casting, cast iron, microstructure, heat treating

Procedia PDF Downloads 105
1399 Clinch Process Simulation Using Diffuse Elements

Authors: Benzegaou Ali, Brani Benabderrahmane

Abstract:

This work describes a numerical study of the TOX clinching process using diffuse elements. A computer code named SEMA (Static Explicit Method Analysis) has been developed to simulate the clinch joining process. The FE code is based on an updated Lagrangian scheme, and the resolution method is based on a static explicit approach. The integration of the elasto-plastic behavior law is realized with an algorithm of Simo and Taylor. The tools are represented by plane facets.

Keywords: diffuse elements, numerical simulation, clinching, contact, large deformation

Procedia PDF Downloads 362
1398 Regional Review of Outcome of Cervical Smears Reported with Cytological Features of Non Cervical Glandular Neoplasia

Authors: Uma Krishnamoorthy, Vivienne Beavers, Janet Marshall

Abstract:

Introduction: Cervical cytology showing features that raise the suspicion of non-cervical glandular neoplasia is reported as code 0 under the United Kingdom National Health Service Cervical Screening Programme (NHSCSP). As the suspicion concerns non-cervical neoplasia, the smear is reported as normal and the patient is informed that her cervical screening result is normal. The GP receives a copy of the results, in which the statement that further referral is indicated appears in small font within the text of the report. Background: Several incidents of delayed diagnosis of endometrial cancer in Lancashire prompted this North West regional review, to understand the underlying pathology outcomes of code zero smears, to raise awareness, and to consider whether further action on the wording of smear results was indicated to prevent such delays. Methodology: All smears reported at the Manchester cytology centre, which processes cytology for the Lancashire population, from March 2013 to March 2014 were reviewed, and the histological diagnoses of women whose smears were reported as code zero were reviewed retrospectively. Results: The total number of smears reported by the cytology centre during this period was approximately 109,400, of which 49 were issued with result code 0. Among the 49 women with a code zero smear, three-quarters (37) had confirmed underlying pathology of non-cervical origin. Of these, 73% (36) were due to endometrial pathology: 49% (24) endometrial carcinoma, 12% (6) polyp, 4% (2) atypical endometrial hyperplasia, 6% (3) endometrial hyperplasia without atypia, and 2% (1) adenomyosis; a further 2% (1 case) was due to ovarian adenocarcinoma. Conclusion: This review demonstrated that more than half (51%) of women with a code 0 smear report were diagnosed with underlying carcinoma, and 75% had a confirmed underlying pathology contributing to the code 0 smear findings. Recommendations and Action Plan: A local rapid-access referral and management pathway for this group of women was implemented in our unit as a result. The findings and pathway were shared with other regional units served by the cytology centre through the Pan Lancashire cervical screening board and through the cytology centre. Locally, the smear report wording was updated to include a rubber stamp/print in red bold letters stating that "URGENT REFERRAL TO GYNAECOLOGY IS INDICATED". The findings were also shared, through the Pan Lancashire board, with the national cervical screening programme board, and revisions to the wording of code zero smear reports to highlight the need for urgent referral have now been agreed at national level for implementation.

Keywords: code zero smears, endometrial cancer, non cervical glandular neoplasia, ovarian cancer

Procedia PDF Downloads 297
1397 Steady State Modeling and Simulation of an Industrial Steam Boiler

Authors: Amina Lyria Deghal Cheridi, Abla Chaker, Ahcene Loubar

Abstract:

The RELAP5 system code is one of the powerful tools used in the area of design and safety evaluation. This work aims to simulate the behavior of a radiant steam boiler under steady-state conditions using the RELAP5 system code. To perform this study, a detailed RELAP5 model was built, including all parts of the steam boiler; the control and regulation systems are also considered. To reproduce the most important parameters and phenomena with acceptable accuracy and fidelity, a thorough qualification of the facility nodalization was undertaken, consisting of a comparison between the code results and the available plant data in steady-state operation mode. The model qualification results at steady state are in good agreement with the steam boiler experimental data. The RELAP5 model of the steam boiler has proved satisfactory, and the model was capable of predicting the main thermal-hydraulic steady-state conditions of the steam boiler.

Keywords: industrial steam boiler, model qualification, natural circulation, relap5/mod3.2, steady state simulation

Procedia PDF Downloads 269
1396 Running the Athena Vortex Lattice Code in JAVA through the Java Native Interface

Authors: Paul Okonkwo, Howard Smith

Abstract:

This paper describes a methodology to integrate the Athena Vortex Lattice aerodynamic software for automated operation in a multivariate optimisation of the Blended Wing Body aircraft. The Athena Vortex Lattice code, developed at the Massachusetts Institute of Technology, allows the aerodynamic analysis of aircraft using the vortex lattice method. Ordinarily, Athena Vortex Lattice operation requires a text file containing the aircraft geometry to be loaded into the AVL solver in order to determine the aerodynamic forces and moments. However, automated operation is required to enable integration into a multidisciplinary optimisation framework. Automated AVL operation within the Java design environment nonetheless requires a modification and recompilation of the AVL source code into an executable capable of running on Windows and other platforms without the X11 libraries. This paper describes the procedure for integrating the Fortran-written AVL software for automated operation within the multivariate design synthesis optimisation framework for the conceptual design of the BWB aircraft.

Keywords: aerodynamics, automation, optimisation, AVL, JNI

Procedia PDF Downloads 565
1395 The Survey Research and Evaluation of Green Residential Building Based on the Improved Group Analytical Hierarchy Process Method in Yinchuan

Authors: Yun-na Wu, Zhen Wang

Abstract:

Due to the economic downturn and the deterioration of the living environment, the development of residential buildings, as high-energy-consuming buildings, is gradually changing from 'extensive' construction to green building in China. The evaluation system for green building is therefore continuously being improved, but the current evaluation work has the following problems: (1) there are differences between the actual investment cost and the purchasing power of residents, and the construction target of green residential building is single-objective and lacks multi-objective performance development; (2) green building evaluation lacks regional characteristics and cannot reflect the demands of residents in different regions; (3) in the process of determining criteria weights, the experts' judgment matrices often fail to meet the consistency requirement. Therefore, to solve these problems, questionnaires on green residential building were distributed in the Ningxia area, and their results feed back the purchasing power of residents and their acceptance of green building costs. Secondly, combined with the geographical features of Ningxia minority areas, the evaluation criteria system for green residential building is constructed. Finally, using the improved group AHP method and the grey clustering method, the criteria weights are determined, and a real case located in Xing Qing district, Ningxia, is evaluated. The conclusion is that the professional evaluation of this project and its good social recognition are basically consistent.
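
For reference on the weighting step, a minimal AHP sketch: criteria weights from the principal eigenvector of a pairwise comparison matrix, together with the consistency ratio the abstract alludes to (the 3x3 judgment matrix is illustrative, not the study's data):

```python
import numpy as np

# Minimal AHP weighting sketch with Saaty's consistency check.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)            # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # random index (Saaty)
print("weights:", np.round(weights, 3), "CR:", round(ci / ri, 3))   # CR < 0.1 is acceptable
```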

Keywords: evaluation, green residential building, grey clustering method, group AHP

Procedia PDF Downloads 397
1394 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning

Authors: Arun Sanjel, Greg Speegle

Abstract:

Every second, massive amounts of data are generated, and Data Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing them. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can be time-consuming and prone to errors. The automated conversion of a sequential program to a DISC program would consequently improve productivity significantly. However, synthesizing a user's intended program from an input specification is complex, with several important applications, such as distributed program synthesis and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique to identify sequential components and translate them to equivalent distributed operations. We emphasize using reinforcement learning and unit testing as feedback mechanisms to achieve our objectives.
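
As a concrete picture of the translation target (this is illustrative only, not output of the ROOP system), a sequential fragment and its map/reduce equivalent, which is the form a DISC framework such as Spark executes in parallel:

```python
from functools import reduce
from operator import add

values = list(range(1_000))

# Sequential fragment.
total = 0
for x in values:
    total += x * x

# Distributed-style equivalent expressed as map + reduce
# (in Spark this would be rdd.map(lambda x: x * x).reduce(add)).
total_mr = reduce(add, map(lambda x: x * x, values))

assert total == total_mr
print(total)
```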

Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC

Procedia PDF Downloads 103
1393 Collection and Phenotypic Characterization of Some Nigerian Bambara Groundnut (Vigna subterranea (L.) Verdc.) Germplasm Using Seed Morphology

Authors: Abejide Dorcas Ropo, Falusi Olamide Ahmed, Daudu Oladipupo Abdulazeez Yusuf, Muhammad Liman Muhammad, Gado Aishatu Adamu

Abstract:

Bambara groundnut is an indigenous African legume with great potential to tackle the problem of food insecurity in Nigeria. A germplasm collection mission was carried out in collaboration with the Agricultural Development Project (ADP) extension officers of Nigeria between October and December 2014. Bambara groundnut seeds were collected from farmers in different states of Nigeria, such as Kaduna, Niger, Kogi, Benue, Plateau, Adamawa, Nasarawa, Jigawa, Enugu, and the Federal Capital Territory (FCT) Abuja. Some seeds were also collected from the National Centre for Genetic Resources and Biotechnology (NACGRAB). The seeds were phenotyped using the descriptor list for Vigna subterranea produced by the International Plant Genetic Resources Institute. A total of 45 original seed lots were collected, comprising mixed seeds with different seed coat colours (15) and pure-seeded accessions with the same seed coat and eye colour (30). After sorting, a total of 83 accessions were derived from the 45 original seed lots, and 24 distinct seed morphotypes with varying seed coat and eye colours were identified from the collections. They include cream (cream ash eye, cream plain eye, and cream black eye), cream purplish spots, cream brown spots/stripe, cream black stripe, cream dark brown patches, cream light grey spots, cream black patches, black, red, light red, dark red, brownish red, brown speckled with black, red speckled with black, brown, brown with brown pattern below hilum, brown with black pattern below hilum, cream black, grey brown, grey black, and variegated red. The highest number of accessions was collected from NACGRAB (11), followed by Niger State (10), and the lowest from Benue, Jigawa, and Adamawa States (2). Niger State also had the highest number of mixed seeds. The different seed phenotypes observed in the study are important for the field production of true-to-type lines and can be exploited for the genetic improvement of Bambara groundnut.

Keywords: Bambara groundnut, characterization, collection, germplasm, phenotypic

Procedia PDF Downloads 141
1392 REST API Based System-Level Test Automation for Mobile Applications

Authors: Jisoo Song

Abstract:

Today's mobile applications communicate with servers more and more in order to access external services or information. Also, server-side code changes are more frequent than client-side code changes in a mobile application, and these frequent changes lead to an increase in testing cost. To reduce costs, UI-based test automation, a common automation technique in system-level testing, can be one of the solutions. However, it can be unsuitable for mobile applications: when you automate tests based on UI elements, there are limitations such as the overhead of script maintenance or the difficulty of finding invisible defects that UI elements cannot represent. To overcome these limitations, we present a new automation technique based on REST APIs. You can automate system-level tests through test scripts that you write; these scripts call a series of REST APIs following a user's action sequence. This technique does not require testers to know the internal implementation details, only the inputs and expected outputs of the REST APIs. You can easily modify test cases by modifying REST API input values, and you can also find problems that might not be evident at the UI level by validating output values. For example, when an application receives price information from a payment server and the user cannot see it at the UI level, REST-API-based scripts can check whether the price information is correct or not. More than 10 mobile applications at our company are tested automatically based on REST API scripts whenever application source code, mostly server source code, is built; we find defects right away by setting a script as a build job in the CI server, which starts when application code builds are completed. This presentation will also include field cases from our company.
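
A sketch of what such a REST-API-level system test can look like when it mirrors a user action sequence; the endpoints, payloads, and fields below are hypothetical, not taken from the presentation:

```python
import requests

BASE = "https://api.example.com"   # hypothetical service under test

def test_purchase_flow():
    session = requests.Session()
    login = session.post(f"{BASE}/v1/login", json={"user": "tester", "password": "secret"})
    assert login.status_code == 200

    item = session.get(f"{BASE}/v1/items/42")
    assert item.status_code == 200
    # Validate server-side data that the UI may not display directly.
    assert item.json()["price"] == 9900

    order = session.post(f"{BASE}/v1/orders", json={"item_id": 42, "quantity": 1})
    assert order.status_code == 201
    assert order.json()["total"] == 9900

if __name__ == "__main__":
    test_purchase_flow()
```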

Keywords: case studies at SK Planet, introduction of rest API based test automation, limitations of UI based test automation

Procedia PDF Downloads 446
1391 Pallet Tracking and Cost Optimization of the Flow of Goods in Logistics Operations by Serial Shipping Container Code

Authors: Dominika Crnjac Milic, Martina Martinovic, Vladimir Simovic

Abstract:

Using the case study method, this paper shows the implementation of information technology (IT) and the Serial Shipping Container Code (SSCC) in a Croatian company that deals with logistics operations and provides logistics services in the cold chain segment. This company is aware of the sensitivity of the goods entrusted to it by the users of its services, as well as of the importance of speed and accuracy in providing logistics services. To that end, it has implemented and used the latest IT to ensure the highest standard of high-quality logistics services for its customers. Looking for efficiency and optimization of supply chain management, while maintaining a high level of quality of the products that are sold, today's users of outsourced logistics services are open to the implementation of new IT products that ultimately deliver savings. By analysing the positive results and the difficulties that arise when using this technology, we aim to provide an insight into the potential of this approach for the logistics service provider.
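
For readers unfamiliar with the identifier itself, an SSCC is an 18-digit GS1 code whose last digit is a modulo-10 check digit; a minimal sketch of that standard GS1 rule (the example number below is made up):

```python
# GS1 modulo-10 check digit for the 18-digit SSCC.
def sscc_check_digit(first17: str) -> int:
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(first17)))
    return (10 - total % 10) % 10

body = "00340123450000001"            # extension digit + GS1 company prefix + serial reference
print(body + str(sscc_check_digit(body)))   # full 18-digit SSCC
```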

Keywords: logistics operations, serial shipping container code, information technology, cost optimization

Procedia PDF Downloads 359
1390 Model Driven Architecture Methodologies: A Review

Authors: Arslan Murtaza

Abstract:

Model Driven Architecture (MDA) is a technique presented by the OMG (Object Management Group) for software development in which different models are proposed and then converted into code. The main idea is to describe the task using a PIM (Platform Independent Model), transform it into a PSM (Platform Specific Model), and then convert that into code. This review paper describes some challenges and issues faced in MDA, the types and transformations of models (e.g., CIM, PIM, and PSM), and the evaluation of MDA-based methodologies.
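
As a toy illustration of the model-to-code idea only (not any particular MDA toolchain), a platform-independent description rendered into executable code:

```python
# A platform-independent model (PIM) described as data, "transformed" into code;
# here the target platform is simply idiomatic Python dataclasses.
pim = {
    "entity": "Customer",
    "attributes": [("name", "str"), ("email", "str"), ("balance", "float")],
}

def pim_to_python(model):
    lines = ["from dataclasses import dataclass", "", "@dataclass", f"class {model['entity']}:"]
    lines += [f"    {name}: {typ}" for name, typ in model["attributes"]]
    return "\n".join(lines)

generated = pim_to_python(pim)
print(generated)
exec(generated)                     # the generated code is itself valid Python
print(Customer(name="Ada", email="ada@example.com", balance=0.0))
```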

Keywords: OMG, model driven architecture (MDA), computation independent model (CIM), platform independent model (PIM), platform specific model (PSM), MDA-based methodologies

Procedia PDF Downloads 457