Search results for: drawing digital tool
1876 Surface Thermodynamics Approach to Mycobacterium tuberculosis (M-TB) – Human Sputum Interactions
Authors: J. L. Chukwuneke, C. H. Achebe, S. N. Omenyi
Abstract:
This research work presents a surface thermodynamics approach to M-TB/HIV-human sputum interactions. It involved the use of the Hamaker coefficient concept as a surface energetics tool for determining the interaction processes, with the surface interfacial energies explained using the van der Waals concept of particle interactions. The Lifshitz derivation for van der Waals forces was applied as an alternative to the contact angle approach, which has been widely used in other biological systems. The methodology involved taking sputum samples from twenty infected persons and from twenty uninfected persons for absorbance measurement using a digital ultraviolet-visible spectrophotometer. The variables required for the computations with the Lifshitz formula were derived from the absorbance data. Matlab software tools were used in the mathematical analysis of the data produced from the experiments (absorbance values). The Hamaker constants and the combined Hamaker coefficients were obtained using the values of the dielectric constant together with the Lifshitz equation. The absolute combined Hamaker coefficients A132abs and A131abs on both infected and uninfected sputum samples gave the values A132abs = 0.21631 × 10⁻²¹ J for M-TB infected sputum and Ā132abs = 0.18825 × 10⁻²¹ J for M-TB/HIV infected sputum. The significance of this result is the positive value of the absolute combined Hamaker coefficient, which suggests the existence of net positive van der Waals forces demonstrating an attraction between the bacteria and the macrophage. This implies that infection can occur. It was also shown that in the presence of HIV, the interaction energy is reduced by 13%, confirming the adverse effects observed in HIV patients suffering from tuberculosis.
Keywords: Absorbance, dielectric constant, Hamaker coefficient, Lifshitz formula, macrophage, Mycobacterium tuberculosis, van der Waals forces.
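For orientation, the combined Hamaker coefficient referred to above is conventionally built from the individual Hamaker constants through combining relations of the following form (a minimal reference sketch using the standard approximation, with subscripts 1, 2 and 3 taken here to denote bacterium, macrophage and sputum medium as an assumption; the authors' exact Lifshitz-based expressions may differ):

```latex
% Standard Hamaker combining relations for bodies 1 and 2 interacting across medium 3
% (textbook approximation, not necessarily the authors' working equations):
\[
A_{132} \approx \bigl(\sqrt{A_{11}}-\sqrt{A_{33}}\bigr)\bigl(\sqrt{A_{22}}-\sqrt{A_{33}}\bigr),
\qquad
A_{131} \approx \bigl(\sqrt{A_{11}}-\sqrt{A_{33}}\bigr)^{2}.
\]
% A positive A_{132} corresponds to a net attractive van der Waals interaction
% across the medium, which is the criterion for possible infection invoked above.
```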
1875 Mechanical Properties of D2 Tool Steel Cryogenically Treated Using Controllable Cooling
Authors: A. Rabin, G. Mazor, I. Ladizhenski, R. Z. Shneck
Abstract:
The hardness and hardenability of AISI D2 cold work tool steel subjected to conventional quenching (CQ), deep cryogenic quenching (DCQ) and rapid deep cryogenic quenching, the latter achieved by means of a temporary porous coating based on magnesium sulfate, were investigated. Each cooling process was examined from the perspective of overall process efficiency and heat flux in the austenite-martensite transformation range, followed by characterization of the temporary porous magnesium sulfate layer using confocal laser scanning microscopy (CLSM) and by measurement of surface and core hardness and hardenability using the Vickers hardness technique. The results show that the cooling rate (CR) in the austenite-martensite transformation range has a strong influence on the hardness of the studied steel.
Keywords: AISI D2, controllable cooling, magnesium sulfate coating, rapid cryogenic heat treatment, temporary porous layer.
1874 Second Language Development with an Intercultural Approach: A Pilot Program Applied to Higher Education Students from an Escuela Normal in Atequiza, Mexico
Authors: Frida C. Jaime Franco, C. Paulina Navarro Núñez, R. Jacob Sánchez Nájera
Abstract:
The importance of developing multi-language abilities in our global society is noteworthy. However, the necessity, interest and awareness of the significance of developing another language, apart from the mother tongue, are not the same in all contexts, particularly in rural higher education institutions immersed in small communities, as they are in multicultural communities. Providing opportunities for digital interaction between learners in Mexico and partners abroad offers scaffolding towards not only language-skills development but also intercultural communicative competence (ICC). This study leads us to consider the best approach for applying a program of ICC integrated into the practice of EFL. Analyzing the roots of the language makes it possible to pursue the main objective of learning another language, namely to communicate with a functional purpose, and to attach social practices to the learning process, giving functionality and significance to the target language. Hence, the collateral impact of collaborative learning aims to contribute to a better global understanding as well as to awareness of one's own and other cultures through intercultural communication. When students communicate in the target language through online collaboration on long-distance communication platforms, language is used as a tool of interaction to broaden students’ perspectives, reaching a substantial improvement with the help of their differences. This process should consider the application of the target language in the inquiry of sociocultural information, expecting the learners to integrate communicative skills to handle cultural differentiation while applying their knowledge of the target language in a real scenario of communication, despite it being mediated by virtual resources.
Keywords: Collaborative learning, English as a Foreign language, intercultural communication, intercultural communicative competences, virtual partnership.
1873 Generating Qualitative Causal Graph using Modeling Constructs of Qualitative Process Theory for Explaining Organic Chemistry Reactions
Authors: Alicia Y. C. Tang, Rukaini Abdullah, Sharifuddin M. Zain, Noorsaadah A. Rahman
Abstract:
This paper discusses the causal explanation capability of QRIOM, a tool aimed at supporting the learning of organic chemistry reactions. The development of the tool is based on the hybrid use of the Qualitative Reasoning (QR) technique and the Qualitative Process Theory (QPT) ontology. Our simulation combines symbolic, qualitative description of relations with quantity analysis to generate causal graphs. The pedagogy embedded in the simulator is to both simulate and explain organic reactions. Qualitative reasoning through a causal chain will be presented to explain the overall changes made to the substrate, from the initial substrate until the production of the final outputs. Several uses of the QPT modeling constructs in supporting behavioral and causal explanation during run-time will also be demonstrated. Explaining organic reactions through a causal graph trace can help improve the reasoning ability of learners in that their conceptual understanding of the subject is nurtured.
Keywords: Qualitative reasoning, causal graph, organic reactions, explanation, QPT, modeling constructs.
1872 An Exhaustive Review of Die Sinking Electrical Discharge Machining Process and Scope for Future Research
Authors: M. M. Pawade, S. S. Banwait
Abstract:
Electrical Discharge Machine (EDM) is especially used for the manufacturing of 3-D complex geometry and hard material parts that are extremely difficult to machine by conventional machining processes. In this paper, the authors review the research work carried out in the development of die-sinking EDM within the past decades for the improvement of machining characteristics such as Material Removal Rate, Surface Roughness and Tool Wear Ratio. In this review, the various techniques reported by EDM researchers for improving the machining characteristics have been categorized as process parameter optimization, multi-spark technique, powder-mixed EDM, servo control system and pulse discriminating. At the end, a flexible machine controller is suggested for die-sinking EDM to enhance the machining characteristics and to achieve high-level automation. Thus, die-sinking EDM can be integrated into a Computer Integrated Manufacturing environment, as required by agile manufacturing systems.
Keywords: Electrical Discharge Machine, Flexible Machine Controller, Material Removal Rate, Tool Wear Ratio.
1871 Computational Method for Annotation of Protein Sequence According to Gene Ontology Terms
Authors: Razib M. Othman, Safaai Deris, Rosli M. Illias
Abstract:
Annotation of a protein sequence is pivotal for the understanding of its function. The accuracy of manual annotation provided by curators is still questionable because of its weaker evidence strength, and the task is hard and time-consuming. A number of computational methods and tools have been developed to tackle this challenging task. However, they require high-cost hardware, are difficult for bioscientists to set up, or depend on time-intensive and blind sequence similarity searches such as the Basic Local Alignment Search Tool. This paper introduces a new method of assigning highly correlated Gene Ontology terms of annotated protein sequences to partially annotated or newly discovered protein sequences. This method is fully based on Gene Ontology data and annotations. Two problems had to be solved to realize this method. The first problem relates to splitting the single monolithic Gene Ontology RDF/XML file into a set of smaller files that are easier to access and process, so that these files can be enriched with protein sequences and Inferred from Electronic Annotation evidence associations. The second problem involves searching for a set of Gene Ontology terms semantically similar to a given query. The details of the macro and micro problems involved and their solutions, together with the objective of this study, are described. This paper also describes protein sequence annotation and the Gene Ontology. The methodology of this study and the Gene Ontology-based protein sequence annotation tool, named extended UTMGO, are presented. Furthermore, its basic version, a Gene Ontology browser based on semantic similarity search, is also introduced.
Keywords: automatic clustering, bioinformatics tool, gene ontology, protein sequence annotation, semantic similarity search
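As a pointer to the second problem described above (finding Gene Ontology terms semantically similar to a query), one common family of measures scores two terms by the overlap of their ancestor sets in the ontology graph. The sketch below illustrates the idea with a Jaccard overlap on a tiny hypothetical is_a hierarchy; the extended UTMGO tool may well use a different similarity measure.

```python
# Sketch: semantic similarity between GO terms via ancestor-set overlap (Jaccard).
# The toy "is_a" table below is hypothetical; real data would come from the
# GO release (RDF/XML) mentioned in the abstract.
toy_is_a = {                      # child -> parents
    "GO:0004175": ["GO:0008233"],
    "GO:0008233": ["GO:0016787"],
    "GO:0016811": ["GO:0016787"],
    "GO:0016787": ["GO:0003824"],
    "GO:0003824": [],
}

def ancestors(term, is_a):
    """Return the set containing the term and all of its ancestors."""
    seen, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(is_a.get(t, []))
    return seen

def jaccard_similarity(t1, t2, is_a):
    a, b = ancestors(t1, is_a), ancestors(t2, is_a)
    return len(a & b) / len(a | b)

if __name__ == "__main__":
    # Two hydrolase-related terms share most of their ancestry -> high score.
    print(jaccard_similarity("GO:0004175", "GO:0016811", toy_is_a))
```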
1870 Computational Study on Cardiac-Coronary Interaction in Terms of Coronary Flow-Pressure Waveforms in Presence of Drugs: Comparison Between Simulated and In Vivo Data
Authors: C. De Lazzari, E. Del Prete, I. Genuini, F. Fedele
Abstract:
A cardiovascular human simulator can be a useful tool for understanding complex physiopathological processes in the cardiocirculatory system. It can also be a useful tool for investigating the effects of different drugs on hemodynamic parameters. The aim of this work is to test the potential of our cardiovascular numerical simulator CARDIOSIM© in reproducing coronary flow/pressure waveforms in the presence of two different drugs: Amlodipine (AMLO) and Adenosine (ADO). In particular, a time-varying intramyocardial compression, assumed to be proportional to the left ventricular pressure, was related to the venous coronary compliances in order to study its effects on the coronary blood flow and the flow/pressure loop. Considering that coronary circulation dynamics is strongly interrelated with the mechanics of left ventricular contraction, relaxation and filling, the numerical model made it possible to analyze the effects induced by the left ventricular pressure on the coronary flow.
Keywords: Cardiovascular system, Coronary blood flow, Hemodynamic, Numerical simulation.
1869 Carbon Isotope Discrimination, A Tool for Screening of Salinity Tolerance of Genotypes
Authors: Alireza Dadkhah, Mahmoud Ghorbanzadeh- Neghab
Abstract:
This study was carried out to investigate the effects of salinity on the carbon isotope discrimination (Δ) of shoots and roots of four sugar beet cultivars (cv): Madison (of British origin) and three Iranian cultivars (7233-P12, 7233-P21 and 7233-P29). Plants were grown in a sand culture medium under greenhouse conditions. Plants were irrigated with saline water (tap water as control, 50 mM, 150 mM, 250 mM and 350 mM of NaCl + CaCl2 in a 5 to 1 molar ratio) from the four-leaf stage for 16 weeks. Carbon isotope discrimination significantly decreased with increasing salinity. Significant differences in Δ between shoot and root were observed in all cvs and at all levels of salinity. The Madison cv showed lower Δ in shoot and root than the other three cvs at all levels of salinity except the control, but cv 7233-P29 had significantly higher Δ values at saline conditions of 150 mM and above. Therefore, Δ might be applicable as a useful tool for studying the salinity tolerance of sugar beet genotypes.
Keywords: Carbon isotope discrimination, Photosynthesis, Salt stress, Sugar beet.
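For reference, carbon isotope discrimination is usually computed from the isotopic compositions of source air and plant dry matter; the Farquhar-type definition below is a sketch of the common working equation, not necessarily the exact form used in this study.

```latex
% Carbon isotope discrimination from the delta values of air (a) and plant (p),
% with delta expressed as a fraction (per-mil values divided by 1000):
\[
\Delta \;=\; \frac{\delta_{a}-\delta_{p}}{1+\delta_{p}},
\qquad
\delta \;=\; \frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}}-1,
\quad R = {}^{13}\mathrm{C}/{}^{12}\mathrm{C}.
\]
% Delta typically decreases as stomata close under salt stress, which is consistent
% with the decline in discrimination reported above.
```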
1868 Machine Learning in Production Systems Design Using Genetic Algorithms
Authors: Abu Qudeiri Jaber, Yamamoto Hidehiko, Rizauddin Ramli
Abstract:
To create a solution for a specific problem in machine learning, the solution is constructed from the data or by using a search method. Genetic algorithms are a model of machine learning that can be used to find a near-optimal solution. While the great advantage of genetic algorithms is the fact that they find a solution through evolution, this is also their biggest disadvantage. Evolution is inductive: in nature, life does not evolve towards a good solution but evolves away from bad circumstances. This can cause a species to evolve into an evolutionary dead end. In order to reduce the effect of this disadvantage, we propose a new learning tool (criterion) that can be included in the genetic algorithm's generations to compare the previous population with the current population and then decide whether it is effective to continue with the previous or the current population. The proposed learning tool is called Keeping Efficient Population (KEP). We applied a GA based on KEP to the production line layout problem; as a result, KEP keeps the evaluation direction increasing and stops any deviation in the evaluation.
Keywords: Genetic algorithms, Layout problem, Machine learning, Production system.
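The KEP idea can be pictured as a guard between GA generations: evaluate the previous and the current population and keep whichever scores better, so the search never drifts backwards. The sketch below is only an illustration on a toy bit-string fitness; the function names and the encoding are assumptions, not the authors' implementation.

```python
import random

def fitness(individual):
    # Toy fitness: number of 1s (a stand-in for a production-line layout score).
    return sum(individual)

def evolve(population, mutation_rate=0.05):
    """One GA generation: tournament selection, one-point crossover, mutation."""
    new_pop = []
    while len(new_pop) < len(population):
        p1, p2 = (max(random.sample(population, 3), key=fitness) for _ in range(2))
        cut = random.randrange(1, len(p1))
        child = p1[:cut] + p2[cut:]
        child = [1 - g if random.random() < mutation_rate else g for g in child]
        new_pop.append(child)
    return new_pop

def keep_efficient_population(previous, current):
    """KEP-style guard: continue with whichever population evaluates better,
    so the evaluation direction never deteriorates."""
    mean_fitness = lambda pop: sum(map(fitness, pop)) / len(pop)
    return current if mean_fitness(current) >= mean_fitness(previous) else previous

if __name__ == "__main__":
    pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
    for _ in range(50):
        pop = keep_efficient_population(pop, evolve(pop))
    print("best layout score:", max(map(fitness, pop)))
```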
1867 Effectiveness Evaluation of a Machine Design Process Based on the Computation of the Specific Output
Authors: Barenten Suciu
Abstract:
In this paper, the effectiveness of a machine design process is evaluated on the basis of the specific output calculus. Concretely, a screw-worm gear mechanical transmission is designed by using the classical and the 3D-CAD methods. Strength analysis and drawing of the designed parts are substantially aided by employing the SolidWorks software. The quality of the design process is assessed by manufacturing (printing) the parts and by computing the efficiency, the specific load, as well as the specific output (work) of the mechanical transmission. The influence of the stroke, travelling velocity and load on the mechanical output is emphasized. Optimal design of the mechanical transmission becomes possible through appropriate use of the acquired results.
Keywords: Mechanical transmission, design, screw, worm-gear, efficiency, specific output, 3D-printing.
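For a screw-worm drive of the kind designed above, the efficiency is commonly estimated from the lead angle and the friction angle, and a specific output can then be computed as work per unit of some reference quantity. The sketch below uses the standard textbook relation with purely illustrative numbers; the definition of specific output as work per unit mass is an assumption, since the paper's exact definition is not given in the abstract.

```python
import math

def worm_drive_efficiency(lead_angle_deg, friction_coeff):
    """Classical estimate for a worm driving its gear:
    eta = tan(lambda) / tan(lambda + rho), with rho = atan(mu)."""
    lam = math.radians(lead_angle_deg)
    rho = math.atan(friction_coeff)
    return math.tan(lam) / math.tan(lam + rho)

def specific_output(load_n, stroke_m, mass_kg):
    """Specific output taken here as useful work per unit mass (J/kg);
    illustrative definition only."""
    return load_n * stroke_m / mass_kg

# Illustrative values (assumed, not taken from the paper):
eta = worm_drive_efficiency(lead_angle_deg=5.0, friction_coeff=0.08)
w_spec = specific_output(load_n=200.0, stroke_m=0.05, mass_kg=0.4)
print(f"efficiency ~ {eta:.2f}, specific output ~ {w_spec:.1f} J/kg")
```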
1866 Investigating Polynomial Interpolation Functions for Zooming Low Resolution Digital Medical Images
Authors: Maninder Pal
Abstract:
Medical digital images usually have low resolution because of the nature of their acquisition. Therefore, this paper focuses on zooming these images to obtain a better level of information, as required for the purpose of medical diagnosis. For this purpose, a strategy for selecting pixels in the zooming operation is proposed. It is based on the principle of an analog clock and utilizes a combination of point and neighborhood image processing. In this approach, the hour hand of the clock covers the portion of the image to be processed. For alignment, the center of the clock points at the middle pixel of the selected portion of the image. The minute hand is longer and is used to gain information about the pixels of the surrounding area, called the neighborhood pixels region. This information is used to zoom the selected portion of the image. The proposed algorithm is implemented and its performance is evaluated for many medical images obtained from various sources such as X-ray, Computerized Tomography (CT) scan and Magnetic Resonance Imaging (MRI). However, for illustration and simplicity, the results obtained from a CT-scanned image of the head are presented. The performance of the algorithm is evaluated in comparison to various traditional algorithms in terms of peak signal-to-noise ratio (PSNR), maximum error, SSIM index, mutual information and processing time. From the results, the proposed algorithm is found to give better performance than the traditional algorithms.
Keywords: Zooming, interpolation, medical images, resolution.
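PSNR, the first of the metrics listed above, is computed from the mean squared error between the zoomed result and a reference image; a minimal NumPy sketch follows (the 8-bit peak value of 255 is an assumption about the image format, and the random images are placeholders for real CT/MRI data).

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two images of the same shape."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")            # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

# Toy usage with a synthetic 8-bit image and a noisy copy of it:
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noisy = np.clip(ref + rng.normal(0.0, 5.0, ref.shape), 0, 255).astype(np.uint8)
print(f"PSNR = {psnr(ref, noisy):.1f} dB")
```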
1865 Management by Sufficient Economy Philosophy for Hospitality Business in Samut Songkram
Authors: Krisada Sungkhamanee
Abstract:
The objectives of this research are to understand the management practices of Samut Songkram lodging entrepreneurs within the sufficient economy framework, to identify the threats that affect this business, and to draw a model that fits this province so that entrepreneurs can sustain their business in the Samut Songkram style. What will happen if they do not use this philosophy? Will they have a cash shortfall? The data and information were collected through informal discussions with 8 managers and 400 questionnaires. We use a mix of qualitative and quantitative research methods for our study. Bent Flyvbjerg’s phronesis is utilized for this analysis. Our research aims to show that the sufficient economy philosophy can help small and medium-sized business firms solve their problems. We expect the results of our research to provide a financial model that solves many of the entrepreneurs’ problems and that can be applied in other areas of our country.
Keywords: Samut Songkram, Hospitality Business, Sufficient Economy Philosophy.
1864 Benchmarking of Pentesting Tools
Authors: Esteban Alejandro Armas Vega, Ana Lucila Sandoval Orozco, Luis Javier García Villalba
Abstract:
The benchmarking of tools for dynamic analysis of vulnerabilities in web applications is something that is done periodically, because these tools update their knowledge bases and search algorithms from time to time in order to improve their accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or on non-academic websites, always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific approach use the same methodology as the non-academic authors. This paper is motivated by the interest in finding answers to questions that many users of this type of tool have been asking over the years, such as whether a tool truly tests and evaluates every vulnerability it claims to, or whether it really delivers a complete report of all the vulnerabilities tested and exploited. These kinds of questions have also motivated previous work, but without real answers. The aim of this paper is to show results that truly answer, at least for the tested tools, all those unanswered questions. All the results have been obtained by changing the common benchmarking model used in those previous works.
Keywords: Cybersecurity, IDS, security, web scanners, web vulnerabilities.
1863 Effects of Manufacture and Assembly Errors on the Output Error of Globoidal Cam Mechanisms
Authors: Shuting Ji, Yueming Zhang, Jing Zhao
Abstract:
The output error of the globoidal cam mechanism can be considered a relevant indicator of mechanism performance, because it determines the kinematic and dynamic behavior of the mechanical transmission. Based on differential geometry and rigid body transformations, the mathematical model of the surface geometry of the globoidal cam is established. We then present the analytical expression of the output error (including the transmission error and the displacement error along the output axis) by considering different manufacture and assembly errors. The effects of the center distance error, the perpendicularity error between the input and output axes, and the rotational angle error of the globoidal cam on the output error are systematically analyzed. A globoidal cam mechanism widely used in the automatic tool changers of CNC machines is applied for illustration. Our results show that the perpendicularity error and the rotational angle error have little effect on the transmission error but have a great effect on the displacement error along the output axis. This study plays an important role in the design, manufacture and assembly of the globoidal cam mechanism.
Keywords: Globoidal cam mechanism, manufacture error, transmission error, automatic tool changer.
1862 Swarm Intelligence based Optimal Linear Phase FIR High Pass Filter Design using Particle Swarm Optimization with Constriction Factor and Inertia Weight Approach
Authors: Sangeeta Mandal, Rajib Kar, Durbadal Mandal, Sakti Prasad Ghoshal
Abstract:
In this paper, an optimal design of a linear phase digital high pass finite impulse response (FIR) filter using Particle Swarm Optimization with Constriction Factor and Inertia Weight Approach (PSO-CFIWA) is presented. In the design process, the filter length, pass band and stop band frequencies, and feasible pass band and stop band ripple sizes are specified. FIR filter design is a multi-modal optimization problem, and conventional gradient-based optimization techniques are not efficient for digital filter design. Given the filter specifications to be realized, the PSO-CFIWA algorithm generates a set of optimal filter coefficients and tries to meet the ideal frequency response characteristic. In this paper, for the given problem, designs of optimal FIR high pass filters of different orders have been performed. The simulation results have been compared to those obtained by well-accepted algorithms such as the Parks-McClellan algorithm (PM) and the genetic algorithm (GA). The results show that the proposed optimal filter design approach using PSO-CFIWA outperforms PM and GA, not only in the accuracy of the designed filter but also in convergence speed and solution quality.
Keywords: FIR filter; PSO-CFIWA; PSO; Parks and McClellan algorithm; evolutionary optimization technique; magnitude response; convergence; high pass filter.
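The PSO-CFIWA update combines Clerc's constriction factor with a (typically linearly decreasing) inertia weight in the velocity equation. The sketch below shows that update applied to a generic coefficient-error function; the parameter values and the placeholder high-pass error measure are assumptions, not the paper's design specification.

```python
import numpy as np

def pso_cfiwa(error_fn, dim, n_particles=30, iters=200,
              c1=2.05, c2=2.05, w_max=1.0, w_min=0.4):
    """Minimal PSO with constriction factor and inertia weight (CFIWA).
    error_fn maps a coefficient vector to a scalar error to be minimized."""
    phi = c1 + c2                                   # > 4 so chi is real
    chi = 2.0 / abs(2.0 - phi - np.sqrt(phi ** 2 - 4.0 * phi))
    rng = np.random.default_rng(1)
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))  # candidate coefficient sets
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_err = np.array([error_fn(p) for p in x])
    gbest = pbest[pbest_err.argmin()].copy()
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / iters     # linearly decreasing inertia
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = chi * (w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x))
        x = x + v
        err = np.array([error_fn(p) for p in x])
        better = err < pbest_err
        pbest[better], pbest_err[better] = x[better], err[better]
        gbest = pbest[pbest_err.argmin()].copy()
    return gbest

# Placeholder error: deviation of the normalized magnitude response from an
# ideal high-pass template (illustrative only, not the paper's objective).
ideal = np.concatenate([np.zeros(128), np.ones(128)])
def hp_error(h):
    H = np.abs(np.fft.rfft(h, 511))                 # 256 frequency bins
    return float(np.mean((H / (H.max() + 1e-12) - ideal) ** 2))

coeffs = pso_cfiwa(hp_error, dim=21)
print("best error:", hp_error(coeffs))
```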
1861 Towards a Compliance Reporting using a Balanced Scorecard
Authors: Michael Amberg, Dipl. Kfm. Johannes C. Panitz
Abstract:
Compliance requires effective communication within an enterprise as well as towards a company's external environment. This requirement commences with the implementation of compliance within large-scale compliance projects and still persists in the compliance reporting within standard operations. On the one hand, the understanding of compliance necessities within the organization is promoted. On the other hand, a reduction of asymmetric information with compliance stakeholders is achieved. To reach this goal, a central reporting must provide a consolidated view of the statuses of different compliance efforts. A concept which could be adapted for this purpose is the balanced scorecard by Kaplan and Norton. This concept has not been analyzed in detail concerning its adequacy for holistic compliance reporting, starting in compliance projects and continuing through later usage in regular compliance operations. At first, this paper evaluates whether a holistic compliance reporting can be designed by using the balanced scorecard concept. The current status of compliance reporting clearly shows that scorecards are generally accepted as a compliance reporting tool and are already used for corporate governance reporting. Additional specialized compliance IT solutions exist in the market. After the scorecard's adequacy is thoroughly examined and proven, an example strategy map as the basis for deriving a compliance balanced scorecard is defined. This definition answers the question of how to proceed in designing a compliance reporting tool.
Keywords: Balanced Scorecard, Compliance, Compliance Reporting, Compliance Scorecard.
1860 Meeting Criminogenic Needs to Reduce Recidivism: The Diversion of Vulnerable Offenders from the Criminal Justice System into Care
Authors: Paulo Rocha
Abstract:
Once in touch with the Criminal Justice System, offenders with mental disorders tend to return to custody more often than non-disordered individuals, which suggests they have not been receiving appropriate treatment in prison. In this scenario, diverting individuals into care as early as possible in their trajectory seems to be the appropriate approach to rehabilitate mentally unwell offenders and alleviate overcrowded prisons. This paper builds on ethnographic research investigating the challenges encountered by practitioners working to divert offenders into care while attempting to establish cross-boundary interactions with professionals in the Criminal Justice System and Mental Health Services in the UK. Drawing upon the findings of the study, this paper suggests the development of adequate tools to enable liaison between agencies, which ultimately results in successful interventions.
Keywords: Criminogenic needs, interagency collaboration, liaison and diversion, recidivism.
1859 Sperm Identification Using Elliptic Model and Tail Detection
Authors: Vahid Reza Nafisi, Mohammad Hasan Moradi, Mohammad Hosain Nasr-Esfahani
Abstract:
The conventional assessment of human semen is highly subjective, with considerable intra- and inter-laboratory variability. Computer-Assisted Sperm Analysis (CASA) systems provide a rapid and automated assessment of sperm characteristics, together with improved standardization and quality control. However, the outcome of CASA systems is sensitive to the method of experimentation. While conventional CASA systems use digital microscopes with phase-contrast accessories, producing higher-contrast images, we have used raw semen samples (no staining materials) and a regular light microscope, with a digital camera directly attached to its eyepiece, to ensure cost benefits and simple assembly of the system. However, since accurately finding the sperms in the semen image is the first step in the examination and analysis of the semen, any error in this step can affect the outcome of the analysis. This article introduces and explains an algorithm for finding sperms in low-contrast images: first, an image enhancement algorithm is applied to remove extra particles from the image. Then, the foreground particles (including sperms and round cells) are segmented from the background. Finally, based on certain features and criteria, sperms are separated from the other cells.
Keywords: Computer-Assisted Sperm Analysis (CASA), Sperm identification, Tail detection, Elliptic shape model.
1858 Information Security Risk Management in IT-Based Process Virtualization: A Methodological Design Based on Action Research
Authors: Jefferson Camacho Mejía, Jenny Paola Forero Pachón, Luis Carlos Gómez Flórez
Abstract:
Action research is a qualitative research methodology which leads the researcher to delve into the problems of a community in order to understand its needs in depth and, finally, to propose actions that lead to a change of social paradigm. Although this methodology had its beginnings in the human sciences, it has attracted increasing interest and acceptance in the field of information systems research since the 1990s. The countless possibilities offered nowadays by the use of Information Technologies (IT) in the development of different socio-economic activities have meant a change of social paradigm and the emergence of the so-called information and knowledge society. Accordingly, governments, large corporations, small entrepreneurs and, in general, organizations of all kinds are using IT to virtualize their processes, taking them from the physical environment to the digital environment. However, there is a potential risk for organizations related to exposing valuable information without an appropriate framework for protecting it. This paper shows progress in the development of a methodological design to manage the information security risks associated with IT-based process virtualization, by applying the principles of the action research methodology; it is the result of a systematic review of the scientific literature. This design consists of seven fundamental stages, distributed across the three stages described in the action research methodology: 1) observe, 2) analyze and 3) take action. Finally, this paper aims to offer an alternative tool to traditional information security management methodologies, with a view to being applied specifically in the planning stage of IT-based process virtualization in order to foresee risks and to establish security controls before formulating IT solutions in any type of organization.
Keywords: Action research, information security, information technology, methodological design, process virtualization, risk management.
1857 Estimation of Synchronous Machine Synchronizing and Damping Torque Coefficients
Authors: Khaled M. EL-Naggar
Abstract:
The synchronizing and damping torque coefficients of a synchronous machine can give a quite clear picture of machine behavior during transients. These coefficients are used as a power system transient stability measure. In this paper, a crow search optimization algorithm is presented and implemented to study power system stability during transients. The algorithm makes use of the machine responses to perform the stability study in the time domain. The problem is formulated as a dynamic estimation problem. An objective function that minimizes the squared error in the estimated coefficients is designed. The method is tested on a practical system with different study cases. Results are reported and a thorough discussion is presented. The study illustrates that the proposed method can estimate the stability coefficients for critical stable cases where other methods may fail. The tests proved that the proposed tool is an accurate and reliable tool for estimating the machine coefficients for the assessment of power system stability.
Keywords: Optimization, estimation, synchronous machine, crow search.
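The estimation problem set up above amounts to fitting the incremental electrical torque to the classical decomposition ΔTe ≈ Ks·Δδ + Kd·Δω from the machine's time-domain responses. The sketch below uses an ordinary least-squares fit on synthetic signals purely to illustrate the form of the objective; the paper itself minimizes the squared error with the crow search algorithm, and all numbers here are invented.

```python
import numpy as np

# Synthetic machine response (illustrative only): rotor angle and speed
# deviations, plus an incremental torque built from "true" coefficients.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 5.0, 1000)
delta = 0.1 * np.exp(-0.4 * t) * np.sin(2 * np.pi * 1.2 * t)  # angle deviation (rad)
omega = np.gradient(delta, t)                                  # speed deviation
Ks_true, Kd_true = 1.2, 0.15
torque = Ks_true * delta + Kd_true * omega + 0.002 * rng.standard_normal(t.size)

# Least-squares fit of dTe = Ks*d(delta) + Kd*d(omega); this is the squared-error
# objective that the paper minimizes with crow search (OLS stands in for it here).
A = np.column_stack([delta, omega])
(Ks_est, Kd_est), *_ = np.linalg.lstsq(A, torque, rcond=None)
print(f"Ks ~ {Ks_est:.3f}, Kd ~ {Kd_est:.3f}")
```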
1856 Usability and Affordances: Examinations of Object-Naming and Object-Task Performance in Haptic Interfaces
Authors: Mia Sorensen
Abstract:
The introduction of haptic elements into graphic user interfaces is becoming more widespread. Since haptics are being introduced rapidly into computational tools, investigating how these models affect Human-Computer Interaction would help define how to integrate and model new modes of interaction. The interest of this paper is to discuss and investigate the issues surrounding haptic and graphic user interface (GUI) designs as separate systems, as well as to understand how these work in tandem. The development of these systems is explored from a psychological perspective, based on how usability is addressed through learning and affordances, as defined by J. J. Gibson. Haptic design can be a powerful tool, aiding intuitive learning. The problem discussed within the text is how haptic interfaces can be integrated within a GUI without a sense of frivolity. Juxtaposing haptics and graphic user interfaces raises issues of motivation: GUIs tend to rely on a performatory process, while haptic interfaces use affordances to learn tool use. In a deeper view, it is noted that two modes of perception, foveal and ambient, dictate perception. These two modes were once thought to work in tandem; however, it has been discovered that these processes work independently from each other. Foveal modes interpret orientation in space, which provides for posture, locomotion and motor skills with variations of the sensory information, which instructs perceptions of object-task performance. It is contended here that object-task performance is a key element in the use of haptic interfaces because exploratory learning uses affordances in order to use an object, without mediating the experience cognitively. It is a direct experience that, through iteration, can lead to skill sets. It is also indicated that object-task performance will not work as efficiently without the use of exploratory or kinesthetic learning practices. Therefore, object-task performance is not as congruently explored in GUIs as it is practiced in haptic interfaces.
Keywords: Affordances, Graphic User Interface, Haptic Interfaces, Tool-Use, Object-Naming, Object-Task Performance
1855 From Experiments to Numerical Modeling: A Tool for Teaching Heat Transfer in Mechanical Engineering
Authors: D. Zabala, Y. Cárdenas, G. Núñez
Abstract:
In this work, the numerical simulation of transient heat transfer in a cylindrical probe is carried out. An experiment was conducted by introducing a steel cylinder into a heating chamber and recording its surface temperature over time for one hour. In parallel, a mathematical model was solved for one-dimensional transient heat transfer in cylindrical coordinates, considering the boundary conditions of the test. The model was solved using the finite difference method, because the thermal conductivity of the cylindrical steel bar and the convection heat transfer coefficient used in the model are considered temperature-dependent functions, and both conditions prevent the use of an analytical solution. The comparison between theoretical and experimental results showed that the average deviation is below 2%. It was concluded that numerical methods are useful for solving complex engineering problems. For constant k and h, the experimental methodology used here can serve as a tool for teaching heat transfer in mechanical engineering, using simplified mathematical models with analytical solutions.
Keywords: Heat transfer experiment, thermal conductivity, finite difference, engineering education.
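A minimal sketch of an explicit finite-difference scheme for one-dimensional transient conduction in a cylinder with a convective surface, of the kind described above, is given below. The material properties, the convection coefficient and the geometry are illustrative constants, whereas the paper treats k and h as temperature-dependent.

```python
import numpy as np

# Explicit finite differences for 1-D transient conduction in a long cylinder
# heated by convection at its surface (constant properties; illustrative values).
k, rho, c = 45.0, 7800.0, 490.0            # W/m.K, kg/m3, J/kg.K (typical steel)
h, T_inf, T0 = 60.0, 800.0, 25.0           # W/m2.K, chamber and initial temps (C)
R, N = 0.02, 25                            # radius (m), number of radial nodes
alpha = k / (rho * c)
dr = R / (N - 1)
dt = 0.2 * dr**2 / alpha                   # well inside the explicit stability limit
r = np.linspace(0.0, R, N)
T = np.full(N, T0)

t, t_end = 0.0, 3600.0                     # simulate one hour, as in the experiment
while t < t_end:
    Tn = T.copy()
    # centre node (symmetry): dT/dt = 4*alpha*(T[1] - T[0])/dr^2
    Tn[0] = T[0] + 4.0 * alpha * dt * (T[1] - T[0]) / dr**2
    # interior nodes: dT/dt = alpha*(d2T/dr2 + (1/r)*dT/dr)
    Tn[1:-1] = T[1:-1] + alpha * dt * (
        (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dr**2
        + (T[2:] - T[:-2]) / (2.0 * dr * r[1:-1]))
    # surface node: ghost point from the convective condition -k dT/dr = h (T - T_inf)
    ghost = T[-2] - 2.0 * dr * (h / k) * (T[-1] - T_inf)
    Tn[-1] = T[-1] + alpha * dt * (
        (ghost - 2.0 * T[-1] + T[-2]) / dr**2
        + (ghost - T[-2]) / (2.0 * dr * r[-1]))
    T, t = Tn, t + dt

print(f"surface temperature after 1 h: {T[-1]:.1f} C")
```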
1854 The Development of the Quality Management Processes for the Building and Environment of the Basic Education Schools
Authors: Suppara Charoenpoom
Abstract:
The objective of this research was to design and develop quality management processes for the buildings and environment of basic education schools. A mixed quantitative and qualitative research methodology was used. The population sample included 14 directors of primary schools. Two research tools were used: the first consisted of an in-depth interview and a questionnaire; the second consisted of the Quality Business Process and Quality Work Procedure, with a Key Performance Indicator for each activity. The statistics included the mean and standard deviation. The findings for the development of quality management processes for the buildings and environment administration of the basic education schools consisted of one quality business process (QBP) and seven quality work processes (QWP). The result of the experts’ evaluation revealed that the process and implementation of quality management of the school buildings and environment passed the inspection process with consensus. This implies that the process of quality management of the school buildings and environment is suitable for implementation. Moreover, the level of agreement on the feasibility of implementing this plan had means in the range of 0.64-1.00, which suggests that the design of the new plan is acceptable.
Keywords: Process, Building, Environment.
1853 A Critical Review of the Adequacy of EIA Reports – Evidence from Pakistan
Authors: Obaidullah Nadeem, Rizwan Hameed
Abstract:
The preparation of good-quality Environmental Impact Assessment (EIA) reports contributes to enhancing the overall effectiveness of EIA. This component of the EIA process becomes more important in situations where public participation is weak and there is a lack of expertise on the part of the competent authority. In Pakistan, EIA became mandatory in July 1994 for every project likely to cause adverse environmental impacts. The competent authority also formulated guidelines for the preparation and review of EIA reports in 1997. However, EIA has yet to prove itself as a successful decision-support tool for environmental protection. One of the several reasons for this ineffectiveness is the generally poor quality of EIA reports. This paper critically reviews the EIA reports of some randomly selected projects. Interviews with EIA consultants, project proponents and concerned government officials have also been conducted to identify the root causes of the poor quality of EIA reports. The analysis reveals several inadequacies, particularly in areas relating to the identification, evaluation and mitigation of key impacts and the consideration of alternatives. The paper identifies some opportunities and suggests measures for improving the quality of EIA reports and hence making EIA an effective tool for environmental protection.
Keywords: Environmental Impact Assessment, EIA Guidelines, EIA Reports, Pakistan.
1852 How Valid Are Our Language Test Interpretations? A Demonstrative Example
Authors: Masoud Saeedi, Shirin Rahimi Kazerooni, Vahid Parvaresh
Abstract:
Validity is an overriding consideration in language testing. If a test score is intended for a particular purpose, this must be supported through empirical evidence. This article addresses the validity of a multiple-choice achievement test (MCT). The test is administered at the end of each semester to decide on students' mastery of a course in general English. To provide empirical evidence pertaining to the validity of this test, two criterion measures were used: a cloze test and a C-test, which are reported to gauge general English proficiency. The results of the analyses show that there is a statistically significant correlation among participants' scores on the MCT, the cloze test and the C-test. Drawing on the findings of the study, it can be cautiously deduced that these tests measure the same underlying trait. However, allowing for the limitations of using criterion measures to validate tests, we cannot make any absolute claim as to the validity of this MCT.
Keywords: C-test, cloze test, multiple-choice test, validity argument.
1851 A Book Cover as an Expression of Conceptualization and a Tool of Social Identity Construction: The Interpretation Based on the Example of G. Ritzer's book McDonaldization of Society
Authors: Jiří Pavelka
Abstract:
The study is based on the assumption that media products are appropriate subjects for the exploration of social and cultural identities as a keystone of the value orientations of their authors, producers and target audiences. The research object of the study is the title page of the book cover of a professional publication, which serves as a medium of marketing, scientific and intercultural communication and is the result of semiotic and intercultural transfer. The study aims to answer the question of whether the book cover is an expression of conceptualization and a tool for social identity construction. It attempts to determine what value orientations and what concepts of social and cultural identities are hidden in the narrative structures of the book cover of the Czech translation of G. Ritzer's book The McDonaldization of Society (1993), issued in the Czech Republic in 1996, after the fall of the Iron Curtain.
Keywords: Social and cultural identity, book cover, marketing communication, semiotic and intercultural transfer, narrative structure, the McDonaldisation of Society.
1850 Unsupervised Feature Learning by Pre-Route Simulation of Auto-Encoder Behavior Model
Authors: Youngjae Jin, Daeshik Kim
Abstract:
This paper describes cycle-accurate simulation results for the weight values learned by an auto-encoder behavior model in terms of pre-route simulation. Given the results, we visualized the first-layer representations with natural images. Many common deep learning threads have focused on learning high-level abstractions of unlabeled raw data by unsupervised feature learning. However, in the process of handling such a huge amount of data, the learning methods' computational complexity and time have limited advanced research. These limitations came from the fact that these algorithms were computed using only single-core CPUs. For this reason, parallel hardware, FPGAs, was seen as a possible solution to overcome these limitations. We adopted and simulated a ready-made auto-encoder to design a behavior model in Verilog HDL before designing the hardware. With the auto-encoder behavior model pre-route simulation, we obtained cycle-accurate results for the parameters of each hidden layer using ModelSim. The cycle-accurate results are a very important factor in designing parallel-based digital hardware. Finally, this paper shows the appropriate operation of behavior-model-based pre-route simulation. Moreover, we visualized the learned latent representations of the first hidden layer with the Kyoto natural image dataset.
Keywords: Auto-encoder, Behavior model simulation, Digital hardware design, Pre-route simulation, Unsupervised feature learning.
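For reference, the learning rule that the Verilog behavior model reproduces is that of a plain single-hidden-layer auto-encoder; the compact NumPy sketch below shows one gradient step on random patches. The layer sizes, tied weights and squared-error loss are assumptions for illustration, and the random patches stand in for the Kyoto natural image data mentioned above.

```python
import numpy as np

rng = np.random.default_rng(3)
n_visible, n_hidden, batch = 64, 25, 32                  # e.g. 8x8 patches, 25 hidden units
W = 0.1 * rng.standard_normal((n_hidden, n_visible))     # tied weights (decoder = W.T)
b_h, b_v = np.zeros(n_hidden), np.zeros(n_visible)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, lr=0.1):
    """One gradient step of a tied-weight auto-encoder on a batch x of shape (batch, n_visible)."""
    global W, b_h, b_v
    h = sigmoid(x @ W.T + b_h)                           # encode
    x_hat = sigmoid(h @ W + b_v)                         # decode
    err = x_hat - x
    d_out = err * x_hat * (1.0 - x_hat)                  # back through decoder sigmoid
    d_hid = (d_out @ W.T) * h * (1.0 - h)                # back through encoder sigmoid
    W -= lr * (d_hid.T @ x + h.T @ d_out) / len(x)       # tied-weight gradient (both uses of W)
    b_v -= lr * d_out.mean(axis=0)
    b_h -= lr * d_hid.mean(axis=0)
    return float((err ** 2).mean())

# Train on random "patches" (placeholders for natural image patches).
for step in range(2000):
    loss = train_step(rng.random((batch, n_visible)))
print("final reconstruction MSE:", loss)
```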
1849 Intelligent Modeling of the Electrical Activity of the Human Heart
Authors: Lambros V. Skarlas, Grigorios N. Beligiannis, Efstratios F. Georgopoulos, Adam V. Adamopoulos
Abstract:
The aim of this contribution is to present a new approach to modeling the electrical activity of the human heart. A recurrent artificial neural network is used in order to exhibit a subset of the dynamics of the electrical behavior of the human heart. The proposed model can also be used, when integrated, as a diagnostic tool for the human heart system. What makes this approach unique is the fact that every model is developed from physiological measurements of an individual. This kind of approach is very difficult to apply successfully in many modeling problems, because of the complexity and entropy of the free variables describing the complex system. Differences between the modeled variables and the variables of an individual, measured at specific moments, can be used for diagnostic purposes. The sensor fusion used in order to optimize the utilization of biomedical sensors is another point that this paper focuses on. Sensor fusion has been known for its advantages in applications such as the control and diagnostics of mechanical and chemical processes.
Keywords: Artificial Neural Networks, Diagnostic System, Health Condition Modeling Tool, Heart Diagnostics Model, Heart Electricity Model.
1848 Feasibility of Risk Assessment for Type 2 Diabetes in Community Pharmacies Using Two Different Approaches: A Pilot Study in Thailand
Authors: Thitaporn Thoopputra, Tipaporn Pongmesa, Shuchuen Li
Abstract:
Aims: To evaluate the application of a non-invasive diabetes risk assessment tool in the community pharmacy setting. Methods: The Thai diabetes risk score was applied to assess individuals at risk of developing type 2 diabetes. An interactive computer-based risk screening (IT) tool and a paper-based risk screening (PT) tool were applied. Participants aged over 25 years with no known diabetes were recruited in six participating pharmacies. Results: A total of 187 clients took part; their mean age (±SD) was 48.6 (±10.9) years, and 35% were at high risk. The mean willingness-to-pay for the service fee in the IT group was significantly higher than in the PT group (p=0.013). No significant difference was observed in satisfaction between the groups. Conclusions: A non-invasive risk assessment tool, whether paper-based or computer-based, can be applied in the community pharmacy to support the expanding role of pharmacists in chronic disease management. Long-term follow-up is needed to determine the impact of its application on clinical, humanistic and economic outcomes.
Keywords: Community pharmacy, intervention, prevention, risk assessment, type 2 diabetes.
1847 Toward a Use of Ontology to Reinforce Semantic Classification of Messages Based on LSA
Authors: S. Lgarch, M. Khalidi Idrissi, S. Bennani
Abstract:
For better collaboration, asynchronous tools, and particularly discussion forums, are the most used thanks to their flexibility in terms of time. To convey only the messages that belong to a theme of interest to the tutor, in order to help him during his tutoring work, the use of a tool for classifying these messages is indispensable. For this, we have proposed a semantic classification tool for the messages of a discussion forum that is based on LSA (Latent Semantic Analysis) and includes a thesaurus to organize the vocabulary. The benefits offered by a formal ontology can overcome the insufficiencies that a thesaurus exhibits during its use and thus encourage us to use it in our semantic classifier. In this work, we propose the use of some functionalities that an OWL ontology offers. We then explain how functionalities like “ObjectProperty”, “SubClassOf” and the “Datatype” property make our classification more intelligent by way of integrating new terms. The new terms found are generated based on the initial terms introduced by the tutor and the semantic relations described by the OWL formalism.
Keywords: Classification of messages, collaborative communication tools, discussion forum, e-learning, formal description, latente semantic analysis, ontology, owl, semantic relations, semantic web, thesaurus, tutoring.
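The LSA step at the core of the classifier can be illustrated with a term-document matrix and a truncated SVD; the tiny vocabulary, forum messages and theme query below are invented placeholders, and the ontology-driven term expansion (ObjectProperty, SubClassOf, Datatype) would simply add or merge rows of this matrix before the decomposition.

```python
import numpy as np

# Toy forum messages (placeholders) and a theme query from the tutor.
messages = [
    "recursion base case stack",
    "stack overflow recursion depth",
    "binary tree traversal recursion",
    "course schedule exam date",
]
query = "recursion stack"

# Term-document count matrix.
vocab = sorted({w for m in messages for w in m.split()})
A = np.array([[m.split().count(w) for m in messages] for w in vocab], dtype=float)

# Latent Semantic Analysis: rank-k truncated SVD of the term-document matrix.
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Uk, sk, Vk = U[:, :k], s[:k], Vt[:k, :]

# Fold the query into the latent space and rank messages by cosine similarity.
q = np.array([query.split().count(w) for w in vocab], dtype=float)
q_lat = np.diag(1.0 / sk) @ Uk.T @ q
doc_lat = Vk                                            # columns are documents
cos = doc_lat.T @ q_lat / (np.linalg.norm(doc_lat, axis=0) * np.linalg.norm(q_lat) + 1e-12)
for msg, score in sorted(zip(messages, cos), key=lambda p: -p[1]):
    print(f"{score:+.2f}  {msg}")
```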