Search results for: Object Identification
3285 Streamlining .NET Data Access: Leveraging JSON for Data Operations in .NET
Authors: Tyler T. Procko, Steve Collins
Abstract:
New features in .NET (6 and above) permit streamlined access to information residing in JSON-capable relational databases, such as SQL Server (2016 and above). Traditional methods of data access now involve comparatively unnecessary steps which compromise system performance. This work posits that the established ORM (Object Relational Mapping) based methods of data access in applications and APIs result in common issues, e.g., object-relational impedance mismatch. Recent developments in C# and .NET Core, combined with a framework of modern SQL Server coding conventions, have allowed better technical solutions to the problem. As an amelioration, this work details the language features and coding conventions which enable this streamlined approach, resulting in an open-source .NET library implementation called Codeless Data Access (CODA). Canonical approaches rely on ad-hoc mapping code to perform type conversions between the client and back-end database; with CODA, no mapping code is needed, as JSON is freely mapped to SQL and vice versa. CODA streamlines API data access by improving on three aspects of immediate concern to web developers, database engineers and cybersecurity professionals: Simplicity, Speed and Security. Simplicity is engendered by cutting out the “middleman” steps, effectively making API data access a white box, whereas traditional methods are a black box. Speed is improved because fewer translational steps are taken, and security is improved as attack surfaces are minimized. An empirical evaluation of the speed of the CODA approach in comparison to ORM approaches is provided and demonstrates that the CODA approach is significantly faster. CODA presents substantial benefits for API developer workflows by simplifying data access, resulting in better speed and security and allowing developers to focus on productive development rather than being mired in data access code.
Future considerations include a generalization of the CODA method and extension outside of the .NET ecosystem to other programming languages.
Keywords: API data access, database, JSON, .NET core, SQL server
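The "no mapping code" idea can be illustrated outside .NET as well. The sketch below is a Python/SQLite analogy, not CODA itself (CODA is a C#/.NET library, and in SQL Server the serialization happens inside the engine via the FOR JSON clause); the `customer` table and its columns are invented for illustration. The point is that the result set is serialized to JSON generically, with no per-entity classes or hand-written field mapping:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO customer VALUES (?, ?)", [(1, "Ada"), (2, "Grace")])

def query_as_json(conn, sql):
    """Serialize any result set straight to JSON using only column metadata,
    so no ORM entities or ad-hoc mapping code are needed."""
    cur = conn.execute(sql)
    cols = [d[0] for d in cur.description]
    return json.dumps([dict(zip(cols, row)) for row in cur.fetchall()])

payload = query_as_json(conn, "SELECT id, name FROM customer")
```

In SQL Server (2016 and above), appending `FOR JSON PATH` to the query moves even this generic serialization step into the database engine, which is closer to what the abstract describes.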
Procedia PDF Downloads 66
3284 The Great Mimicker: A Case of Disseminated Tuberculosis
Authors: W. Ling, Mohamed Saufi Bin Awang
Abstract:
Introduction: Mycobacterium tuberculosis poses a major health problem worldwide. Central nervous system (CNS) infection by Mycobacterium tuberculosis is one of the most devastating complications of tuberculosis. Despite advancements in the medical field, we have yet to understand the pathophysiology of how Mycobacterium tuberculosis is able to cross the blood-brain barrier (BBB) and infect the CNS. CNS TB may present with nonspecific clinical symptoms that can mimic other diseases and conditions, which makes the diagnosis relatively difficult and challenging. The public has to be informed and educated about the spread of TB, and early identification of TB is important as it is a curable disease. Case Report: A 21-year-old Malay gentleman initially presented to us with symptoms of ear discharge, tinnitus, and right-sided headache for the past year. Further history revealed that the symptoms had been mismanaged and neglected over a period of one year. Initial investigations revealed features of inflammation of the ear. Further imaging showed features of chronic inflammation of the otitis media and an atypical right cerebral abscess with the same characteristic features and consistency. He subsequently underwent a biopsy, and the results revealed Mycobacterium tuberculosis in the otitis media. With these results and the available imaging, we were confident that this was likely a case of disseminated tuberculosis causing CNS TB. Conclusion: We aim to highlight the challenges and difficulties our health care system and public health face in early identification and treatment.
Keywords: central nervous system tuberculosis, intracranial tuberculosis, tuberculous encephalopathy, tuberculous meningitis
Procedia PDF Downloads 189
3283 Water Quality Management Based on Hydrodynamic Approach, Landuse, and Human Intervention in Wulan Delta Central Java Indonesia: Problems Identification and Review
Authors: Lintang Nur Fadlillah, Muh Aris Marfai, M. Widyastuti
Abstract:
A delta is a dynamic area influenced by both marine and river processes. Increasing human population in coastal areas and the needs of daily life exert pressure on deltas, which provide various resources. The Wulan Delta is one of the active deltas in Central Java, Indonesia. It has experienced multiple pressures from both natural and human factors. In order to provide a scientific solution and to analyze the main driving forces in the river delta, we collected evidence from news reports, papers, and publications related to the Wulan Delta. This paper presents a review and problem identification in the Wulan Delta based on a hydrodynamic approach, land use, and the human activities that influence water quality in the delta. A comprehensive overview is needed to formulate the best policies involving local communities and government. The analysis is based on the driving forces that affect the delta estuary and river mouth. Natural factors, in particular hydrodynamics, are governed by tides, waves, runoff, and sediment transport; hydrodynamics in turn affects the mixing process in river estuaries. The main problem is human intervention on land: land use change leads to several problems, such as decreasing water quality. Almost 90% of the delta has been transformed into fish ponds by local communities. Yet they have not applied any water management to treat wastewater before flushing it to the sea and estuary. To understand the environmental condition, we need to assess the water quality of the river delta. The assessment is based on land use as a non-point source of pollution. There are no industries in the Wulan Delta; land use consists of fish ponds, settlements, and agriculture. The samples must represent the land use, to estimate which land uses most influence pollution in the river delta. Hydrodynamic conditions such as high tides and runoff must be considered, because they will affect the mixing process and water quality as well.
To determine the sample sites, we need to involve the local community, in order to give them insight into the process. Furthermore, based on this review and problem identification, recommendations and strategies for water management are formulated.
Keywords: delta, land use, water quality, management, hydrodynamics
Procedia PDF Downloads 250
3282 Defect Identification in Partial Discharge Patterns of Gas Insulated Switchgear and Straight Cable Joint
Authors: Chien-Kuo Chang, Yu-Hsiang Lin, Yi-Yun Tang, Min-Chiu Wu
Abstract:
With the trend of technological advancement, the harm caused by power outages is substantial, mostly due to problems in the power grid. This highlights the necessity for further improvement in the reliability of the power system. In the power system, gas-insulated switchgear (GIS) and power cables play a crucial role. Long-term operation under high voltage can cause insulation materials in the equipment to crack, potentially leading to partial discharges. If these partial discharges (PD) can be analyzed, preventative maintenance and replacement of equipment can be carried out, thereby improving the reliability of the power grid. This research will diagnose defects by identifying three different defects in GIS and three different defects in straight cable joints, for a total of six types of defects. The measured partial discharge data will be converted through phase analysis diagrams and pulse sequence analysis. Discharge features will be extracted using convolutional image processing, and three different deep learning models, CNN, ResNet18, and MobileNet, will be used for training and evaluation. Class Activation Mapping will be utilized to interpret the black-box problem of deep learning models, with each model achieving an accuracy rate of over 95%. Lastly, the overall model performance will be enhanced through an ensemble learning voting method.
Keywords: partial discharge, gas-insulated switches, straight cable joint, defect identification, deep learning, ensemble learning
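The final voting step can be sketched independently of the trained networks. Below is a minimal Python illustration of hard (majority) voting over per-model predictions; the defect labels here are placeholders, not the paper's actual six classes:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-model defect labels by hard (majority) voting."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical outputs of the three models for one PD pattern
# (e.g. CNN, ResNet18, MobileNet):
labels = ["void", "void", "protrusion"]
consensus = majority_vote(labels)
```

With three voters, a simple majority always exists unless all three disagree, in which case `most_common` falls back to the first-seen label; weighted soft voting over class probabilities is a common refinement.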
Procedia PDF Downloads 78
3281 Identification and Quantification of Lisinopril from Pure, Formulated and Urine Samples by Micellar Thin Layer Chromatography
Authors: Sudhanshu Sharma
Abstract:
Lisinopril, (S)-1-[N²-(1-carboxy-3-phenylpropyl)-L-lysyl]-L-proline dihydrate, is a lysine analog of enalaprilat, the active metabolite of enalapril. It is a long-acting, non-sulfhydryl angiotensin-converting enzyme (ACE) inhibitor used for the treatment of hypertension and congestive heart failure at a daily dosage of 10-80 mg. The pharmacological activity of lisinopril has been proved in various experimental and clinical studies. Owing to its importance and widespread use, efforts have been made towards the development of simple and reliable analytical methods. As per our literature survey, lisinopril in pharmaceutical formulations has been determined by various analytical methodologies such as polarography, potentiometry, and spectrophotometry, but most of these analytical methods are not well suited to the identification of lisinopril from clinical samples because of the interferences caused by the amino acids and amino group-containing metabolites present in biological samples. This report is an attempt in the direction of developing a simple and reliable method for on-plate identification and quantification of lisinopril in pharmaceutical formulations as well as from human urine samples, using silica gel H layers developed with a new mobile phase comprising micellar solutions of N-cetyl-N,N,N-trimethylammonium bromide (CTAB). Micellar solutions have found numerous practical applications in many areas of separation science. Micellar liquid chromatography (MLC) has gained immense popularity and wide applicability due to its operational simplicity, cost effectiveness, relative non-toxicity, low aggressiveness and enhanced separation efficiency. The incorporation of aqueous micellar solutions as mobile phases was pioneered by Armstrong and Terrill, who accentuated the importance of TLC where simultaneous separation of ionic or non-ionic species in a variety of matrices is required.
A peculiarity of micellar mobile phases (MMPs) is that they have no macroscopic analogues; as a result, typical separations can be achieved more easily using MMPs than with aqueous-organic mobile phases. Previously, MMPs were successfully employed in TLC-based critical separations of aromatic hydrocarbons, nucleotides, vitamins K1 and K5, o-, m- and p-aminophenol, amino acids, and penicillins. Human urine analysis for the identification of selected drugs and their metabolites has emerged as an important investigative tool in forensic drug analysis. Among all available chromatographic methods, only thin layer chromatography (TLC) enables a simple, fast and effective separation of the complex mixtures present in various biological samples and is recommended as an approved test for forensic drug analysis by federal law. TLC has proved its applicability in the successful separation of bioactive amines, carbohydrates, enzymes, porphyrins and their precursors, alkaloids and drugs from urine samples.
Keywords: lisinopril, surfactant, chromatography, micellar solutions
Procedia PDF Downloads 367
3280 The Connection Between the Semiotic Theatrical System and the Aesthetic Perception
Authors: Păcurar Diana Istina
Abstract:
The indissoluble link between aesthetics and semiotics, and the harmonization and semiotic understanding of the interactions between the viewer and the object being looked at, form the basis of the practical demonstration of the importance of aesthetic perception within the theater performance. The design of a theater performance includes several structures, some considered art forms from the beginning (i.e., the text), others represented by simple, common objects (e.g., scenographic elements), which, if reunited, can trigger a certain aesthetic perception. The team involved in the performance delivers to the audience a series of auditory and visual signs with which they interact. It is necessary to explain some notions about the physiological support for the transformation of different types of stimuli at the level of the cerebral hemispheres. The cortex, considered the superior integration center of external and internal stimuli, permanently processes the information received, but even if the information is delivered at a constant rate, the generated response is individualized and conditioned by a number of factors. Each changing situation represents a new opportunity for the viewer to cope with, developing feelings of different intensities that influence the generation of meanings and, therefore, the management of interactions. In this sense, aesthetic perception depends on the detection of the “correctness” of signs, the forms of which are associated with an aesthetic property. Correctness and aesthetic properties can have positive or negative values. Evaluating the emotions that generate judgment and, implicitly, aesthetic perception, whether we refer to visual or auditory emotions, involves the integration of three areas of interest: valence, arousal and context control.
In this context, superior human cognitive processes (memory, interpretation, learning, attribution of meanings, etc.) help trigger the mechanism of anticipation and, no less importantly, the identification of error. This ability to locate a short circuit produced in a series of successive events is fundamental in the process of forming an aesthetic perception. Our main purpose in this research is to investigate the possible conditions under which aesthetic perception and its minimum content are generated by all these structures and, in particular, by interactions with forms that are not commonly considered aesthetic forms. In order to demonstrate the quantitative and qualitative importance of the categories of signs used to construct a code for reading a certain message, and also to emphasize the importance of the order in which these indices are used, we structured a mathematical analysis centered on the percentage of signs used in a theater performance.
Keywords: semiology, aesthetics, theatre semiotics, theatre performance, structure, aesthetic perception
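The percentage analysis of sign categories mentioned above can be sketched in a few lines of Python; the category names and counts are invented for illustration:

```python
from collections import Counter

def sign_percentages(signs):
    """Percentage share of each sign category in a performance's sign inventory."""
    counts = Counter(signs)
    total = len(signs)
    return {category: 100.0 * n / total for category, n in counts.items()}

# Invented inventory: 6 visual, 3 auditory, 1 textual sign
shares = sign_percentages(["visual"] * 6 + ["auditory"] * 3 + ["textual"])
```

The same tally could be computed per scene to track how the mix of sign categories shifts across the performance.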
Procedia PDF Downloads 89
3279 Bending Tests for the Axial Load Identifications in Space Structures with Unknown Boundary Conditions
Authors: M. Bonopera, N. Tullini, C. C. Chen, T. K. Lin, K. C. Chang
Abstract:
This paper presents the extension of a static method for axial load identification in prismatic beam-columns with uncertain length and unknown boundary conditions belonging to generic space structures, such as columns of space frames or struts and ties of space trusses. The non-destructive method requires knowledge of the beam-column flexural rigidity only. Flexural displacements are measured at five cross sections along the beam-column, which is subjected to an additional vertical load at mid-span. Unlike analogous dynamic methods, any set of experimental data may be used in the identification procedure. The method is verified by means of many numerical and experimental tests on beam-columns having unknown boundary conditions and different slenderness, belonging to three different small-scale space prototypes. Excellent estimates of the tensile and compressive forces are obtained for the elements with higher slenderness and when the greatest possible distance between sensors is adopted. Moreover, the application of larger values of the vertical load and very accurate displacement measurements are required. The method could be an efficacious in-situ technique, considering that safety inspections will become increasingly important in the near future, especially because improvements in material properties have allowed the design of space structures composed of beam-columns with higher slenderness.
Keywords: force identification, in-situ test, space structure, static test
Procedia PDF Downloads 245
3278 Setting up a Prototype for the Artificial Interactive Reality Unified System to Transform Psychosocial Intervention in Occupational Therapy
Authors: Tsang K. L. V., Lewis L. A., Griffith S., Tucker P.
Abstract:
Background: Many children with high-incidence disabilities, such as autism spectrum disorder (ASD), struggle to participate in the community in a socially acceptable manner. Clinical settings are limited in their ability to provide natural, real-life scenarios in which these children can practice the life skills needed to meet their real-life challenges. Virtual reality (VR) offers potential solutions to the existing limitations clinicians face in creating simulated natural environments for their clients to generalize the facilitated skills. Research design: The research aimed to develop a prototype of an interactive VR system to provide realistic and immersive environments for clients to practice skills in. A descriptive qualitative methodology was employed to design and develop the Artificial Interactive Reality Unified System (AIRUS) prototype, which provided insights on how to use advanced VR technology to create simulated real-life social scenarios and enable users to interact with the objects and people inside the virtual environment using natural eye gaze, hand and body movements. The eye-tracking (e.g., selective or joint attention), hand- or body-tracking (e.g., repetitive stimming or fidgeting), and facial-tracking (e.g., emotion recognition) functions allowed behavioral data to be captured and managed in the AIRUS architecture. Impact of project: Instead of using external controllers or sensors, hand-tracking software enabled users to interact naturally with the simulated environment using daily life behaviors such as handshaking and waving to control and interact with the virtual objects and people. The AIRUS prototype offers opportunities for breakthroughs in future VR-based psychosocial assessment and intervention in occupational therapy. Implications for future projects: AI technology can allow more efficient data capturing and interpretation of object identification and human facial emotion recognition at any given moment.
The data points captured can be used to pinpoint our users’ focus and where their interests lie. AI can further help advance the data interpretation system.
Keywords: occupational therapy, psychosocial assessment and intervention, simulated interactive environment, virtual reality
Procedia PDF Downloads 35
3277 Assessing the Theoretical Suitability of Sentinel-2 and Worldview-3 Data for Hydrocarbon Mapping of Spill Events, Using Hydrocarbon Spectral Slope Model
Authors: K. Tunde Olagunju, C. Scott Allen, Freek Van Der Meer
Abstract:
Identification of hydrocarbon oil in remote sensing images is often the first step in monitoring oil during spill events. Most remote sensing methods adopt techniques for hydrocarbon identification to achieve detection in order to model an appropriate cleanup program. Identification on optical sensors allows not only detection but also characterization and quantification. Until recently, in optical remote sensing, quantification and characterization were only potentially possible using high-resolution laboratory and airborne imaging spectrometers (hyperspectral data). Unlike multispectral data, hyperspectral data are not freely available, as this data category is mainly obtained via airborne surveys at present. In this research, two operational high-resolution multispectral satellites (WorldView-3 and Sentinel-2) are theoretically assessed for their suitability for hydrocarbon characterization, using the hydrocarbon spectral slope model (HYSS). This method utilizes the two most persistent hydrocarbon diagnostic/absorption features, at 1.73 µm and 2.30 µm, for hydrocarbon mapping on multispectral data. In this research, spectral measurements of seven different hydrocarbon oils (crude and refined) taken on ten different substrates with a laboratory ASD FieldSpec were convolved to Sentinel-2 and WorldView-3 resolutions, using their full width at half maximum (FWHM) parameters. The resulting hydrocarbon slope values obtained from the studied samples enable clear qualitative discrimination of most hydrocarbons, despite the presence of different background substrates, particularly on WorldView-3. Due to the close conformity of central wavelengths and narrow bandwidths to the key hydrocarbon bands used in HYSS, the statistical significance for qualitative analysis on the WorldView-3 sensor for all studied hydrocarbon oils returned with a 95% confidence level (P-value ˂ 0.01), except for diesel.
Using multifactor analysis of variance (MANOVA), the discriminating power of HYSS is statistically significant for most hydrocarbon-substrate combinations on Sentinel-2 and WorldView-3 FWHM, revealing the potential of these two operational multispectral sensors as rapid response tools for hydrocarbon mapping. One notable exception is highly transmissive hydrocarbons on Sentinel-2 data due to the non-conformity of spectral bands with key hydrocarbon absorptions and the relatively coarse bandwidth (> 100 nm).
Keywords: hydrocarbon, oil spill, remote sensing, hyperspectral, multispectral, hydrocarbon-substrate combination, Sentinel-2, WorldView-3
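A hedged sketch of the slope computation at the core of HYSS (one plausible reading of the method, with made-up reflectance values): reflectances are first averaged within each sensor band's FWHM window, and the slope is then taken between the 1.73 µm and 2.30 µm diagnostic features:

```python
def band_average(wavelengths, reflectances, center, fwhm):
    """Average reflectance inside a sensor band's FWHM window (all in µm)."""
    half = fwhm / 2.0
    vals = [r for w, r in zip(wavelengths, reflectances)
            if center - half <= w <= center + half]
    return sum(vals) / len(vals)

def hyss_slope(r_1730, r_2300):
    """Spectral slope between the 1.73 and 2.30 µm hydrocarbon bands (per µm)."""
    return (r_2300 - r_1730) / (2.300 - 1.730)

# Made-up lab spectrum sampled every 10 nm around the two diagnostic bands:
wl = [1.72, 1.73, 1.74, 2.29, 2.30, 2.31]
rf = [0.40, 0.38, 0.42, 0.22, 0.20, 0.24]
r1 = band_average(wl, rf, 1.730, 0.03)
r2 = band_average(wl, rf, 2.300, 0.03)
slope = hyss_slope(r1, r2)
```

A Gaussian spectral response function weighted by the FWHM would be a more faithful convolution than the flat window used here; the boxcar average keeps the sketch short.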
Procedia PDF Downloads 216
3276 Virtual Science Hub: An Open Source Platform to Enrich Science Teaching
Authors: Enrique Barra, Aldo Gordillo, Juan Quemada
Abstract:
This paper presents the Virtual Science Hub platform. It is an open source platform that combines a social network, an e-learning authoring tool, a video conference service and a learning object repository for science teaching enrichment. These four main functionalities fit very well together. The platform was released in April 2012 and has not stopped growing since. Finally, we present the results of the surveys conducted and the statistics gathered to validate this approach.
Keywords: e-learning, platform, authoring tool, science teaching, educational sciences
Procedia PDF Downloads 397
3275 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks
Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
Abstract:
Seizure is the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is done manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm and by using a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool is able to cover all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data of a single patient retrieved from a publicly available EEG dataset.
Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing
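The sliding-window step of the feature extraction can be illustrated with a toy Python sketch (the window width, step size, and mean/variance features are simplifications of the much richer feature set Training Builder extracts):

```python
from statistics import mean, variance

def sliding_window_features(signal, width, step):
    """Extract (mean, variance) features from each sliding window of a signal."""
    feats = []
    for start in range(0, len(signal) - width + 1, step):
        window = signal[start:start + width]
        feats.append((mean(window), variance(window)))
    return feats

# Toy 8-sample "EEG" trace, 4-sample windows with 50% overlap:
feats = sliding_window_features([0, 1, 0, 1, 0, 1, 0, 1], width=4, step=2)
```

Each feature tuple would then become one row of the classifier's training matrix, labeled seizure or non-seizure by the epileptologist's annotations.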
Procedia PDF Downloads 188
3274 Application of KL Divergence for Estimation of Each Metabolic Pathway Genes
Authors: Shohei Maruyama, Yasuo Matsuyama, Sachiyo Aburatani
Abstract:
The development of methods to annotate unknown gene functions is an important task in bioinformatics. One of the approaches to annotation is the identification of the metabolic pathway that genes are involved in. Gene expression data have been utilized for the identification, since gene expression data reflect various intracellular phenomena. However, it has been difficult to estimate gene function with high accuracy. It is considered that the low accuracy of the estimation is caused by the difficulty of accurately measuring gene expression: even when measured under the same conditions, gene expressions will usually vary. In this study, we proposed a feature extraction method focusing on the variability of gene expressions to estimate genes' metabolic pathways accurately. First, we estimated the distribution of each gene expression from replicate data. Next, we calculated the similarity between all gene pairs by KL divergence, which is a method for calculating the similarity between distributions. Finally, we utilized the similarity vectors as feature vectors and trained a multiclass SVM for identifying the genes' metabolic pathway. To evaluate our developed method, we applied it to budding yeast and trained the multiclass SVM to identify the seven metabolic pathways. As a result, the accuracy calculated by our developed method was higher than that calculated from the raw gene expression data. Thus, our developed method combined with KL divergence is useful for identifying genes' metabolic pathways.
Keywords: metabolic pathways, gene expression data, microarray, Kullback–Leibler divergence, KL divergence, support vector machines, SVM, machine learning
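If each gene's expression distribution is summarized as a univariate Gaussian (an assumption made here for illustration; the paper estimates distributions from replicate data), the KL divergence step has a closed form:

```python
from math import log

def kl_gaussian(mu1, var1, mu2, var2):
    """KL(N(mu1, var1) || N(mu2, var2)) for univariate Gaussians."""
    return 0.5 * (log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def symmetric_divergence(p, q):
    """KL is asymmetric, so a symmetrized sum is a natural pairwise feature."""
    return kl_gaussian(*p, *q) + kl_gaussian(*q, *p)

# Two hypothetical genes, each summarized by (mean, variance) of replicates:
d_same = kl_gaussian(0.0, 1.0, 0.0, 1.0)
d_shift = symmetric_divergence((0.0, 1.0), (1.0, 1.0))
```

Computing this divergence for a gene against every other gene yields the similarity vector that is then fed to the multiclass SVM as a feature vector.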
Procedia PDF Downloads 403
3273 Pathogen Identification of Fusarium Spp. And Chemotypes Associated With Wheat Crown Rot in Hebei Province of China
Authors: Kahsay Tadesse Mawcha, Na Zhang, Xu Yiying, Chang Jiaying, Wenxiang Yang
Abstract:
Fusarium crown rot (FCR) diseased wheat seedlings were collected from different wheat-growing counties in seven different regions (Baoding, Cangzhou, Handan, Hengshui, Langfang, Shijiazhuang, and Xingtai) of Hebei province, China, from 2019 to 2020. One hundred twenty-two Fusarium isolates were isolated from crown rot diseased wheat seedlings and identified morphologically; confirmation was undertaken molecularly, and species-specific PCR was utilized to verify the morphological identification of F. pseudograminearum, F. graminearum, F. asiaticum, and F. culmorum. The predominant Fusarium species associated with wheat crown rot in Hebei province were F. pseudograminearum, F. graminearum, F. asiaticum, and F. culmorum, with isolation frequencies of 85.25%, 12.30%, 1.64%, and 0.81%, respectively. All the Fusarium strains isolated from the different wheat-growing fields were qualitatively tested for toxigenic chemotypes using toxin-specific primers and chemotaxonomically classified into DON, 3-ADON, 15-ADON, and NIV. Among the F. pseudograminearum isolates identified, 84.62% were classified as DON chemotypes, 6.73% as 15-ADON chemotypes, 3.84% as 3-ADON chemotypes, and 4.81% as NIV, as detected by the toxin-specific PCR results. Most of the F. graminearum isolates produced 15-ADON, and only two isolates had NIV chemotypes. F. asiaticum and F. culmorum produced chemotypes of 15-ADON and 3-ADON, respectively. Pathogenicity test results showed that F. pseudograminearum and F. graminearum had strong pathogenicity, and F. asiaticum and F. culmorum had moderate pathogenicity to wheat in Hebei province.
Keywords: crown rot, pathogen, wheat, Fusarium species, mycotoxin
Procedia PDF Downloads 135
3272 Factors Impacting Geostatistical Modeling Accuracy and Modeling Strategy of Fluvial Facies Models
Authors: Benbiao Song, Yan Gao, Zhuo Liu
Abstract:
Geostatistical modeling is the key technique for reservoir characterization; the quality of geological models greatly influences the prediction of reservoir performance, but few studies have been done to quantify the factors impacting geostatistical reservoir modeling accuracy. In this study, 16 fluvial prototype models were established to represent different degrees of geological complexity, and 6 cases ranging from 16 to 361 wells were defined to reproduce all 16 prototype models by different methodologies, including SIS, object-based and MPFS algorithms, together with different constraint parameters. A modeling accuracy ratio was defined to quantify the influence of each factor, and ten realizations were averaged to represent each accuracy ratio under the same modeling condition and parameter association. In total, 5760 simulations were done to quantify the relative contribution of each factor to the simulation accuracy, and the results can be used as a strategy guide for facies modeling under similar conditions. It is found that data density, geological trend and geological complexity have a great impact on modeling accuracy. Modeling accuracy may reach up to 90% when channel sand width reaches 1.5 times the well spacing, under whatever condition, by the SIS and MPFS methods. When well density is low, the contribution of the geological trend may increase the modeling accuracy from 40% to 70%, while the use of a proper variogram may have a very limited contribution for the SIS method. It can be implied that when well data are dense enough to cover simple geobodies, little effort is needed to construct an acceptable model; when geobodies are complex with an insufficient data group, it is better to construct a set of robust geological trends than to rely on a variogram function. For the object-based method, the modeling accuracy does not increase as obviously as for the SIS method with increasing data density, but it keeps a rational appearance when data density is low.
MPFS methods show a trend similar to the SIS method, but the use of a proper geological trend together with a rational variogram may give better modeling accuracy than the MPFS method alone. This implies that the geological modeling strategy for a real reservoir case needs to be optimized by evaluating the dataset, geological complexity, geological constraint information and the modeling objective.
Keywords: fluvial facies, geostatistics, geological trend, modeling strategy, modeling accuracy, variogram
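The accuracy-ratio bookkeeping described above can be sketched as follows (a simplification: facies grids are flattened to label sequences, and the ten-realization average is shown with two toy realizations):

```python
def accuracy_ratio(prototype, realization):
    """Fraction of grid cells whose facies code matches the prototype model."""
    matches = sum(p == r for p, r in zip(prototype, realization))
    return matches / len(prototype)

def mean_accuracy(prototype, realizations):
    """Average the ratio over realizations (ten per condition in the study)."""
    return sum(accuracy_ratio(prototype, r) for r in realizations) / len(realizations)

# Toy 1D facies grids (1 = channel sand, 0 = floodplain):
proto = [1, 1, 0, 0]
ratio = mean_accuracy(proto, [[1, 1, 0, 0], [1, 0, 0, 0]])
```

Averaging over realizations smooths out the stochastic variability of any single simulation, which is why each reported ratio in the study aggregates ten runs.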
Procedia PDF Downloads 264
3271 Outsourcing the Front End of Innovation
Abstract:
The paper presents a new method for efficient innovation process management. Even though innovation management methods, tools and knowledge are well established and documented in the literature, most companies still do not manage innovation efficiently. Especially in SMEs, the front end of innovation (problem identification, idea creation and selection) is often not performed optimally. Our eMIPS methodology represents a sort of "umbrella methodology": a well-defined set of procedures which can be dynamically adapted to the concrete case in a company. In daily practice, various methods (e.g. for problem identification and idea creation) can be applied, depending on the company's needs. It is based on the proactive involvement of the company's employees, supported by the appropriate methodology and external experts. The presented phases are performed via a mixture of face-to-face activities (workshops) and online (eLearning) activities taking place in the eLearning Moodle environment and using other e-communication channels. One part of the outcomes is an identified set of opportunities and concrete solutions ready for implementation. The other, also very important, result concerns innovation competences for the participating employees, related to concrete tools and methods for idea management. In addition, the employees gain strong experience in dynamic, efficient and solution-oriented management of the invention process. The eMIPS also represents a way of establishing or improving the innovation culture in the organization. The first trials in a pilot company showed excellent results regarding the motivation of participants and also the outcomes achieved.
Keywords: creativity, distance learning, front end, innovation, problem
Procedia PDF Downloads 328
3270 Optimizing Oxidation Process Parameters of Al-Li Base Alloys Using Taguchi Method
Authors: Muna K. Abbass, Laith A. Mohammed, Muntaha K. Abbas
Abstract:
The oxidation of Al-Li base alloys containing small amounts of rare earth (RE) oxides, such as 0.2 wt% Y2O3 and 0.2 wt% Nd2O3 particles, has been studied at 400°C, 500°C and 550°C for 60 h in dry air. The alloys used in this study were prepared by melting and casting in a permanent steel mould under a controlled atmosphere. Oxidation kinetics were identified using weight gain per surface area (∆W/A) measurements, while scanning electron microscopy (SEM) and X-ray diffraction analysis were used for the microstructural morphology and phase identification of the oxide scales. It was observed that the oxidation kinetics of all studied alloys follow the parabolic law in most experimental tests at the different oxidation temperatures. It was also found that the alloy containing 0.2 wt% Y2O3 particles possesses the lowest oxidation rate and shows great improvement in oxidation resistance compared to the alloy containing 0.2 wt% Nd2O3 particles and the Al-Li base alloy. In this work, the Taguchi method is applied to estimate the optimum weight gain per area (∆W/A) in the oxidation process of Al-Li base alloys so as to obtain a minimum thickness of the oxidation layer. The Taguchi method is used to formulate the experimental layout, to analyze the effect of each parameter (time, temperature and alloy type) on oxide formation, and to predict the optimal choice for each parameter. The analysis shows that the temperature significantly affects the (∆W/A) parameter.
Keywords: Al-Li base alloy, oxidation, Taguchi method, temperature
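The Taguchi analysis of a response to be minimized, such as (∆W/A), is conventionally based on the smaller-the-better signal-to-noise ratio. The sketch below computes that ratio; the replicate values are illustrative placeholders, not data from the paper.

```python
import math

def sn_smaller_the_better(values):
    """Taguchi smaller-the-better signal-to-noise ratio:
    S/N = -10 * log10(mean(y^2)). A larger S/N means a smaller
    response (here, less weight gain per area)."""
    mean_sq = sum(v * v for v in values) / len(values)
    return -10.0 * math.log10(mean_sq)

# Hypothetical (∆W/A) replicates for two parameter settings
sn_a = sn_smaller_the_better([0.12, 0.15, 0.11])
sn_b = sn_smaller_the_better([0.30, 0.28, 0.33])
# The setting with the higher S/N (here, setting A) is preferred
```

Comparing the S/N ratio across levels of each factor (time, temperature, alloy type) is how the dominant factor, here temperature, would be singled out.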
Procedia PDF Downloads 372
3269 An Evaluative Microbiological Risk Assessment of Drinking Water Supply in the Carpathian Region: Identification of Occurrent Hazardous Bacteria with Quantitative Microbial Risk Assessment Method
Authors: Anikó Kaluzsa
Abstract:
The author aims to introduce and analyze the microbiological safety hazards which indicate the presence of secondary contamination in the water supply system. Since drinking water is a primary food and a basic condition of life, special attention should be paid to its quality. Among the microbiological features found in water there are indicators which are clear evidence of water contamination; based on these, there is no need to perform other diagnostics, because they adequately prove the contamination of the given water supply section. Laboratory analysis can help - both technologically and temporally - to identify contamination, but it matters how long the removal takes and whether the disinfection process takes place in time. Identifying the factors that often occur in the same places, or whose chance of occurrence is greater than average, facilitates this work. The pathogen microbiological risk assessment determines, by means of several features, the microbiological hazards most likely to occur in the Carpathian basin. Among all the microbiological indicators recommended for routine inspection by the World Health Organization, the appearance of Escherichia coli in the water network is of paramount importance, as its presence indicates the potential presence of enteric pathogens or other contaminants in the water network. In addition, the author presents the steps of microbiological risk assessment, analyzing the pathogenic micro-organisms registered as the most critical.
Keywords: drinking water, E. coli, microbiological indicators, risk assessment, water safety plan
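Quantitative microbial risk assessment of the kind named in the title typically rests on a dose-response model. A minimal sketch using the standard exponential model is shown below; the daily dose, the pathogen-specific parameter r and the exposure pattern are hypothetical placeholders, not values from the study.

```python
import math

def infection_risk_exponential(dose, r):
    """Exponential dose-response model commonly used in QMRA:
    P(infection) = 1 - exp(-r * dose), where r is a
    pathogen-specific infectivity parameter."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(daily_risk, days=365):
    """Annualize a constant daily risk: 1 - (1 - p_daily)^days."""
    return 1.0 - (1.0 - daily_risk) ** days

# Hypothetical exposure: 0.01 organisms ingested per day, r = 0.5
p_day = infection_risk_exponential(0.01, 0.5)
p_year = annual_risk(p_day)
```

The annual risk would then be compared against a tolerable-risk benchmark to decide whether the given water supply section needs intervention.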
Procedia PDF Downloads 333
3268 Bacteriological Culture Methods and its Uses in Clinical Pathology
Authors: Prachi Choudhary, Jai Gopal Sharma
Abstract:
Microbial cultures determine the type of organism, its abundance in the tested sample, or both. Culturing is one of the primary diagnostic methods of microbiology and is used to determine the cause of infectious disease by letting the agent multiply in a predetermined medium. Different bacteria produce colonies that may be very distinct from those of other bacterial species. To culture any pathogen or microorganism, we should first know the types of media used in microbiology for culturing. Sub-culturing is also done for various microorganisms if mixed growth is seen in a culture. There are three types of culture media based on consistency - solid, semi-solid, and liquid (broth) media - which are further explained in the report. The Five I's approach is a method for locating, growing, observing, and characterizing microorganisms: inoculation, incubation, isolation, inspection, and identification. To identify bacteria, we culture a sample such as urine, sputum or blood on suitable media; there are different methods of culturing bacteria or microbes, such as the pour plate method, the streak plate method, swabbing by needle, pipetting, inoculation by loop, and spreading by spreader. Bacterial growth is examined after 24 hours of incubation, and an antibiotic susceptibility test is then conducted to establish which antibiotics the bacteria are sensitive or resistant to, and to help identify the bacteria. Antibiotic susceptibility tests are performed by various methods such as the dilution method, the disk diffusion method, and the E test. Medicines are then provided to patients according to antibiotic sensitivity and resistance.
Keywords: inoculation, incubation, isolation, antibiotic susceptibility test, characterizing
Procedia PDF Downloads 82
3267 Workplace Risk Assessment in a Paint Factory
Authors: Rula D. Alshareef, Safa S. Alqathmi, Ghadah K. Alkhouldi, Reem O. Bagabas, Farheen B. Hasan
Abstract:
Safety engineering is among the most crucial considerations in any work environment. Providing mentally, physically, and environmentally safe work conditions must be the top priority of any successful organization. Company X is a local paint production company in Saudi Arabia; in one month, the factory experienced two significant accidents, which indicates that workers' safety is being overlooked. The aim of the research is to examine the risks, assess the root causes and recommend control measures that will eventually contribute to providing a safe workplace. The methodology is divided into three phases: risk identification, assessment, and mitigation. In the identification phase, the team used the Rapid Entire Body Assessment (REBA) and the National Institute for Occupational Safety and Health Lifting Index (NIOSH LI) tools to establish holistic knowledge of the current risks posed to the factory. The physical hazards in the factory were assessed in two different operations, mixing and filling/packaging. In the risk assessment phase, the hazards were analyzed in depth through their severity and impact. Through risk mitigation, the Rapid Entire Body Assessment (REBA) score decreased from 11 to 7, and the National Institute for Occupational Safety and Health Lifting Index (NIOSH LI) was reduced from 5.27 to 1.85.
Keywords: ergonomics, safety, workplace risks, hazards, awkward posture, fatigue, work environment
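The NIOSH LI values reported above come from the revised NIOSH lifting equation. A minimal sketch of that calculation follows; the multiplier values and load are illustrative assumptions, not the task measurements from the study.

```python
def recommended_weight_limit(hm, vm, dm, am, fm, cm, lc=23.0):
    """Revised NIOSH lifting equation:
    RWL = LC * HM * VM * DM * AM * FM * CM, where LC is the load
    constant (23 kg) and the multipliers (each in [0, 1]) come from
    task geometry and the NIOSH frequency/coupling tables."""
    return lc * hm * vm * dm * am * fm * cm

def lifting_index(load_kg, rwl_kg):
    """LI = actual load / RWL; LI > 1 indicates elevated injury risk."""
    return load_kg / rwl_kg

# Hypothetical lifting task; all multipliers are illustrative
rwl = recommended_weight_limit(hm=0.63, vm=0.96, dm=0.91,
                               am=0.90, fm=0.75, cm=0.95)
li = lifting_index(15.0, rwl)
```

Mitigation measures work by raising the multipliers (e.g., bringing the load closer raises HM, reducing lift frequency raises FM), which raises the RWL and thus lowers the LI.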
Procedia PDF Downloads 79
3266 Identification of Social Responsibility Factors within Mega Construction Projects
Authors: Ali Alotaibi, Francis Edum-Fotwe, Andrew Price
Abstract:
Mega construction projects create buildings and major infrastructure to respond to work and life requirements while playing a vital role in promoting any nation's economy. However, the industry is often criticised for not balancing the economic, environmental and social dimensions of its projects, with emphasis typically on one aspect to the detriment of the others. This has resulted in many negative impacts, including environmental pollution, waste throughout the project lifecycle, low productivity, and avoidable accidents. The identification of comprehensive Social Responsibility (SR) indicators, which combine social, environmental and economic aspects, is urgently needed. This is particularly the case in the context of the Kingdom of Saudi Arabia (KSA), which often has mega public construction projects. The aim of this paper is to develop a set of wide-ranging SR indicators which encompass social, economic and environmental aspects unique to the KSA. A qualitative approach was applied to explore relevant indicators through a review of the existing literature, international standards and reports. A list of appropriate indicators was developed, and its comprehensiveness was corroborated by interviews with experts on mega construction projects working with SR concepts in the KSA. The findings present 39 indicators and their metrics, covering 10 economic, 12 environmental and 17 social aspects of SR, mapped against their references. These indicators are a valuable reference for decision-makers and academics in the KSA to understand factors related to SR in mega construction projects. The indicators relate to mega construction projects within the KSA and require validation in a real case scenario or within a different industry to demonstrate their generalisability.
Keywords: social responsibility, construction projects, economic, social, environmental, indicators
Procedia PDF Downloads 168
3265 Care and Support for Infants and Toddlers with Special Needs
Authors: Florence A. Undiyaundeye, Aniashie Akpanke
Abstract:
Early identification of developmental disorders in infants and toddlers is critical for the well-being of children. It is also an integral function of the primary care medical provider and of the early care given in the home or crèche. This paper provides information on infants and toddlers with special needs and on strategies to support them through developmental concerns, so they can cope with challenges in and out of the classroom and interact with their peers without stigmatization or inferiority complex. The target children are from birth through three years of age. There is a strong recommendation for developmental surveillance to be incorporated into every well-child preventive care program, both in training and in the practical stage of formal school settings. The paper posits that any concerns raised during surveillance should be promptly addressed with standardized developmental screening by appropriate health service providers. In addition, screening tests should be administered regularly at ages 9, 18 and 30 months. The paper also establishes that early identification of developmental challenges in infants and toddlers should lead to further developmental and medical evaluation, diagnosis and treatment, including early developmental school intervention and integration and inclusion in teaching and learning for proper career build-up. Children diagnosed with developmental disorders should be identified as children with special needs so that management is initiated; the underlying etiology may also drive a range of treatments for the child, counselling for parents, and school integration, as applicable to the child's specific needs and care for sustained societal functioning.
Keywords: care, special need, support, infants and toddlers, management and developmental disorders
Procedia PDF Downloads 388
3264 An Object-Based Image Resizing Approach
Authors: Chin-Chen Chang, I-Ta Lee, Tsung-Ta Ke, Wen-Kai Tai
Abstract:
Common methods for resizing an image include scaling and cropping. However, these two approaches have quality problems for reduced images. In this paper, we propose an image resizing algorithm that separates the main objects from the background. First, we extract two feature maps, namely an enhanced visual saliency map and an improved gradient map, from an input image. After that, we integrate these two feature maps into an importance map. Finally, we generate the target image using the importance map. The proposed approach obtains the desired results for a wide range of images.
Keywords: energy map, visual saliency, gradient map, seam carving
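The integration of the two feature maps into an importance map can be illustrated with a short sketch. The abstract does not specify the integration formula, so the per-map normalization and the blending weight alpha below are assumptions.

```python
import numpy as np

def importance_map(saliency, gradient, alpha=0.5):
    """Combine a visual-saliency map and a gradient map into a single
    importance map by normalizing each to [0, 1] and blending them.
    alpha is an assumed blending weight, not taken from the paper."""
    def normalize(m):
        m = m.astype(float)
        rng = m.max() - m.min()
        return (m - m.min()) / rng if rng > 0 else np.zeros_like(m)
    return alpha * normalize(saliency) + (1 - alpha) * normalize(gradient)

# Toy 3x3 feature maps; real maps are computed from the input image
sal = np.array([[0, 1, 2], [1, 3, 1], [0, 1, 0]])
grad = np.array([[2, 0, 0], [0, 4, 0], [0, 0, 2]])
imp = importance_map(sal, grad)
```

A content-aware resizer such as seam carving (listed in the keywords) would then remove the seams passing through the lowest-importance pixels.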
Procedia PDF Downloads 476
3263 Youth NEET in Albania: Current Situation and Outreach Mechanisms
Authors: Emiljan Karma
Abstract:
One of the major problems of the present is young people who are not in employment, education, or training (NEETs). Unfortunately, this group in Albania makes up a considerable share of working-age people and, despite the measures taken, remains a major problem. NEETs in Albania are very heterogeneous. Youth unemployment and the inactivity rate are at very high levels (Albania has the highest NEET rate among EU candidate/potential candidate countries and EU countries); the high NEET rate in Albania means that the government agencies responsible for labour market regulation and other social actors interested in the phenomenon (representatives of employees, representatives of employers, non-governmental organizations, etc.) have not effectively materialized policies in the field of youth employment promotion. The National Agency for Employment and Skills (NAES) delivers measures specifically designed to target unemployed youth, being the key stakeholder in the implementation of employment policies and skills development in Albania. In the context of identifying and assisting NEETs, this role becomes even stronger. The experience of different EU countries (e.g., the Youth Guarantee) indicates that there are different policy-making structures and various outreach mechanisms for constraining the youth NEET phenomenon. The purpose of this research is to highlight: (1) the identifying features of NEETs in Albania; (2) tailored and efficient outreach mechanisms to assist vulnerable NEETs; (3) the fundamental importance of stakeholder partnership at the central and regional levels.
Keywords: labor market, NEETs, non-registered NEETs, unemployment
Procedia PDF Downloads 274
3262 The Identification of Combined Genomic Expressions as a Diagnostic Factor for Oral Squamous Cell Carcinoma
Authors: Ki-Yeo Kim
Abstract:
Trends in genetics are shifting toward identifying differential co-expression of correlated genes rather than significant individual genes. Moreover, it is known that a combined biomarker pattern improves the discrimination of a specific cancer. The identification of combined biomarkers is also necessary for the early detection of invasive oral squamous cell carcinoma (OSCC). To identify a combined biomarker that could improve the discrimination of OSCC, we explored the appropriate number of genes in a combined gene set so as to attain the highest accuracy. After detecting a significant gene set containing the pre-defined number of genes, a combined expression was identified using the weights of the genes in the gene set. We used Principal Component Analysis (PCA) for the weight calculation. In this process, we used three public microarray datasets: one for identifying the combined biomarker and the other two for validation. Discrimination accuracy was measured by the out-of-bag (OOB) error. There was no relation between significance and discrimination accuracy for individual genes, and the identified gene sets included both significant and insignificant genes. One of the most significant gene sets for classifying normal tissue and OSCC included MMP1, SOCS3 and ACOX1. Furthermore, for discriminating oral dysplasia from OSCC, two combined biomarkers were identified. The combined genomic expression achieved better performance in discriminating the different conditions than any single significant gene. Therefore, accurate diagnosis of cancer could be possible with a combined biomarker.
Keywords: oral squamous cell carcinoma, combined biomarker, microarray dataset, correlated genes
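The PCA-based weighting described above can be sketched as follows: the combined expression for a gene set is the projection of the centered expression matrix onto the first principal component. The toy data below are random placeholders, not the microarray values from the study.

```python
import numpy as np

def combined_expression(X):
    """Score each sample by the first principal component of a small
    gene set: a weighted sum of centered expressions, with weights
    given by the leading eigenvector of the covariance matrix."""
    Xc = X - X.mean(axis=0)                    # center each gene
    cov = np.cov(Xc, rowvar=False)             # genes-by-genes covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    w = eigvecs[:, -1]                         # eigenvector of largest eigenvalue
    return Xc @ w, w                           # per-sample score, gene weights

# Toy matrix: 6 samples x 3 genes (e.g., MMP1, SOCS3, ACOX1)
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))
scores, weights = combined_expression(X)
```

The per-sample score would then feed a classifier (e.g., a random forest, whose OOB error the abstract uses) in place of the individual gene expressions.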
Procedia PDF Downloads 423
3261 The Influence of Superordinate Identity and Group Size on Group Decision Making through Discussion
Authors: Lin Peng, Jin Zhang, Yuanyuan Miao, Quanquan Zheng
Abstract:
Group discussion and group decision-making have long been topics of research interest. Traditional research on group decision-making typically focuses on strategies or functional models for combining members' preferences to reach an optimal consensus. In this research, we explore the natural process of group decision-making through discussion and examine two relevant, influential factors: the common superordinate identity shared by the group, and group size. We manipulated the social identity of the groups into either a shared superordinate identity or different subgroup identities. We also manipulated the size to form either big (6-8 person) or small (3-person) groups. Using experimental methods, we found that members of a superordinate identity group tend to modify more of their own opinions through the discussion, compared to those identifying only with their subgroups. Members of superordinate identity groups also formed stronger identification with the group decision - the result of the group discussion - than their subgroup peers. We also found more member modification in bigger groups than in smaller groups. Evaluations of decisions before and after discussion, as well as of the group decisions themselves, are strongly linked to group identity: members of superordinate groups feel more confident and satisfied with both the results and the decision-making process. Members' opinions are more similar and homogeneous in smaller groups than in bigger groups. This research has many implications for further research and applied behavior in organizations.
Keywords: group decision making, group size, identification, modification, superordinate identity
Procedia PDF Downloads 307
3260 Communicative Language between Doctors and Patients in Healthcare
Authors: Anita Puspawati
Abstract:
Failures to obtain informed consent from patients occur because doctors lack effective communication skills. Language is therefore very important in communication between doctor and patient. This study uses the descriptive analysis method, a method used mainly in researching the status of a group of people, an object, a condition, a system of thought or a class of events in the present. The result of this study indicates that communicative language between doctors and patients increases patients' trust in their doctors and, accordingly, patients provide informed consent voluntarily.
Keywords: communicative, language, doctor, patient
Procedia PDF Downloads 292
3259 Modified Gold Screen Printed Electrode with Ruthenium Complex for Selective Detection of Porcine DNA
Authors: Siti Aishah Hasbullah
Abstract:
Studies on the identification of pork content in food have grown rapidly to meet the Halal food standard in Malaysia. The mitochondrial DNA (mtDNA) approach to identifying pig species is thought to be the most precise, because mtDNA genes are present in thousands of copies per cell and mtDNA is highly variable. The standard method commonly used for DNA detection is based on the polymerase chain reaction (PCR) combined with gel electrophoresis, but it has major drawbacks: it is laborious, time-consuming and toxic to handle. The need for a simple and fast DNA assay is therefore vital and has triggered us to develop DNA biosensors for porcine DNA detection. The aim of this project is to develop an electrochemical DNA biosensor based on a ruthenium(II) complex, [Ru(bpy)2(p-PIP)]2+, as the DNA hybridization label. The interaction of DNA with the complex will be studied by electrochemical transduction using a Gold Screen-Printed Electrode (GSPE) modified with gold nanoparticles (AuNPs) and succinimide acrylic microspheres. The electrochemical detection by the redox-active ruthenium(II) complex was measured by cyclic voltammetry (CV) and differential pulse voltammetry (DPV). The results indicate that the interaction of the complex with the hybridized complementary DNA gives a higher response than with single-stranded and mismatched complementary DNA. Under optimized conditions, this porcine DNA biosensor incorporating the modified GSPE shows a good linear range towards porcine DNA.
Keywords: gold, screen printed electrode, ruthenium, porcine DNA
Procedia PDF Downloads 309
3258 Magnetic Resonance Imaging in Children with Brain Tumors
Authors: J. R. Ashrapov, G. A. Alihodzhaeva, D. E. Abdullaev, N. R. Kadirbekov
Abstract:
Diagnosis of brain tumors is challenging, as several central nervous system diseases present the same symptoms. Modern diagnostic techniques such as CT and MRI help to significantly improve surgery and allow postoperative complications in neurosurgery to be identified in time. Purpose: To study the MRI characteristics and localization of brain tumors in children and to detect complications in the postoperative period. Materials and methods: A retrospective study of the treatment of 62 children with brain tumors, aged 2 to 5 years, was performed. Results: Brain MRI of the 62 patients revealed a brain tumor in 52 (83.8%) cases. The distribution of brain tumors found on MRI was: 15 (24.1%) glioblastomas, 21 (33.8%) astrocytomas, 7 (11.2%) medulloblastomas, and 9 (14.5%) tumors of other origin (craniopharyngiomas, chordoma of the skull base). MRI revealed the following characteristic features: a heterogeneous MRI signal, hyper- and hypointense in T1 and T2 modes, with varying degrees of perifocal swelling and involvement of the brain vessels. The main objectives of a postoperative MRI study are the identification of early or late postoperative complications, evaluation of the radicality of the surgery, and identification of continued tumor growth (at 3-4 weeks). MRI was performed in the following cases: 1. Suspicion of a hematoma (3 days or more); 2. Suspicion of continued tumor growth (at 3-4 weeks). Conclusions: Magnetic resonance imaging is a highly informative method for diagnosing brain tumors in children. MRI also helps to determine the effectiveness and tactics of treatment and the follow-up in the postoperative period.
Keywords: brain tumors, children, MRI, treatment
Procedia PDF Downloads 145
3257 Astronomical Object Classification
Authors: Alina Muradyan, Lina Babayan, Arsen Nanyan, Gohar Galstyan, Vigen Khachatryan
Abstract:
We present a photometric method for identifying stars, galaxies and quasars in multi-color surveys, which uses a library of more than 65,000 color templates for comparison with observed objects. The method aims to extract the information content of object colors in a statistically correct way, and performs classification as well as redshift estimation for galaxies and quasars in a unified approach based on the same probability density functions. For the redshift estimation, we employ an advanced version of the Minimum Error Variance estimator, which determines the redshift error from the redshift-dependent probability density function itself. The method was originally developed for the Calar Alto Deep Imaging Survey (CADIS), but is now used in a wide variety of survey projects. We checked its performance by spectroscopy of CADIS objects, where the method provides high reliability (6 errors among 151 objects with R < 24), especially for the quasar selection, and redshifts accurate within σz ≈ 0.03 for galaxies and σz ≈ 0.1 for quasars. For an optimization of future survey efforts, a few model surveys are compared, designed to use the same total amount of telescope time but different sets of broad-band and medium-band filters. Their performance is investigated by Monte Carlo simulations as well as by analytic evaluation in terms of classification and redshift estimation. If photon noise were the only error source, broad-band and medium-band surveys should perform equally well, as long as they provide the same spectral coverage. In practice, medium-band surveys show superior performance due to their higher tolerance for calibration errors and cosmic variance. Finally, we discuss the relevance of color calibration and derive important conclusions for the issues of library design and choice of filters. The calibration accuracy poses strong constraints on an accurate classification, which are most critical for surveys with few, broad and deeply exposed filters, but less severe for surveys with many, narrow and less deep filters.
Keywords: VO, ArVO, DFBS, FITS, image processing, data analysis
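A heavily simplified sketch of template-based classification is shown below: each observed object is compared against library template colors and assigned the label with the lowest chi-squared. The actual method works with full probability density functions and a far larger library; the colors, errors and templates here are hypothetical.

```python
import numpy as np

def classify_by_templates(colors, color_err, templates):
    """Pick the template (star/galaxy/quasar) whose colors minimize
    chi^2 = sum(((observed - template) / error)^2). A stand-in for
    the probability-density approach described in the abstract."""
    best_label, best_chi2 = None, np.inf
    for label, tmpl in templates.items():
        chi2 = float(np.sum(((colors - tmpl) / color_err) ** 2))
        if chi2 < best_chi2:
            best_label, best_chi2 = label, chi2
    return best_label, best_chi2

# Hypothetical two-color observation and a three-entry template library
obs = np.array([0.4, 1.1])
err = np.array([0.05, 0.05])
lib = {"star": np.array([0.6, 1.6]),
       "galaxy": np.array([0.4, 1.0]),
       "quasar": np.array([-0.1, 0.3])}
label, chi2 = classify_by_templates(obs, err, lib)
```

Redshift estimation extends the same idea by including redshifted versions of each galaxy and quasar template in the library and reading the redshift off the best-fitting entry.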
Procedia PDF Downloads 78
3256 Trusting the Eyes: The Changing Landscape of Eyewitness Testimony
Authors: Manveen Singh
Abstract:
Since the very advent of law enforcement, eyewitness testimony has played a pivotal role in identifying, arresting and convicting suspects. Relying heavily on the accuracy of human memory, nothing seems to carry more weight with the judiciary than the testimony of an actual witness. The acceptance of eyewitness testimony as a substantive piece of evidence lies embedded in the assumption that the human mind is adept at recording and storing events. Research, though, has proven otherwise. Having carried out extensive study in the field of eyewitness testimony over the past 40 years, psychologists have concluded that human memory is fragile and needs to be treated carefully. The question that arises, then, is: how reliable is eyewitness testimony? The credibility of eyewitness testimony, simply put, depends on several factors, leaving it reliable at times and not so much at others. This is further substantiated by the claim that, as per scientific research, over 75 percent of eyewitness testimonies may be in error, with quite a few of these cases resulting in life sentences. Although the advancement of scientific techniques, especially DNA testing, has helped overturn many of these eyewitness-testimony-based convictions, eyewitness identifications continue to form the backbone of most police investigations and courtroom decisions to date. What, then, is the solution to this long-standing concern regarding the accuracy of eyewitness accounts? The present paper shall analyze the linkage between human memory and eyewitness identification, as well as the various factors governing the credibility of eyewitness testimonies. Furthermore, it shall elaborate upon some best practices developed over the years to help reduce mistaken identifications and, in the process, trace the changing landscape of eyewitness testimony amidst the evolution of DNA and trace evidence.
Keywords: DNA, eyewitness, identification, testimony, evidence
Procedia PDF Downloads 328