Search results for: minority forms of information processing
15191 Global Navigation Satellite System and Precise Point Positioning as Remote Sensing Tools for Monitoring Tropospheric Water Vapor
Authors: Panupong Makvichian
Abstract:
Global Navigation Satellite System (GNSS) is nowadays a common technology that improves navigation functions in our lives. In addition, GNSS is increasingly being employed as an accurate atmospheric sensor. Meteorology is a practical application of GNSS that goes largely unnoticed in everyday life. GNSS Precise Point Positioning (PPP) is a positioning method that requires data from a single dual-frequency receiver together with precise information about satellite positions and satellite clocks. In addition, careful attention to mitigating various error sources is required. All the above data are combined in a sophisticated mathematical algorithm. This research demonstrates how GNSS and the PPP method are capable of providing high-precision estimates, such as 3D positions or Zenith tropospheric delays (ZTDs). ZTDs, combined with pressure and temperature information, allow us to estimate the water vapor in the atmosphere as precipitable water vapor (PWV). If the process is replicated for a network of GNSS sensors, we can create thematic maps that allow water content information to be extracted at any location within the network area. All of the above is possible thanks to advances in GNSS data processing. Therefore, we are able to use GNSS data for climatic trend analysis and for acquiring further knowledge about the atmospheric water content.
Keywords: GNSS, precise point positioning, Zenith tropospheric delays, precipitable water vapor
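As an editorial illustration of the conversion described above, the ZTD-to-PWV step can be sketched in a few lines. The constants below are the standard Saastamoinen and Bevis values commonly used for this conversion; they are assumptions for the sketch, not values taken from the paper.

```python
import math

# Standard conversion constants (assumed, not from the paper):
K2_PRIME = 16.52   # K/hPa, modified refractivity constant k2'
K3 = 3.776e5       # K^2/hPa, refractivity constant k3
RV = 461.5         # J/(kg K), specific gas constant of water vapor
RHO_W = 1000.0     # kg/m^3, density of liquid water

def zenith_hydrostatic_delay(pressure_hpa, lat_rad=0.0, height_m=0.0):
    """Saastamoinen model for the hydrostatic part of the delay (metres)."""
    return 0.0022768 * pressure_hpa / (
        1.0 - 0.00266 * math.cos(2.0 * lat_rad) - 0.28e-6 * height_m)

def pwv_from_ztd(ztd_m, pressure_hpa, tm_kelvin, lat_rad=0.0, height_m=0.0):
    """Convert a PPP-estimated ZTD into precipitable water vapor (metres)."""
    zwd = ztd_m - zenith_hydrostatic_delay(pressure_hpa, lat_rad, height_m)
    # Dimensionless conversion factor (~0.15 for typical mean temperatures Tm)
    pi_factor = 1.0e8 / (RHO_W * RV * (K3 / tm_kelvin + K2_PRIME))
    return pi_factor * zwd
```

For typical mid-latitude values (ZTD ≈ 2.4 m, P ≈ 1013 hPa, Tm ≈ 270 K) this yields a PWV of roughly 13 mm, consistent with the rule of thumb PWV ≈ 0.15 × ZWD.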
Procedia PDF Downloads 198
15190 Women’s Colours in Digital Innovation
Authors: Daniel J. Patricio Jiménez
Abstract:
Digital reality demands new ways of thinking, flexibility in learning, acquisition of new competencies, visualizing reality under new approaches, generating open spaces, understanding dimensions in continuous change, etc. We need inclusive growth, where colours are not lacking, where lights do not give a distorted reality, where science is not a half-truth. In carrying out this study, a documentary and bibliographic collection was used, providing a reflective and analytical view of current reality. In this context, deductive and inductive methods have been applied to different multidisciplinary information sources. Women today and tomorrow are a strategic element in science and the arts, which, under the umbrella of sustainability, implies ‘meeting current needs without detriment to future generations’. We must build new scenarios, which treat ‘the feminine and the masculine’ as an inseparable whole, encouraging cooperative behavior; nothing is exclusive or excluding, and that is where true respect for diversity must be based. We are all part of an ecosystem, which we will make better as long as there is a real balance in terms of gender. It is the time of ‘the lifting of the veil’; in other words, it is the time to discover the pseudonyms, the women who painted, wrote, investigated, recorded advances, etc. However, current reality demands much more; we must remove doors where they are not needed. Mass processing of data, big data, needs to incorporate algorithms under the perspective of ‘the feminine’. However, most STEM (science, technology, engineering, and math) students are men. Our way of doing science is biased, focused on honors and short-term results to the detriment of sustainability. Historically, the canons of beauty, the ways of looking, of perceiving, of feeling, depended on the circumstances and interests of each moment, and women had no voice in this.
Parallel to science, there is an under-representation of women in the arts: not so much in the universities, but when we look at galleries, museums, art dealers, etc., colours impoverish the gaze and once again highlight the gender gap and the silence of the feminine. Art registers sensations by divining the future; science will turn them into reality. The uniqueness of the so-called new normality requires women to be protagonists both in new forms of emotion and thought and in the experimentation and development of new models. This will result in women playing a decisive role in the so-called "5.0 society" or, in other words, in a more sustainable, more humane world.
Keywords: art, digitalization, gender, science
Procedia PDF Downloads 165
15189 Effect of Citric Acid and Clove on Cured Smoked Meat: A Traditional Meat Product
Authors: Esther Eduzor, Charles A. Negbenebor, Helen O. Agu
Abstract:
Smoking of meat enhances the taste and look of meat, increases its longevity, and helps preserve it by slowing down the spoilage of fat and the growth of bacteria. Lean meat from the forequarter of a beef carcass was obtained from the Maiduguri abattoir. The meat was cut into four portions with weights ranging from 525-545 g, then cut into bits measuring about 8 cm in length and 3.5 cm in thickness and weighing 64.5 g. Meat samples were washed, cured with various concentrations of sodium chloride, sodium nitrate, citric acid and clove for 30 min, drained and smoked in a smoking kiln at a temperature range of 55-60°C for 8 hr a day for 3 days. The products were stored at ambient temperature and evaluated microbiologically and organoleptically. Over the course of processing and storage there were increases in pH and free fatty acid content, and a decrease in water holding capacity and microbial count of the cured smoked meat. The panelists rated the control samples significantly (p < 0.05) higher in terms of colour, texture, taste and overall acceptability. The following organisms were isolated and identified during storage: Bacillus species, Bacillus subtilis, Streptococcus, Pseudomonas, Aspergillus niger, Candida and Penicillium species. The study forms a basis for new product development in the meat industry.
Keywords: citric acid, cloves, smoked meat, bioengineering
Procedia PDF Downloads 445
15188 Formulating Rough Approximations in Information Tables with Possibilistic Information
Authors: Michinori Nakata, Hiroshi Sakai
Abstract:
A rough set, which consists of lower and upper approximations, is formulated in information tables containing possibilistic information. First, lower and upper approximations based on possible world semantics, in the same way as Lipski used in the field of incomplete databases, are shown in order to clarify the fundamentals of rough sets under possibilistic information. Possibility and necessity measures are used, as is done in possibilistic databases. As a result, each object has certain and possible membership degrees to the lower and upper approximations, and these degrees are the lower and upper bounds. Therefore, the degree to which an object belongs to the lower and upper approximations is expressed by an interval value. Moreover, the complementary property linking the lower and upper approximations holds, as is valid under complete information. Second, the approach based on indiscernibility relations, which was proposed by Dubois and Prade, is extended in three cases. The first case is that objects used to approximate a set of objects are characterized by possibilistic information. The second case is that objects used to approximate a set of objects with possibilistic information are characterized by complete information. The third case is that objects that are characterized by possibilistic information approximate a set of objects with possibilistic information. The extended approach creates the same results as the approach based on possible world semantics. This justifies our extension.
Keywords: rough sets, possibilistic information, possible world semantics, indiscernibility relations, lower approximations, upper approximations
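For readers unfamiliar with the baseline the paper extends, the classical (complete-information) lower and upper approximations over indiscernibility classes can be sketched as follows; the possibilistic, interval-valued version described in the abstract is a generalization of this.

```python
def lower_approximation(classes, target):
    """Union of the indiscernibility classes fully contained in the target set."""
    result = set()
    for c in classes:
        if c <= target:        # the whole class certainly belongs to the target
            result |= c
    return result

def upper_approximation(classes, target):
    """Union of the indiscernibility classes that overlap the target set."""
    result = set()
    for c in classes:
        if c & target:         # the class possibly belongs to the target
            result |= c
    return result
```

The complementary property mentioned in the abstract holds here as well: the lower approximation of a set equals the complement of the upper approximation of its complement.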
Procedia PDF Downloads 321
15187 Genomic Sequence Representation Learning: An Analysis of K-Mer Vector Embedding Dimensionality
Authors: James Jr. Mashiyane, Risuna Nkolele, Stephanie J. Müller, Gciniwe S. Dlamini, Rebone L. Meraba, Darlington S. Mapiye
Abstract:
When performing language tasks in natural language processing (NLP), the dimensionality of word embeddings is chosen either ad hoc or by optimizing the Pairwise Inner Product (PIP) loss. The PIP loss is a metric that measures the dissimilarity between word embeddings, and it is obtained through matrix perturbation theory by utilizing the unitary invariance of word embeddings. Unlike in natural language processing, in genomics, especially in genome sequence processing, there is no notion of a “word”; rather, there are sequence substrings of length k called k-mers. K-mer sizes matter, and they vary depending on the goal of the task at hand. The dimensionality of word embeddings in NLP has been studied using matrix perturbation theory and the PIP loss. In this paper, the sufficiency and reliability of applying word-embedding algorithms to various genomic sequence datasets are investigated to understand the relationship between the k-mer size and the embedding dimension. This is done by studying the scaling capability of three embedding algorithms, namely Latent Semantic Analysis (LSA), Word2Vec, and Global Vectors (GloVe), with respect to the k-mer size. Utilising the PIP loss as a metric to train embeddings on different datasets, we also show that Word2Vec outperforms LSA and GloVe in accurately computing embeddings as both the k-mer size and the vocabulary increase. Finally, the shortcomings of natural language processing embedding algorithms in performing genomic tasks are discussed.
Keywords: word embeddings, k-mer embedding, dimensionality reduction
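The k-mer tokenization that replaces the notion of a "word" in this setting can be sketched as follows (a minimal illustration, not the authors' pipeline); note how the vocabulary can grow toward 4^k distinct k-mers as k increases, which is the scaling pressure the embedding algorithms must cope with.

```python
def kmers(sequence, k):
    """Slide a window of length k over the sequence, producing overlapping k-mers."""
    return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

def vocabulary(sequences, k):
    """The set of distinct k-mers observed across a collection of sequences."""
    vocab = set()
    for seq in sequences:
        vocab.update(kmers(seq, k))
    return vocab
```

For example, kmers("ATGCA", 3) yields ["ATG", "TGC", "GCA"].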
Procedia PDF Downloads 137
15186 Detection of Phoneme [S] Mispronunciation for Sigmatism Diagnosis in Adults
Authors: Michal Krecichwost, Zauzanna Miodonska, Pawel Badura
Abstract:
The diagnosis of sigmatism is mostly based on the observation of articulatory organs. It is, however, not always possible to precisely observe the vocal apparatus, in particular in the oral cavity of the patient. Speech processing can help to objectify the therapy and simplify the verification of its progress. In the described study, a methodology for the classification of the incorrectly pronounced phoneme [s] is proposed. The recordings come from adults. They were registered with a speech recorder at a sampling rate of 44.1 kHz and a resolution of 16 bit. A database of pathological and normative speech was collected for the study, including reference assessments provided by speech therapy experts. Ten adult subjects were asked to simulate a certain type of sigmatism under the supervision of a speech therapy expert. In the recordings, the analyzed phone [s] was surrounded by vowels, viz: ASA, ESE, ISI, SPA, USU, YSY. Thirteen MFCCs (mel-frequency cepstral coefficients) and RMS (root mean square) values are calculated within each frame belonging to the analyzed phoneme. Additionally, 3 fricative formants along with the corresponding amplitudes are determined for the entire segment. In order to aggregate the information within the segment, the average value of each MFCC coefficient is calculated. All features of other types are aggregated by means of their 75th percentile. The proposed method of feature aggregation reduces the size of the feature vector used in classification. A binary SVM (support vector machine) classifier is employed at the phoneme recognition stage. The first group consists of pathological phones, while the other consists of normative ones. The proposed feature vector yields classification sensitivity and specificity measures above the 90% level in the case of individual logo phones. The employment of fricative formants-based information improves the sole-MFCC classification results by an average of 5 percentage points.
The study shows that the employment of specific parameters for the selected phones improves the efficiency of pathology detection compared to traditional methods of speech signal parameterization.
Keywords: computer-aided pronunciation evaluation, sibilants, sigmatism diagnosis, speech processing
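The segment-level aggregation described above (mean per MFCC coefficient, 75th percentile for the remaining features) can be sketched as follows; the frame values are illustrative, and the percentile uses the same linear interpolation as common numerical libraries.

```python
def percentile_75(values):
    """75th percentile with linear interpolation between ranked samples."""
    ranked = sorted(values)
    idx = 0.75 * (len(ranked) - 1)
    lo = int(idx)
    frac = idx - lo
    if lo + 1 < len(ranked):
        return ranked[lo] * (1.0 - frac) + ranked[lo + 1] * frac
    return ranked[lo]

def aggregate_segment(mfcc_frames, other_features):
    """mfcc_frames: per-frame MFCC vectors; other_features: name -> per-frame values.
    Returns one fixed-size feature description per segment."""
    mfcc_means = [sum(coeff) / len(coeff) for coeff in zip(*mfcc_frames)]
    others = {name: percentile_75(vals) for name, vals in other_features.items()}
    return mfcc_means, others
```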
Procedia PDF Downloads 283
15185 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks
Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
Abstract:
Seizure is the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made using continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is made manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool is able to cover all the data preparation steps ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data of a single patient retrieved from a publicly available EEG dataset.
Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing
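The sliding window paradigm mentioned above can be sketched as follows; the "line length" feature is a common EEG descriptor used here purely as an illustration, not necessarily one of the features extracted by Training Builder.

```python
def sliding_windows(signal, window_len, step):
    """Split a 1-D signal into fixed-length, possibly overlapping windows."""
    return [signal[i:i + window_len]
            for i in range(0, len(signal) - window_len + 1, step)]

def window_features(window):
    """Toy per-window features: mean amplitude and line length."""
    mean = sum(window) / len(window)
    line_length = sum(abs(window[i] - window[i - 1]) for i in range(1, len(window)))
    return [mean, line_length]
```

Each window's feature vector would then be fed to the classifier, so that a long recording becomes a sequence of fixed-size examples.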
Procedia PDF Downloads 188
15184 Phishing Attacks Facilitated by Open Source Intelligence
Authors: Urva Maryam
Abstract:
Information has become an important asset in the modern world. Globally, various tactics are being observed to confine the spread of information, as it makes people vulnerable to security attacks. Open Source Intelligence (OSINT) draws on publicly available sources that disseminate information about users, websites, companies, and various organizations. This paper focuses on a quantitative method of exploring various OSINT tools that reveal public information about individuals. This information could further facilitate phishing attacks. Phishing attacks can be launched against email addresses, open ports, and insecure web-surfing. This study analyzes the information retrieved from OSINT tools, i.e., theHarvester and Maltego, that can be used to send phishing attacks to individuals.
Keywords: e-mail spoofing, Maltego, OSINT, phishing, spear phishing, theHarvester
Procedia PDF Downloads 148
15183 Enhanced Retrieval-Augmented Generation (RAG) Method with Knowledge Graph and Graph Neural Network (GNN) for Automated QA Systems
Authors: Zhihao Zheng, Zhilin Wang, Linxin Liu
Abstract:
In research on automated knowledge question-answering systems, accuracy and efficiency are critical challenges. This paper proposes a knowledge graph-enhanced Retrieval-Augmented Generation (RAG) method, combined with a Graph Neural Network (GNN) structure, to automatically determine the correctness of knowledge competition questions. First, a domain-specific knowledge graph was constructed from a large corpus of academic journal literature, with key entities and relationships extracted using Natural Language Processing (NLP) techniques. Then, the RAG method's retrieval module was expanded to simultaneously query both text databases and the knowledge graph, leveraging the GNN to further extract structured information from the knowledge graph. During answer generation, contextual information provided by the knowledge graph and GNN is incorporated to improve the accuracy and consistency of the answers. Experimental results demonstrate that the knowledge graph and GNN-enhanced RAG method performs excellently in determining the correctness of questions, achieving an accuracy rate of 95%. Particularly in cases involving ambiguity or requiring contextual information, the structured knowledge provided by the knowledge graph and GNN significantly enhances the RAG method's performance. This approach not only demonstrates significant advantages in improving the accuracy and efficiency of automated knowledge question-answering systems but also offers new directions and ideas for future research and practical applications.
Keywords: knowledge graph, graph neural network, retrieval-augmented generation, NLP
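A toy sketch of the hybrid retrieval idea (query expansion through a knowledge graph before ranking text documents). All entities, documents and scores below are made up for illustration, and the actual method additionally runs a GNN over the graph rather than a one-hop lookup.

```python
# Hypothetical one-hop knowledge graph and document store:
KG = {
    "transformer": {"attention", "bert"},
    "bert": {"transformer", "nlp"},
}
DOCS = {
    "d1": "attention mechanisms in neural networks",
    "d2": "convolutional networks for images",
}

def retrieve(query_terms, top_k=1):
    """Expand the query with graph neighbours, then rank documents by term overlap."""
    expanded = set(query_terms)
    for term in query_terms:
        expanded |= KG.get(term, set())
    scores = {doc_id: len(expanded & set(text.split()))
              for doc_id, text in DOCS.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]
```

Here a query for "transformer" also retrieves the attention document, because the graph supplies the related entity even though the literal term never appears in the text.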
Procedia PDF Downloads 39
15182 Cost Effective Real-Time Image Processing Based Optical Mark Reader
Authors: Amit Kumar, Himanshu Singal, Arnav Bhavsar
Abstract:
In this modern era of automation, most academic and competitive exams are based on Multiple Choice Questions (MCQs). The responses to these MCQ-based exams are recorded on Optical Mark Reader (OMR) sheets. Evaluation of the OMR sheet requires separate specialized machines for scanning and marking. The sheets used by these machines are special and cost more than a normal sheet. The available process is uneconomical and dependent on paper thickness, scanning quality, paper orientation, special hardware and customized software. This study tries to tackle the problem of evaluating the OMR sheet without any special hardware, making the whole process economical. We propose an image processing based algorithm which can be used to read and evaluate scanned OMR sheets with no special hardware required. It eliminates the use of special OMR sheets; responses recorded on a normal sheet are enough for evaluation. The proposed system takes care of color, brightness, rotation, and small imperfections in the OMR sheet images.
Keywords: OMR, image processing, Hough circle transform, interpolation, detection, binary thresholding
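The core of such an evaluation, binarizing the scanned image and measuring how filled each bubble region is, can be sketched as follows. The grid geometry and threshold values are assumptions for illustration; the paper's full pipeline additionally handles rotation and bubble localization (e.g., via the Hough circle transform).

```python
def binarize(gray, threshold=128):
    """Dark pixels (pencil marks) become 1, light background becomes 0."""
    return [[1 if px < threshold else 0 for px in row] for row in gray]

def fill_ratio(binary, top, left, size):
    """Fraction of dark pixels inside a square bubble region."""
    filled = sum(binary[r][c]
                 for r in range(top, top + size)
                 for c in range(left, left + size))
    return filled / (size * size)

def is_marked(gray, top, left, size, min_fill=0.5):
    """A bubble counts as marked when at least min_fill of it is dark."""
    return fill_ratio(binarize(gray), top, left, size) >= min_fill
```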
Procedia PDF Downloads 173
15181 Performance Degradation for the GLR Test-Statistics for Spatial Signal Detection
Authors: Olesya Bolkhovskaya, Alexander Maltsev
Abstract:
Antenna arrays are widely used in modern radio systems, in sonar and in communications. The solution of the problem of detecting a useful signal against a background of noise is based on the GLRT method. There is a large number of problem variants, depending on the a priori information that is known. In this work, in contrast to the majority of already solved problems, only the difference in the spatial properties of the signal and noise is used for detection. We analyze the influence of the degree of non-coherence of the signal and of noise inhomogeneity on the performance characteristics of different GLRT statistics. The signal and noise are described by means of spatial covariance matrices C for different amounts of known information. The partially coherent signal is simulated as a plane wave with a random angle of incidence with respect to the normal. Background noise is simulated as a random process with a uniform distribution function in each element. The results of the investigation of the degradation of performance characteristics for the different cases are presented in this work.
Keywords: GLRT, Neyman-Pearson criterion, test statistics, degradation, spatial processing, multielement antenna array
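As a simple illustration of why spatial properties alone can separate signal from noise, consider the toy coherence statistic below (not one of the paper's GLRT statistics): a fully coherent, in-phase plane wave drives the ratio toward the number of array elements M, while spatially white noise keeps it near or below 1.

```python
def coherence_statistic(snapshots):
    """snapshots: list of length-M complex sensor vectors (one per time sample).
    Ratio of coherently summed power to total per-element power."""
    num = sum(abs(sum(x)) ** 2 for x in snapshots)
    den = sum(sum(abs(v) ** 2 for v in x) for x in snapshots)
    return num / den
```

A partially coherent signal, as simulated in the paper, would land between these two extremes, which is exactly what degrades detection performance.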
Procedia PDF Downloads 385
15180 Conceptualizing IoT Based Framework for Enhancing Environmental Accounting by ERP Systems
Authors: Amin Ebrahimi Ghadi, Morteza Moalagh
Abstract:
This research was carried out to find how a suitable combination of an IoT (Internet of Things) architecture and an ERP system can strengthen environmental accounting to incorporate both economic and environmental information. IoT devices (e.g., sensors, software, and other technologies) can be used across the company’s value chain, from raw material extraction through materials processing, manufacturing, distribution, use, repair, maintenance, and disposal or recycling of products (the cradle-to-grave model). The desired ERP software will then have the capability to track both midpoint and endpoint environmental impacts in a green supply chain system over the whole life cycle of a product. All of this enables environmental accounting to calculate and analyze operational environmental impacts in real time, control costs, prepare for environmental legislation and enhance the decision-making process. In this study, we have developed a model of how to use IoT devices in life cycle assessment (LCA) to gather information on emissions, energy consumption, hazards, and wastes, to be processed in different modules of ERP systems in an integrated way for use in environmental accounting to achieve sustainability.
Keywords: ERP, environmental accounting, green supply chain, IoT, life cycle assessment, sustainability
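The cradle-to-grave data flow described above amounts to aggregating sensor readings by life-cycle stage before they reach the ERP modules; a minimal sketch, with entirely hypothetical stage names, metrics and values:

```python
# Hypothetical IoT readings: (lifecycle_stage, metric, value)
READINGS = [
    ("manufacturing", "co2_kg", 12.5),
    ("manufacturing", "energy_kwh", 40.0),
    ("distribution", "co2_kg", 3.2),
    ("use", "energy_kwh", 150.0),
]

def aggregate_by_stage(readings):
    """Sum each metric per life-cycle stage, ready for an ERP/LCA module."""
    totals = {}
    for stage, metric, value in readings:
        totals.setdefault(stage, {}).setdefault(metric, 0.0)
        totals[stage][metric] += value
    return totals
```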
Procedia PDF Downloads 172
15179 Mixotrophic Growth of Chlorella sp. on Raw Food Processing Industrial Wastewater: Effect of COD Tolerance
Authors: Suvidha Gupta, R. A. Pandey, Sanjay Pawar
Abstract:
The effluents from various food processing industries are characterized by high BOD, COD, suspended solids, nitrate, and phosphate. Mixotrophic growth of microalgae using food processing industrial wastewater as an organic carbon source has emerged as an effective and energy-efficient means of nutrient removal and COD reduction. The present study details the treatment of non-sterilized, unfiltered food processing industrial wastewater by microalgae for nutrient removal, and determines their tolerance to COD using different dilutions of the wastewater. In addition, the effect of different inoculum percentages of microalgae on the removal efficiency of nutrients has been studied for each dilution. To see the effect of dilution and COD tolerance, the wastewater, with an initial COD of 5000 mg/L (±5), nitrate of 28 mg/L (±10), and phosphate of 24 mg/L (±10), was diluted to obtain COD values of 3000 mg/L and 1000 mg/L. The experiments were carried out in 1 L conical flasks with intermittent aeration and different inoculum percentages, i.e. 10%, 20%, and 30%, of Chlorella sp. isolated from an area near NEERI, Nagpur. The experiments were conducted for 6 days with a 12:12 light-dark period, and various parameters such as COD, TOC, NO3--N, PO4--P, and total solids were determined on a daily basis. Results revealed that, for 10% and 20% inoculum, over 90% COD and TOC reduction was obtained with wastewater containing COD of 3000 mg/L, whereas over 80% COD and TOC reduction was obtained with wastewater containing COD of 1000 mg/L. Moreover, the microalgae were found to tolerate wastewater containing COD of 5000 mg/L, with over 60% and 80% reduction in COD and TOC, respectively. Similar results were obtained with 10% and 20% inoculum at all COD dilutions, whereas for 30% inoculum over 60% COD and 70% TOC reduction was obtained. In the case of nutrient removal, over 70% nitrate removal and 45% phosphate removal was obtained with 20% inoculum at all dilutions.
These results indicate that microalgae-assisted nutrient removal gives maximum COD and TOC reduction at 3000 mg/L COD with 20% inoculum. Hence, microalgae-assisted wastewater treatment is not only effective for the removal of nutrients but can also tolerate COD as high as 5000 mg/L along with the solid content.
Keywords: Chlorella sp., chemical oxygen demand, food processing industrial wastewater, mixotrophic growth
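The reduction percentages quoted above follow the usual removal-efficiency formula, sketched here for clarity (the example values are illustrative, not the study's measurements):

```python
def removal_efficiency(initial, final):
    """Percent reduction of a pollutant parameter such as COD or TOC."""
    return 100.0 * (initial - final) / initial
```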
Procedia PDF Downloads 332
15178 The Relationship among Lifestyles, Accompany Forms, and Children’s Capability to Solve Problems of Modern Families
Authors: Tien-Ling Yeh, Jo-Han Chan
Abstract:
The percentage of dual-earner couples has become higher each year, and family lifestyles in Taiwan have been changing accordingly. This fact reflects the importance of family communication and the parent-child relationship. This study aimed to explore the influences of family lifestyles and accompany forms on children’s capability to solve problems. The research process included two phases: (1) literature review, to explore the characteristics of children’s capability to solve problems and methods to measure this capability; and (2) questionnaire analyses, to explore the influences of the lifestyles and the accompany time and forms of modern families on their children’s capability to solve problems. The questionnaires were issued in October and November 2016. A total of 300 questionnaires were retrieved, among which 250 were valid. The findings are summarized below: -The linguistic performances of the children from families with the busy and haggling lifestyle or the intermittent childcare lifestyle were rather good. Besides being interested in learning, these children could solve problems or difficulties independently. -The capability to ‘analyze problems’ of children from families with accompanying time during 19:00-19:30 (family dinner time) or 22:00-23:30 (before bedtime) was good. When facing a complex problem, these children could identify the most important factor in the problem. When seeing a problem, they would first look for its cause. If they encountered a bottleneck while solving a problem, they would review the context of the problem and related conditions to come up with another solution. -According to the literature, learning toys with numbers and symbols and learning to read can help develop children’s logical thinking, which is helpful for solving problems. Interestingly, some studies suggested that children playing with fluid constructive toys are less likely to give up what they are doing and more likely to identify problems in their daily life.
Some of them can even come up with creative and effective solutions.
Keywords: accompany, lifestyle, parent-child, problem-solving
Procedia PDF Downloads 117
15177 Smartphone Application for Social Inclusion of Deaf Parents and Children About Sphincter Training
Authors: Júlia Alarcon Pinto, Carlos João Schaffhausser, Gustavo Alarcon Pinto
Abstract:
Introduction: Deaf people in Brazil communicate through the Brazilian Sign Language (LIBRAS), which is restricted to this minority and to people who have received training. However, there is a lack of professionals in the health system prepared to deal with these patients. Therefore, effective communication, health education, and the quality of support and assistance are compromised. It is of utmost importance to develop measures that ensure the inclusion of deaf parents and children, since there are frequent doubts about sphincter training and an absence of tools to promote effective communication between doctors and their patients. Objective: To use an efficient, rapid and cheap communication method to promote the social inclusion and patient education of deaf parents and children during pediatric appointments. Results: The application demonstrates how to express phrases and symptoms within seconds, which allows patients to fully understand the information provided during the appointment. They become capable of evaluating the signs of readiness, learning the correct approaches with the child and the adequate instruments, anticipating possible obstacles, and appreciating the importance of following medical orientations in order to achieve success in the process. Consequently, patients feel more satisfied, secure and embraced by professionals in the health care system. Conclusion: It is of utmost importance to use efficient and cheap methods that support patient care and education in order to promote health and social inclusion.
Keywords: application, deaf patients, social inclusion, sphincter training
Procedia PDF Downloads 119
15176 Reading High Rise Residential Development in Istanbul on the Theory of Globalization
Authors: Tuba Sari
Abstract:
One of the major transformations caused by the industrial revolution, technological developments and globalization is undoubtedly the acceleration of the urbanization process. Globalization, in particular, is one of the major factors that trigger this transformation. In this context, as a result of the global metropolitan city system, multifunctional high-rise building forms are becoming an undeniable fact of the world’s leading metropolises, as manifestations of prestige and power offering different life choices and easy access to services in the era of technology. The scope of the research covers five different urban centers in İstanbul where high-rise housing has been increasing dramatically since the 2000s. The research therefore concerns the multi-centered urban residential pattern being created by high-rise housing structures in the city. The methodology of the research is based on two main elements: one is the sampling method for high-rise housing projects in İstanbul, while the other is the model of semantics. In the framework of the research hypothesis, it is aimed to show that the character of vertical, intensive construction in Istanbul is based on the search for different forms and images of expressive quality, considering the production of existing high-rise buildings in residential areas in recent years. With respect to the rising discourse of the 'World City' in the globalizing world, it is very important to state the place of Istanbul among other developing world metropolises. In the perspective of the 'World City' discourse, Istanbul has different projects concerned with globalization, international finance companies, cultural activities, mega projects, etc. In brief, the aim of this research is to examine the transformation forms of high-rise housing development in Istanbul within the frame of developing world cities, and to search for and analyze the discourse and image related to these projects.
Keywords: globalization, high-rise, housing, image
Procedia PDF Downloads 284
15175 Tool Condition Monitoring of Ceramic Inserted Tools in High Speed Machining through Image Processing
Authors: Javier A. Dominguez Caballero, Graeme A. Manson, Matthew B. Marshall
Abstract:
Cutting tools with ceramic inserts are often used in the machining of many types of superalloy, mainly due to their high strength and thermal resistance. Nevertheless, during the cutting process, the plastic flow wear generated in these inserts enhances and propagates cracks due to high temperature and high mechanical stress. This leads to highly variable failure of the cutting tool. This article explores the relationship between the continuous wear that ceramic SiAlON (solid solutions based on the Si3N4 structure) inserts experience during a high-speed machining process and the evolution of the sparks created during the same process. These sparks were analysed through pictures of the cutting process recorded using an SLR camera. Features relating to the intensity and area of the cutting sparks were extracted from the individual pictures using image processing techniques. These features were then related to the ceramic insert’s crater wear area.
Keywords: ceramic cutting tools, high speed machining, image processing, tool condition monitoring, tool wear
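Extracting intensity and area features from a spark image reduces, in its simplest form, to thresholding bright pixels; a minimal sketch follows (the threshold and the tiny example image are assumptions, not the authors' settings):

```python
def spark_features(gray, threshold=200):
    """Return (area, mean_intensity) of pixels at or above the brightness threshold."""
    bright = [px for row in gray for px in row if px >= threshold]
    if not bright:
        return 0, 0.0
    return len(bright), sum(bright) / len(bright)
```

Tracking these two numbers frame by frame gives the spark-evolution signal that can then be correlated with the measured crater wear area.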
Procedia PDF Downloads 298
15174 Optimizing Machine Learning Through Python Based Image Processing Techniques
Authors: Srinidhi. A, Naveed Ahmed, Twinkle Hareendran, Vriksha Prakash
Abstract:
This work reviews some of the advanced image processing techniques used in deep learning applications. Object detection by template matching, image denoising, edge detection, and super-resolution modelling are but a few of the tasks covered. The paper examines them in detail, given that such tasks are crucial preprocessing steps that increase the quality and usability of image datasets in subsequent deep learning tasks. We review methods for the assessment of image quality, more specifically sharpness, which is crucial to ensure robust model performance. Further, we discuss the development of deep learning models for facial emotion detection, age classification, and gender classification, showing how the preprocessing techniques are interrelated with model performance. Conclusions from this study pinpoint the best practices in the preparation of image datasets, targeting the best trade-off between computational efficiency and the retention of important image features critical for the effective training of deep learning models.
Keywords: image processing, machine learning applications, template matching, emotion detection
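Object detection by template matching, the first technique listed, can be sketched with an exhaustive sum-of-squared-differences search (a minimal pure-Python illustration; production code would use an optimized library routine):

```python
def match_template(image, template):
    """Return the (row, col) offset where the template best matches the image,
    using the sum of squared differences as the (lower-is-better) score."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = float("inf"), (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = sum((image[r + i][c + j] - template[i][j]) ** 2
                        for i in range(th) for j in range(tw))
            if score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos
```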
Procedia PDF Downloads 13
15173 The Impact of the Information Technologies on the Accounting Department of the Romanian Companies
Authors: Dumitru Valentin Florentin
Abstract:
The need to process high volumes of data and intense competition are just two reasons that make the use of information technologies necessary. The objective of our research is to establish the impact of information technologies on the accounting departments of Romanian companies. To achieve it, starting from a literature review, we conducted empirical research based on a questionnaire. We investigated the types of technologies used, the reasons that led to the implementation of certain technologies, the benefits brought by the use of information technologies, the difficulties raised by the implementation, and the future effects of the applications. The conclusions show an evolution in the degree of implementation of information technologies in Romanian companies, compared with the results of other studies conducted a few years earlier.
Keywords: information technologies, impact, company, Romania, empirical study
Procedia PDF Downloads 424
15172 High-Temperature Behavior of Boiler Steel by Friction Stir Processing
Authors: Supreet Singh, Manpreet Kaur, Manoj Kumar
Abstract:
High-temperature corrosion is a severe material degradation mechanism experienced in thermal power plants and other energy generation sectors. Metallic materials such as ferritic steels offer easy fabrication and machinability at low cost, but a serious drawback of these materials is the deterioration of properties arising from interaction with the environment. These metallic materials do not endure high temperatures for extended periods because of their poor corrosion resistance. Friction Stir Processing (FSP) has emerged as a potent means of surface modification and microstructure control in the thermo-mechanically affected zones of various metal alloys. In the current research work, FSP was performed on boiler tube of SA 210 Grade A1 material, which is regularly used in thermal power plants. The aim was to strengthen SA 210 Grade A1 boiler steel through microstructural refinement by FSP and to analyze its effect on high-temperature corrosion behavior. The high-temperature corrosion performance of the unprocessed and FSPed specimens was evaluated in the laboratory using a molten salt environment of Na₂SO₄-82%Fe₂(SO₄)₃. The unprocessed and FSPed low carbon steel Gr A1 were evaluated in terms of microstructure, corrosion resistance, and mechanical properties such as hardness and tensile strength. In-depth characterization was done by EBSD, SEM/EDS and X-ray mapping analyses, with the aim of proposing the mechanism behind the high-temperature corrosion behavior of the FSPed steel.
Keywords: boiler steel, characterization, corrosion, EBSD/SEM/EDS/XRD, friction stir processing
Procedia PDF Downloads 237
15171 Reduction of Residual Stress by Variothermal Processing and Validation via Birefringence Measurement Technique on Injection Molded Polycarbonate Samples
Authors: Christoph Lohr, Hanna Wund, Peter Elsner, Kay André Weidenmann
Abstract:
Injection molding is one of the most commonly used techniques in industrial polymer processing. In the conventional injection molding process, the liquid polymer is injected into the cavity of the mold, where the polymer immediately starts hardening at the cooled walls. To compensate for the shrinkage, caused predominantly by this immediate cooling, holding pressure is applied. Throughout this process, residual stresses are produced by the temperature difference between the polymer melt and the injection mold and by the relocation of the polymer chains, which were oriented by the high process pressures and injection speeds. These residual stresses often weaken or change the structural behavior of the parts or lead to deformation of components. One solution to reduce residual stresses is variothermal processing. Here, the mold is heated, i.e., near or above the glass transition temperature of the polymer, the polymer is injected, and before opening the mold and ejecting the part, the mold is cooled. For the next cycle, the mold is heated again and the procedure repeats. The rapid heating and cooling of the mold are realized indirectly by convection of heated and cooled liquid (here: water) pumped through fluid channels underneath the mold surface. In this paper, the influences of variothermal processing on residual stresses are analyzed with samples at a larger scale (500 mm x 250 mm x 4 mm). In addition, the influence of functional elements, such as abrupt changes in wall thickness, bosses, and ribs, on the residual stress is examined. To this end, polycarbonate samples are produced by variothermal and isothermal processing. The melt is injected into a heated mold, which in our case has a temperature varying between 70 °C and 160 °C. After the filling of the cavity, the closed mold is cooled down to between 70 °C and 100 °C. The pressure and temperature inside the mold are monitored and evaluated with cavity sensors.
The residual stresses of the produced samples are visualized by birefringence, which exploits the effect of stress on the refractive index of the polymer. The colorful spectrum can be revealed by placing the sample between a polarized light source and a second polarization filter. To show the effect of the processing on the reduction of residual stress, the birefringence images of the isothermally and variothermally produced samples are compared and evaluated. In this comparison, the variothermally produced samples show fewer maxima of each color spectrum than the isothermally produced samples, which indicates that the residual stress of the variothermally produced samples is lower.
Keywords: birefringence, injection molding, polycarbonate, residual stress, variothermal processing
Procedia PDF Downloads 283
15170 Financial Burden of Occupational Slip and Fall Incidences in Taiwan
Authors: Kai Way Li, Lang Gan
Abstract:
Slips and falls are common in Taiwan. They can result in injuries and even fatalities. Official statistics indicate that more than 15% of all occupational incidents were slip/fall related. All workers in Taiwan are required by law to join the workers’ insurance program administered by the Bureau of Labor Insurance (BLI), a government agency under the supervision of the Ministry of Labor. Workers file claims with the BLI for insurance compensation when they suffer fatalities or injuries at work. Injury statistics based on workers’ compensation claims have rarely been studied. The objective of this study was to quantify the injury statistics and financial cost of slip-fall incidents based on BLI compensation records. Compensation records in the BLI from 2007 to 2013 were retrieved. All the original application forms, approval opinions, and results for workers’ compensation were in hardcopy and stored in the BLI warehouses. Photocopies of the claims, excluding the personal information of the applicants (or of the victim, if deceased), were obtained. The content of the filing forms was coded in an Excel worksheet for further analyses, and descriptive statistics were performed on the data. There were a total of 35,024 claims, including 82 deaths, 878 disabilities, and 34,064 injuries/illnesses that were slip/fall related. The average loss for the death cases was 40 months. The total amount paid for these cases was 86,913,195 NTD. For the disability cases, the average loss was 367.36 days; the total amount paid was almost 2.6 times that of the death cases (233,324,004 NTD). For the injury/illness cases, the average loss was 58.78 days; the total amount paid was approximately 13 times that of the death cases (1,134,850,821 NTD). Of the applicants/victims, 52.3% were male.
There were more males than females among the death, disability, and injury/illness cases. Most (57.8%) of the female victims were between 45 and 59 years old, whereas most of the male victims (62.6%) were between 25 and 39 years old. Most of the victims were in the manufacturing industry (26.41%), followed by the construction industry (22.20%) and the retail industry (13.69%). For the fatality cases, head injury was the main cause of immediate or eventual death (74.4%). For the disability cases, foot (17.46%) and knee (9.05%) injuries were the leading problems. The compensation claims other than fatality and disability were mainly associated with injuries of the foot (18%), hand (12.87%), knee (10.42%), back (8.83%), and shoulder (6.77%). The slip/fall cases studied indicate that the ratios among the death, disability, and injury/illness counts were 1:10:415, and the ratios of the amounts paid by the BLI for the three categories were 1:2.6:13. These results indicate the significance of slip-fall incidents across different levels of severity, and such information should be incorporated into slip-fall prevention programs in industry.
Keywords: epidemiology, slip and fall, social burden, workers’ compensation
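The severity ratios quoted in the abstract follow directly from the claim counts and payout totals it reports; a quick arithmetic check (all figures taken from the abstract, with the published ratios being rounded):

```python
# Claim counts and total NTD paid per severity category, as reported
deaths, disabilities, injuries = 82, 878, 34_064
paid_death, paid_disability, paid_injury = 86_913_195, 233_324_004, 1_134_850_821

count_ratios = (1.0, disabilities / deaths, injuries / deaths)               # ~1 : 10.7 : 415
cost_ratios = (1.0, paid_disability / paid_death, paid_injury / paid_death)  # ~1 : 2.7 : 13
```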
Procedia PDF Downloads 323
15169 Violence in the School Environment: When the Teenager Encounters the Threat of Depression
Authors: Ndje Ndje Mireille
Abstract:
For some years in Cameroon, there has been an increase in violence in schools. This violence has escalated from verbal to physical, sometimes going as far as murder. At the center of this violence, we find the student, a teenager in the midst of both physical and psychological changes. The unpredictable transformations of his body, the unexpected emotions aroused when he encounters someone else, intrusion, shortcomings, boredom, loneliness, and self-deception are the threats the teenager faces daily. From the psychopathological point of view, the greatest threat in adolescence is probably the depressive threat. During adolescence, and for several reasons, the subject is confronted with his self-image. He displays certainty that sometimes hides great uncertainty about what leads him to manifest particular behaviors or undertake certain actions. Faced with aggressiveness towards those he confronts, he feels more or less guilt. This can lead a certain number of adolescents to feel helpless before others, and before life. This helplessness is sometimes reinforced by the social, cultural, and economic context in which they live. The teenager then feels threatened by this depression which, at its extreme, manifests as the feeling that he can no longer do anything. Generally, the depressive threat manifests itself in defensive forms against depression itself; this is why it is indeed a threat and not a threshold already crossed. This threat often manifests itself in inappropriate forms of attack on one’s own body, as seen in a number of repetitive risky behaviors. We also see teenagers confront peers and even adults through physical attacks, often going as far as murder. All these behaviors appear as an absurd way of attacking, and at the same time confronting, the feeling of remaining alive.
This depressive threat can also be expressed in the form of attacks on an individual’s thinking abilities or, more explicitly, in the form of academic failure. The depressive threat does not sum up all the problems of adolescence, but it undoubtedly represents, at present, one of the deepest forms of unease adolescents face.
Keywords: violence, school, depression threats, adolescent, behavior
Procedia PDF Downloads 82
15168 Understanding the Heart of the Matter: A Pedagogical Framework for Apprehending Successful Second Language Development
Authors: Cinthya Olivares Garita
Abstract:
Untangling language processing in second language development has been either a taken-for-granted, overlooked task for some English language teaching (ELT) instructors or a considerable feat for others. From the most traditional language instruction to the most communicative methodologies, how to assist L2 learners in processing language in the classroom has become a challenging matter in second language teaching. Amidst an ample array of methods, strategies, and techniques for teaching a target language, finding a suitable model to lead learners to process, interpret, and negotiate meaning to communicate in a second language has imposed a great responsibility on language teachers; committed teachers are those who are aware of their role in equipping learners with the appropriate tools to communicate in the target language in a 21st-century society. Unfortunately, one might find some English language teachers convinced that their job is only to lecture students; others are advocates of textbook-based instruction that might hinder second language processing, and just a few might courageously struggle to facilitate second language learning effectively. Grounded in the most representative empirical studies on comprehensible input, processing instruction, and focus on form, this analysis aims to facilitate the understanding of how second language learners process and automatize input and to propose a pedagogical framework for the successful development of a second language. In light of this, the paper is structured to tackle noticing and attention and structured input as the heart of processing instruction, comprehensible input as the missing link in second language learning, and form-meaning connections as opposed to traditional grammar approaches to language teaching.
The author finishes by suggesting a pedagogical framework involving noticing, attention, comprehensible input, and form (NACIF, from the acronym) to support ELT instructors, teachers, and scholars in the challenging task of facilitating effective second language development.
Keywords: second language development, pedagogical framework, noticing, attention, comprehensible input, form
Procedia PDF Downloads 28
15167 Problems concerning Legal Regulation of Electronic Governance in Georgia
Authors: Giga Phartenadze
Abstract:
The legal framework regulating electronic governance comprises norms that include measures for improving the functions of public institutions and a complex of actions for raising their standards, such as websites of public institutions, online services, certain forms of internet interaction, and a higher level of internet services. An important legal basis for electronic governance in Georgia is the Georgian Law on Electronic Communications, which defines the legal and economic basis for utilizing electronic communication systems in Georgia. As for a single legal basis for e-governance regulation, it can be said that it does not exist at all. The official websites of public institutions have no standards for the proactive disclosure of information. At the same time, there is no common legal norm that would require all public institutions to maintain an official website for public relations, accountability, publicity, and raising information quality. Electronic governance in Georgia needs comprehensive legal regulation. Public administration in electronic form is at an initial stage of development. The currently existing legal basis is of low quality for public institutions and officials as well as for citizens and business. Services for e-involvement and e-consultation are also of low quality. So far, there is no established legal framework for e-governance. Therefore, a single legislative system for e-governance should be created, which will help develop effective, comprehensive, and multi-component electronic systems in the country (at the central, regional, and local levels). Such a comprehensive legal framework will provide the relevant technological, institutional, and informational conditions.
Keywords: law, e-government, public administration, Georgia
Procedia PDF Downloads 323
15166 Adapting to College: Exploration of Psychological Well-Being, Coping, and Identity as Markers of Readiness
Authors: Marit D. Murry, Amy K. Marks
Abstract:
The transition to college is a critical period that affords abundant opportunities for growth in conjunction with novel challenges for emerging adults. During this time, emerging adults garner experiences and acquire hosts of new information that they must synthesize and use to inform life-shaping decisions. This stage is characterized by instability and exploration, which necessitates a diverse set of coping skills to successfully navigate and positively adapt to an evolving environment. However, important sociocultural factors produce developmental differences for minority emerging adults (i.e., emerging adults with an identity that has been or is marginalized). While the transition to college holds vast potential, not all are afforded the same chances, and many individuals enter this stage at varying degrees of readiness. Understanding the nuance and diversity of student preparedness for college and contextualizing these factors will better equip systems to support incoming students. Emerging adulthood for ethnic-racial minority students presents an opportunity for growth and resiliency in the face of systemic adversity. Ethnic-racial identity (ERI) is defined as an identity that develops as a function of one’s ethnic-racial group membership. Research continues to demonstrate that ERI is a resilience factor promoting positive adjustment in young adulthood. Adaptive coping responses (e.g., engaging in help-seeking behavior, drawing on personal and community resources) have been identified as possible mechanisms through which ERI buffers youth against stressful life events, including discrimination. Additionally, trait mindfulness has been identified as a significant predictor of general psychological health, and mindfulness practice has been shown to be a self-regulatory strategy that promotes healthy stress responses and adaptive coping strategy selection.
The current study employed a person-centered approach to explore emerging patterns across ethnic identity development and psychological well-being criterion variables among college freshmen. Data from 283 incoming freshmen at Northeastern University were analyzed. The Brief COPE Acceptance and Emotional Support scales, the Five Facet Mindfulness Questionnaire, and the MEIM Exploration and Affirmation measures were used to inform the cluster profiles. The TwoStep auto-clustering algorithm revealed an optimal three-cluster solution (BIC = 848.49), which classified 92.6% (n = 262) of participants in the sample into one of three clusters, characterized as ‘Mixed Adjustment’, ‘Lowest Adjustment’, and ‘Moderate Adjustment’. Cluster composition varied significantly by ethnicity, X² (2, N = 262) = 7.74 (p = .021), and by gender, X² (2, N = 259) = 10.40 (p = .034). The ‘Lowest Adjustment’ cluster contained the highest proportion of students of color, 41% (n = 32), and of male-identifying students, 44.2% (n = 34). Follow-up analyses showed higher ERI exploration among ‘Moderate Adjustment’ cluster members, who also reported higher levels of psychological distress, with significantly elevated depression scores (p = .011) and more psychological diagnoses of depression (p = .013), anxiety (p = .005), and psychiatric disorders (p = .025). Supporting prior research, students engaging in identity exploration processes often endure more psychological distress. These results indicate that students undergoing identity development may require more socialization and different services beyond the usual strategies.
Keywords: adjustment, coping, college, emerging adulthood, ethnic-racial identity, psychological well-being, resilience
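The TwoStep auto-clustering used here is an SPSS procedure; a comparable BIC-driven choice of cluster count can be sketched with scikit-learn Gaussian mixtures. The five-variable synthetic data below is a placeholder for the standardized coping, mindfulness, and ERI measures, not the study's data:

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Placeholder for five standardized adjustment measures from n = 283 students
X = StandardScaler().fit_transform(rng.normal(size=(283, 5)))

# Fit 1..6 components and keep the count that minimizes BIC, mirroring the
# information-criterion selection reported in the abstract
bic = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
       for k in range(1, 7)}
best_k = min(bic, key=bic.get)
labels = GaussianMixture(n_components=best_k, random_state=0).fit(X).predict(X)
```

On real data, the resulting labels would then be crossed with ethnicity and gender, as in the chi-square tests above.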
Procedia PDF Downloads 110
15165 Augmented Reality in Advertising and Brand Communication: An Experimental Study
Authors: O. Mauroner, L. Le, S. Best
Abstract:
Digital technologies offer many opportunities in the design and implementation of brand communication and advertising. Augmented reality (AR) is an innovative technology in marketing communication built on the fact that virtual interaction with a product ad offers additional value to consumers. AR enables consumers to obtain (almost) real product experiences by way of virtual information even before the purchase of a certain product. The aim of AR applications in advertising is the in-depth examination of product characteristics to enhance product knowledge as well as brand knowledge. The interactive design of advertising provides observers with an intense examination of a specific advertising message and therefore leads to better brand knowledge. The elaboration likelihood model and the central route to persuasion strongly support this argumentation. Nevertheless, AR in brand communication is still at an initial stage, and scientific findings about the impact of AR on information processing and brand attitude are therefore rare. The aim of this paper is to empirically investigate the potential of AR applications in combination with traditional print advertising. To that effect, an experimental design with different levels of interactivity is built to measure the impact of an ad's interactivity on different variables of advertising effectiveness.
Keywords: advertising effectiveness, augmented reality, brand communication, brand recall
Procedia PDF Downloads 302
15164 Image Processing of Scanning Electron Microscope Micrograph of Ferrite and Pearlite Steel for Recognition of Micro-Constituents
Authors: Subir Gupta, Subhas Ganguly
Abstract:
In this paper, we demonstrate a new area of application of image processing to metallurgical images, creating more opportunities for structure-property-correlation-based approaches to alloy design. The present exercise focuses on the development of image processing tools suitable for phase segmentation, grain boundary detection, and recognition of micro-constituents in SEM micrographs of ferrite and pearlite steels. A comprehensive set of micrographs has been experimentally developed, encompassing variation in ferrite and pearlite volume fractions, with images taken at different magnifications (500X, 1000X, 1500X, 2000X, 3000X and 5000X) under a scanning electron microscope. The variation in volume fraction was achieved using four plain carbon steels containing 0.1, 0.22, 0.35, and 0.48 wt% C, heat treated under annealing and normalizing treatments. The obtained pool of micrographs was arbitrarily divided into two parts to develop training and testing sets. The statistical recognition features for the ferrite and pearlite constituents were developed by learning from the training set of micrographs, and the obtained features for microstructure pattern recognition were then applied to the test set. Analysis of the results shows that the developed strategy can successfully detect the micro-constituents across the wide range of magnifications and volume fractions of the constituents in the structure, with an accuracy of about +/- 5%.
Keywords: SEM micrograph, metallurgical image processing, ferrite pearlite steel, microstructure
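The two core preprocessing steps named above — phase segmentation and boundary detection — can be sketched in a few lines. The global midpoint threshold, the gradient-magnitude edge test, and the synthetic "micrograph" are simplifying assumptions; a practical pipeline would use Otsu thresholding and morphological cleanup:

```python
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((128, 128))      # synthetic micrograph: noisy "ferrite" background
img[32:96, 32:96] += 1.0          # with one brighter "pearlite" colony

# Phase segmentation: global threshold at the midpoint of the intensity range
thresh = 0.5 * (img.min() + img.max())
phase_mask = img > thresh

# Boundary detection: high gradient magnitude marks phase/grain boundaries
gy, gx = np.gradient(img)
boundaries = np.hypot(gx, gy) > 0.5

pearlite_fraction = phase_mask.mean()   # area fraction of the segmented constituent
```

The segmented area fraction is exactly the kind of quantity that the paper correlates with the known ferrite/pearlite volume fractions.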
Procedia PDF Downloads 199
15163 Visual Analytics of Higher Order Information for Trajectory Datasets
Authors: Ye Wang, Ickjai Lee
Abstract:
Due to the widespread use of mobile sensing, there is a strong need to handle trails of moving objects, i.e., trajectories. This paper proposes three visual analytic approaches for higher order information of trajectory datasets based on the higher order Voronoi diagram data structure. The proposed approaches reveal geometrical, topological, and directional information. Experimental results demonstrate the applicability and usefulness of the three approaches.
Keywords: visual analytics, higher order information, trajectory datasets, spatio-temporal data
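As a concrete illustration of the underlying data structure: a query point's order-k Voronoi cell is identified by the *set* of its k nearest generator points, not by a single nearest one. A brute-force membership test, on illustrative data rather than the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(2)
sites = rng.random((10, 2))   # generator points, e.g., sampled trajectory positions

def order_k_cell(q, sites, k=2):
    """Index set of the k nearest sites: the label of q's order-k Voronoi region."""
    d = np.linalg.norm(sites - q, axis=1)
    return frozenset(np.argsort(d)[:k].tolist())

# All query points sharing the same 2-nearest-site set lie in the same order-2 cell
cell = order_k_cell(np.array([0.5, 0.5]), sites, k=2)
```

Full higher order Voronoi construction computes these cells geometrically rather than per query point, but the labeling principle is the same.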
Procedia PDF Downloads 402
15162 Development of Fake News Model Using Machine Learning through Natural Language Processing
Authors: Sajjad Ahmed, Knut Hinkelmann, Flavio Corradini
Abstract:
Fake news detection research is still at an early stage, as this is a relatively new phenomenon attracting societal interest. Machine learning helps to solve complex problems and to build AI systems nowadays, especially in cases where we have tacit knowledge or knowledge that is not explicit. We used machine learning algorithms for the identification of fake news, applying three classifiers: Passive Aggressive, Naïve Bayes, and Support Vector Machine. Simple classification alone is not sufficient for fake news detection because generic classification methods are not specialized for fake news. With the integration of machine learning and text-based processing, we can detect fake news and build classifiers that can classify news data. Text classification mainly focuses on extracting various features of text and then incorporating those features into the classification. The big challenge in this area is the lack of an efficient way to differentiate between fake and non-fake news due to the unavailability of corpora. We applied the three machine learning classifiers to two publicly available datasets. Experimental analysis based on the existing datasets indicates very encouraging and improved performance.
Keywords: fake news detection, natural language processing, machine learning, classification techniques
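A minimal sketch of the setup described, using one of the three classifiers (Passive Aggressive) on TF-IDF text features; the toy texts and labels below are stand-ins, since the two public datasets used by the authors are not reproduced here:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import PassiveAggressiveClassifier
from sklearn.pipeline import make_pipeline

# Toy corpus: 0 = real, 1 = fake (illustrative labels only)
texts = ["city council approves new transit budget",
         "miracle pill cures every disease overnight",
         "university publishes annual enrollment report",
         "secret trick doctors don't want you to know"]
labels = [0, 1, 0, 1]

# TF-IDF feature extraction feeding an online linear classifier
clf = make_pipeline(TfidfVectorizer(),
                    PassiveAggressiveClassifier(max_iter=1000, random_state=0))
clf.fit(texts, labels)
pred = clf.predict(["miracle trick cures overnight"])
```

Swapping in `MultinomialNB` or `LinearSVC` in the same pipeline reproduces the other two classifiers the abstract compares.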
Procedia PDF Downloads 167