Search results for: Digital Image Processing
1661 Evaluation Techniques of Photography in Visual Communications in Iran
Authors: Firouzeh Keshavarzi
Abstract:
A photograph can in itself be a graphic work, but in the field of graphic design images are usually produced to serve an advertising idea, and the photographer helps to realize the designer's concept using his own knowledge and skills. Clearly, knowledge of photography offers the designer the means to reach a higher level of quality. At the same time, a graphic designer does not have to be a skilled photographer; he may delegate the execution of his idea to an expert photographer. The technologies and methods of photography can be applied in all fields of graphic art, but in Iran they are applied mostly in works such as packaging, posters, billboards, advertising, brochures and catalogues. In this study, we review how images and photographic techniques should be used in graphic design and what impact photography has had on Iranian graphic art. Photographic techniques and procedures can help a design advance its graphic goals; technique cannot determine the idea, but thinking about design and photography allows creativity to flourish, with photography as an effective tool in the graphic designer's mind. Computer software greatly promotes the creativity of the graphic designer, but it too is only a tool. Images are used in various fields, especially the graphic arts, not only for documentation but also for their beauty, to the point that photography has its own style in today's graphics. Graphic works try to affect their audience, and the photograph is therefore an important factor in attracting attention. On the other hand, since people grasp and understand images far more readily than words, images allow large messages and concepts to be delivered in the shortest time. Posters, advertisements, brochures, catalogues and the packaging of highly diverse agricultural, industrial and food products cannot do without images. Today, the use of images counts for a great deal in graphic design, and photographs play a major role in enriching it.
Keywords: Photo, photography techniques, contacts, graphic designer, visual communications, Iran.
1660 Normalizing Flow to Augmented Posterior: Conditional Density Estimation with Interpretable Dimension Reduction for High Dimensional Data
Authors: Cheng Zeng, George Michailidis, Hitoshi Iyatomi, Leo L Duan
Abstract:
The conditional density characterizes the distribution of a response variable y given a predictor x, and plays a key role in many statistical tasks, including classification and outlier detection. Although there has been abundant work on the problem of Conditional Density Estimation (CDE) for a low-dimensional response in the presence of a high-dimensional predictor, little work has been done for a high-dimensional response such as images. The promising performance of normalizing flow (NF) neural networks in unconditional density estimation acts as a motivating starting point. In this work, we extend NF neural networks to the case where an external x is present. Specifically, we use the NF to parameterize a one-to-one transform between a high-dimensional y and a latent z that comprises two components [zP, zN]. The zP component is a low-dimensional subvector obtained from the posterior distribution of an elementary predictive model for x, such as logistic/linear regression. The zN component is a high-dimensional independent Gaussian vector, which captures the variation in y that is unrelated or only weakly related to x. Unlike existing CDE methods, the proposed approach, coined Augmented Posterior CDE (AP-CDE), only requires a simple modification of the common normalizing flow framework, while significantly improving the interpretation of the latent component, since zP represents a supervised dimension reduction. In image analytics applications, AP-CDE shows good separation of the x-related variation, due to factors such as lighting condition and subject identity, from the other random variation. Further, the experiments show that an unconditional NF neural network, based on an unsupervised model of z such as a Gaussian mixture, fails to generate interpretable results.
Keywords: Conditional density estimation, image generation, normalizing flow, supervised dimension reduction.
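A minimal sketch of the latent augmentation idea described above (not the authors' code; the data, dimensions and the logistic model are invented, and the trained normalizing flow between y and z is replaced here by a placeholder construction):

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, d_y = 500, 16                                  # hypothetical sample size and response dimension
x = rng.integers(0, 2, size=n)                    # toy binary predictor (e.g. lighting / subject id)
y = rng.normal(size=(n, d_y)) + x[:, None] * 2.0  # toy high-dimensional response

# zP: supervised, low-dimensional component from an elementary predictive model linking y and x
clf = LogisticRegression().fit(y, x)
zP = clf.predict_proba(y)[:, 1:]                  # posterior probability used as a 1-D zP

# zN: independent Gaussian component absorbing the variation unrelated to x
zN = rng.normal(size=(n, d_y - zP.shape[1]))

z = np.hstack([zP, zN])                           # augmented latent z = [zP, zN]
print(z.shape)                                    # (500, 16)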
1659 Application of Particle Image Velocimetry in the Analysis of Scale Effects in Granular Soil
Authors: Zuhair Kadhim Jahanger, S. Joseph Antony
Abstract:
Studies in the literature that systematically deal with the scale effects of strip footings on different sand packings remain scarce. In this research, the variation of the ultimate bearing capacity and of the deformation pattern of the soil beneath strip footings of different widths, under plane-strain conditions on the surface of loose, medium-dense and dense sand, has been systematically studied using experimental and noninvasive methods for measuring microscopic deformations. The presented analyses are based on model-scale compression tests analysed using the Particle Image Velocimetry (PIV) technique. The upper bound analysis of the current study shows that the maximum vertical displacement of the sand under the ultimate load increases with the width of the footing, but at a rate that decreases with the relative density of the sand, whereas the relative vertical displacement in the sand decreases with an increase in footing width. Good agreement is observed between the experimental results for the different footing widths and relative densities. The experimental analyses show that a pronounced scale effect exists for strip surface footings. The bearing capacity factors Nγ decrease rapidly up to footing widths B = 0.25 m, 0.35 m and 0.65 m for loose, medium-dense and dense sand respectively; beyond these widths there is no significant decrease in Nγ. Both the deformation modes of the soil and the ultimate bearing capacity values are affected by the footing width. The obtained results could be used to improve settlement calculations for foundations interacting with granular soil.
Keywords: PIV, granular mechanics, scale effect, upper bound analysis.
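As a worked illustration of how the bearing capacity factor Nγ can be back-calculated from a model test (an assumption based on the classical expression for a surface strip footing on cohesionless sand with no surcharge, qu = 0.5 γ B Nγ; the numbers below are invented, not the paper's data):

def n_gamma(qu_kpa, gamma_kn_m3, width_m):
    """Back-calculate N_gamma from qu = 0.5 * gamma * B * N_gamma."""
    return 2.0 * qu_kpa / (gamma_kn_m3 * width_m)

# hypothetical measured ultimate bearing pressure, unit weight and footing width
print(round(n_gamma(qu_kpa=250.0, gamma_kn_m3=16.5, width_m=0.25), 1))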
1658 Hot Workability of High Strength Low Alloy Steels
Authors: Seok Hong Min, Jung Ho Moon, Woo Young Jung, Tae Kwon Ha
Abstract:
The hot deformation behavior of high strength low alloy (HSLA) steels with different chemical compositions has been studied under hot working conditions, in the temperature range of 900 to 1100 ℃ and the strain rate range of 0.1 to 10 s-1, by performing a series of hot compression tests. The dynamic materials model has been employed to develop the processing maps, which show the variation of the efficiency of power dissipation with temperature and strain rate. Kumar's model has also been used to develop the instability map, which shows the variation of the instability for plastic deformation with temperature and strain rate. The efficiency of power dissipation increased with decreasing strain rate and increasing temperature in the steel with higher Cr and Ti content. A high efficiency of power dissipation of over 20% was obtained at a finite strain level of 0.1 under conditions of strain rates lower than 1 s-1 and temperatures higher than 1050 ℃. Plastic instability was expected in the regime of temperatures lower than 1000 ℃ and strain rates lower than 0.3 s-1. The steel with lower Cr and Ti contents showed a high efficiency of power dissipation at higher strain rates and lower temperatures.
Keywords: High strength low alloy steels, hot workability, dynamic materials model, processing maps.
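As a rough illustration of how such processing maps are built (a sketch based on the standard dynamic materials model relation, not on the paper's data; the flow-stress values below are invented), the strain-rate sensitivity m and the efficiency of power dissipation eta = 2m/(m+1) can be computed from flow stress measured at a fixed strain and temperature:

import numpy as np

# Hypothetical flow stress (MPa) at a fixed strain of 0.1 and one temperature,
# for strain rates 0.1, 1 and 10 s^-1 (illustrative values only).
strain_rates = np.array([0.1, 1.0, 10.0])
flow_stress  = np.array([95.0, 120.0, 155.0])

# Strain-rate sensitivity m = d(ln sigma) / d(ln strain_rate)
m = np.gradient(np.log(flow_stress), np.log(strain_rates))

# Efficiency of power dissipation in the dynamic materials model
eta = 2.0 * m / (m + 1.0)
print(np.round(eta * 100, 1))   # efficiency in percent at each strain rate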
1657 Performance of an Improved Fluidized System for Processing Green Tea
Authors: Nickson Kipng’etich Lang’at, Thomas Thoruwa, John Abraham, John Wanyoko
Abstract:
Green tea is made from the top two leaves and buds of the shrub Camellia sinensis, of the family Theaceae and the order Theales. The green tea leaves are picked and immediately sent to be dried or steamed to prevent fermentation. The fluid bed drying technique is a common method for drying green tea because of its ease of design and construction and its fluidization of fine tea particles. The major problems with this method are a significant loss of the chemical content and green appearance of the leaf, retention of a high moisture content in the leaves, and bed channeling and defluidization. The energy associated with the drying technology has been shown to be a vital factor in determining the quality of green tea. As part of the implementation, a prototype dryer was built that facilitated a sequence of operations involving steaming, cooling, pre-drying and final drying. The major findings of the project concern the quality characteristics of the tea leaves and the energy consumption during processing. The optimal design achieved a moisture content of 4.2 ± 0.84%. At the optimum drying temperature of 100 ºC, the specific energy consumption was 1697.8 kJ.kg-1 and the evaporation rate was 4.272 x 10-4 kg.m-2.s-1. The energy consumption of a fluidized system can be further reduced by focusing on energy-saving designs.
Keywords: Evaporation rate, fluid bed dryer, maceration, specific energy consumption.
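A small sketch of the two performance figures reported above, specific energy consumption and evaporation rate, computed from batch quantities (the input values are hypothetical and chosen only to illustrate the arithmetic):

def drying_performance(energy_kj, water_removed_kg, bed_area_m2, time_s):
    """Specific energy consumption (kJ per kg water) and evaporation rate (kg/m^2/s)."""
    sec = energy_kj / water_removed_kg
    evap_rate = water_removed_kg / (bed_area_m2 * time_s)
    return sec, evap_rate

# Hypothetical drying batch, numbers chosen only for demonstration.
sec, rate = drying_performance(energy_kj=3395.6, water_removed_kg=2.0,
                               bed_area_m2=0.5, time_s=9000)
print(round(sec, 1), f"{rate:.3e}")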
1656 A Study to Assess the Energy Saving Potential and Economic Analysis of an Agro Based Industry in Karnataka, India
Authors: Sangamesh G. Sakri, Akash N. Patil, Sadashivappa M. Kotli
Abstract:
Agro based industries in India are considered as the micro, small and medium enterprises (MSME). In India, MSMEs contribute approximately 8 percent of the country’s GDP, 42 percent of the manufacturing output and 40 percent of exports. The toor dal (scientific name Cajanus cajan, commonly known as yellow gram, pigeon pea) is the second largest pulse crop in India accounting for about 20% of total pulse production. The toor dal milling industry in India is one of the major agro-processing industries in the country. Most of the dal mills are concentrated in pulse producing areas, which are spread all over the country. In Karnataka state, Gulbarga is a district where toor dal is the main crop and is grown extensively. There are more than 500 dal mills in and around the Gulbarga district to process dal. However, the majority of these dal milling units use traditional methods of processing which are energy and capital intensive. There exists a huge energy saving potential in these mills. An energy audit is conducted on a dal mill in Gulbarga to understand the energy consumption pattern to assess the energy saving potential, and an economic analysis is conducted to identify energy conservation opportunities.
Keywords: Conservation, demand side management, load curve, toor dal.
1655 Emotions Triggered by Children’s Literature Images
Abstract:
The role of images/illustrations in communicating meanings and triggering emotions assumes an increasingly relevant role in contemporary texts, regardless of the age group for which they are intended or the nature of the texts that host them. It is no coincidence that children's books are full of illustrations and that the image/text ratio decreases as the age group grows. The vast majority of children's books can be considered multimodal texts containing text and images/illustrations that interact with each other to provide the young reader with a broader and more creative understanding of the book's narrative. This interaction is very diverse, ranging from images/illustrations that are not essential for understanding the storytelling to those that contribute significantly to the meaning of the story. Usually, these books are also read by adults, namely by parents, educators and teachers, who act as mediators between the book and the children, explaining aspects that are or seem to be too complex for the child's context. It should be noted that there are books labeled as children's books that are clearly intended for both children and adults. In this work, following a qualitative and interpretative methodology based on written productions, participant observation and field notes, we describe the perceptions of future teachers of the 1st cycle of basic education, attending a master’s degree at a Portuguese university, about the role of the image in literary and non-literary texts, namely in mathematical texts, and how these can constitute precious resources for emotional regulation and for the design of creative didactic situations. The analysis of the collected data allowed us to obtain evidence regarding the evolution of the participants' perception of the crucial role of images in children's literature, not only as an emotional regulator for young readers but also as a creative source for the design of meaningful didactic situations crossing other scientific areas beyond the mother tongue, namely mathematics.
Keywords: Children’s literature, emotions, multimodal texts, soft skills.
1654 Temporal Signal Processing by Inference Bayesian Approach for Detection of Abrupt Variation of Statistical Characteristics of Noisy Signals
Authors: Farhad Asadi, Hossein Sadati
Abstract:
In fields such as neuroscience, and especially in the cognitive modeling of mental processes, handling uncertainty in the temporal structure of a signal is vital. In this paper, Bayesian online inference for estimating the location of change points in a signal is constructed. The method separates the observed signal into independent series and studies the change and variation of the data regime locally, together with the related statistical characteristics. We give conditions on simulations of the method when the data characteristics of the signals vary, and provide empirical evidence of its performance. It is verified that the correlation between the series around the change-point location and characteristics such as the signal-to-noise ratio and the mean value of the signal are important factors in finding the proper location of the change point. One of the main contributions of this study is the description of how these statistical characteristics of the signal influence the detection of abrupt variation. Two different simulation structures are considered: in the first, one abrupt change with a variable position is placed in a temporal section of the signal; in the second, multiple variations are considered. Finally, the influence of the statistical characteristics on the estimated change-point location is explained in detail in the simulation results with different artificial signals.
Keywords: Time series, fluctuation in statistical characteristics, optimal learning.
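A minimal sketch of the single change-point case described above (not the authors' implementation; it assumes Gaussian noise with known variance and a uniform prior over the change-point location, and uses synthetic data):

import numpy as np

rng = np.random.default_rng(1)
signal = np.concatenate([rng.normal(0.0, 1.0, 120), rng.normal(1.5, 1.0, 80)])

def changepoint_posterior(x, sigma=1.0):
    """Posterior over a single mean-shift change point, uniform prior on its location."""
    n = len(x)
    log_post = np.full(n, -np.inf)
    for tau in range(5, n - 5):                  # avoid degenerate segments
        left, right = x[:tau], x[tau:]
        # profile log-likelihood with the segment means plugged in
        ll = -((left - left.mean()) ** 2).sum() / (2 * sigma ** 2) \
             - ((right - right.mean()) ** 2).sum() / (2 * sigma ** 2)
        log_post[tau] = ll
    log_post -= log_post.max()
    post = np.exp(log_post)
    return post / post.sum()

posterior = changepoint_posterior(signal)
print("most probable change point:", int(posterior.argmax()))   # near sample 120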
1653 Image Segmentation Using Suprathreshold Stochastic Resonance
Authors: Rajib Kumar Jha, P.K.Biswas, B.N.Chatterji
Abstract:
In this paper a new concept of the partial complement of a graph G is introduced, and using it a new graph parameter, called the completion number of a graph G and denoted by c(G), is defined. Some basic properties of the completion number are studied and upper bounds for the completion number of classes of graphs are obtained; the paper also includes a characterization.
Keywords: Completion Number, Maximum Independent subset, Partial complements, Partial self complementary.
1652 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach
Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh
Abstract:
This study presents a hybrid pre-processing approach along with a conceptual model to enhance the accuracy of river discharge prediction. To achieve this goal, the Ensemble Empirical Mode Decomposition algorithm (EEMD), the Discrete Wavelet Transform (DWT) and Mutual Information (MI) were employed as a hybrid pre-processing approach conjugated to a Least Square Support Vector Machine (LSSVM). A conceptual strategy, namely a multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein was capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI so that they could be employed in the multi-station model. In the proposed feature selection method, some useless sub-series were omitted to achieve better performance. The results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
Keywords: River stage-discharge process, LSSVM, discrete wavelet transform (DWT), ensemble empirical mode decomposition (EEMD), multi-station modeling.
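A rough sketch of the wavelet decomposition plus mutual-information feature selection step (EEMD is omitted, a stationary wavelet transform stands in for the DWT so that every sub-series keeps the original length, and the discharge series are synthetic):

import numpy as np
import pywt
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
upstream = np.sin(np.linspace(0, 20, 512)) + 0.3 * rng.normal(size=512)   # toy gauge series
target = np.roll(upstream, 3) + 0.2 * rng.normal(size=512)                # lagged downstream response

coeffs = pywt.swt(upstream, "db4", level=3)          # stationary WT: sub-series keep length 512
subseries = [c for pair in coeffs for c in pair]     # approximation and detail sub-series

X = np.column_stack(subseries)
mi = mutual_info_regression(X, target, random_state=0)
selected = np.argsort(mi)[::-1][:3]                  # keep the 3 most informative sub-series
print("selected sub-series indices:", selected, "MI:", np.round(mi[selected], 3))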
1651 Noninvasive Brain-Machine Interface to Control Both Mecha TE Robotic Hands Using Emotiv EEG Neuroheadset
Authors: Adrienne Kline, Jaydip Desai
Abstract:
Electroencephalography (EEG) is a noninvasive technique that registers signals originating from the firing of neurons in the brain. The Emotiv EEG Neuroheadset is a consumer product comprising 14 EEG channels and was used to record the reactions of the neurons within the brain to two forms of stimuli in 10 participants. These stimuli consisted of auditory and visual formats that provided the directions ‘right’ or ‘left.’ Participants were instructed to raise their right or left arm in accordance with the instruction given. A scenario in OpenViBE was generated to stimulate the participants while recording their data. In OpenViBE, the Graz Motor BCI Stimulator algorithm was configured to govern the duration and number of visual stimuli. Utilizing EEGLAB under the cross-platform MATLAB®, the electrodes most stimulated during the study were identified. Data outputs from EEGLAB were analyzed using IBM SPSS Statistics® Version 20. This aided in determining the electrodes to use in the development of a brain-machine interface (BMI) using real-time EEG signals from the Emotiv EEG Neuroheadset. Signal processing and feature extraction were accomplished via the Simulink® signal processing toolbox. An Arduino™ Duemilanove microcontroller was used to link the Emotiv EEG Neuroheadset and the right and left Mecha TE™ Hands.
Keywords: Brain-machine interface, EEGLAB, Emotiv EEG Neuroheadset, OpenViBE, Simulink.
1650 Colour Stability of Wild Cactus Pear Juice
Authors: Kgatla T.E, Howard S.S., Hiss D.C.
Abstract:
Prickly pear (Opuntia spp.) fruit has received renewed interest since it contains a betalain pigment with an attractive purple colour for juice production. Prickly pear juice was prepared by homogenizing the fruit and treating the pulp with 48 g of pectinase from Aspergillus niger. Titratable acidity was determined by diluting 10 ml of prickly pear juice with 90 ml of deionized water and titrating to pH 8.2 with 0.1 N NaOH. Brix was measured using a refractometer and the ascorbic acid content was assayed spectrophotometrically. Colour variation was determined colorimetrically (Hunter L.a.b.). Hunter L.a.b. analysis showed that the red-purple colour of prickly pear juice was affected by the juice treatments, as indicated by low colour-difference-meter lightness (CDML*), hue, CDMa* and CDMb* values. Non-treated prickly pear juice had a high CDML* of 3.9 compared with the treated juices (range 3.29 to 2.14). The CDML* decreased significantly (p<0.05) as the juice was preserved. Spectrophotometric colour analysis showed that browning was low in all treated prickly pear juice samples, as indicated by high values at 540 nm and low values at 476 nm (browning index). The brightness of the prickly pear juice was affected by acidification compared with the other juice treatments. This study presents evidence that processing has a positive effect on the colour quality attribute, which offers a clear advantage for the production of red-purple prickly pear juice.
Keywords: Colour, Hunter L.a.b, prickly pear juice, processing, physicochemical.
1649 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations hold structured and unstructured information in different formats, sources and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; at the OLAP processing level, however, these organizations present some deficiencies, partly because there is little interest in extracting knowledge from their data sources and partly because of the absence of the operational capabilities needed to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest for business intelligence, since they are the base repositories for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and they facilitate corporate decision making and research. This paper presents a structured, simple methodology inspired by agile development models such as Scrum, XP and AUP. It also draws on object-relational models, spatial data models, and the baseline of data modeling under UML and big data; in this way it seeks to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account the processes of information analysis, visualization and data mining, particularly for generating patterns and models derived from the structured fact objects.
Keywords: Data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse.
1648 Landcover Mapping Using Lidar Data and Aerial Image and Soil Fertility Degradation Assessment for Rice Production Area in Quezon, Nueva Ecija, Philippines
Authors: Eliza. E. Camaso, Guiller. B. Damian, Miguelito. F. Isip, Ronaldo T. Alberto
Abstract:
Land-cover maps are important for many scientific, ecological and land management purposes, and during the last decades a rapid decrease in soil fertility has been observed due to land use practices such as rice cultivation. High-precision land-cover maps, which are important for managing the local economy, are not yet available for the area. To assure accurate mapping of land cover and to provide this information, remote sensing is a very suitable tool for carrying out the task and for automatic land use and cover detection. The study not only provides high-precision land-cover maps but also estimates the rice production area that has undergone chemical degradation due to fertility decline. Land cover was delineated and classified into pre-defined classes to achieve proper detection features. After generation of the land-cover map of the area of intensive rice cultivation, a soil fertility degradation assessment of the rice production area was produced to assess the impact on soils used in agricultural production. Using simple spatial analysis functions in ArcGIS, the land-cover map of the Municipality of Quezon in Nueva Ecija, Philippines was overlaid on the fertility decline maps from the Land Degradation Assessment Philippines - Bureau of Soils and Water Management (LADA-Philippines-BSWM) to determine the area of rice crops where nitrogen, phosphorus, zinc and sulfur deficiencies were most likely induced by high dosages of urea and imbalanced N:P fertilization. The results show that 80.00% of the fallow area and 99.81% of the rice production area have a high soil fertility decline.
Keywords: Aerial image, land-cover, LiDAR, soil fertility degradation.
1647 Potential of Salvia sclarea L. for Phytoremediation of Soils Contaminated with Heavy Metals
Authors: Violina R. Angelova, Radka V. Ivanova, Givko M. Todorov, Krasimir I. Ivanov
Abstract:
A field study was conducted to evaluate the efficacy of Salvia sclarea L. for the phytoremediation of contaminated soils. The experiment was performed on agricultural fields contaminated by the Non-Ferrous-Metal Works near Plovdiv, Bulgaria. The content of heavy metals in different parts of Salvia sclarea L. (roots, stems, leaves and inflorescences) was determined by ICP. The essential oil of Salvia sclarea L. was obtained by steam distillation under laboratory conditions, analyzed for heavy metals, and its chemical composition was determined. Salvia sclarea L. is a plant that is tolerant to heavy metals and can be grown on contaminated soils. Based on the obtained results and using the most common criteria, Salvia sclarea L. can be classified as a Pb hyperaccumulator and a Cd and Zn accumulator; therefore, this plant has suitable potential for the phytoremediation of heavy metal contaminated soils. A favorable fact is that the heavy metals do not influence the development of Salvia sclarea L., nor the quality and quantity of the essential oil. For clary sage oil obtained from processing clary sage grown on highly contaminated soils, the key odour-determining ingredients meet the quality requirements of the European Pharmacopoeia and BS ISO 7609 regarding Bulgarian clary sage oil and/or have values close to the limits of these standards. The possibility of further industrial processing will make Salvia sclarea L. an economically interesting crop for farmers applying phytoextraction technology.
Keywords: Clary sage, heavy metals, phytoremediation, polluted soils.
1646 A Novel Impulse Detector for Filtering of Highly Corrupted Images
Authors: Umesh Ghanekar
Abstract:
As the performance of a filtering system depends upon the accuracy of its noise detection scheme, in this paper we present a new scheme for impulse noise detection based on two levels of decision. In this scheme, the first stage coarsely identifies the corrupted pixels and the second stage finally decides whether the pixel under consideration is really corrupted or not. The efficacy of the proposed filter has been confirmed by extensive simulations.
Keywords: Impulse detection, noise removal, image filtering.
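A hedged sketch of a two-level impulse detector in the spirit of the scheme above (the abstract does not give the actual decision rules, so the window sizes and thresholds below are illustrative):

import numpy as np
from scipy.ndimage import median_filter

def detect_impulses(img, t1=40.0, t2=30.0):
    """Stage 1: flag pixels far from their 3x3 median; stage 2: confirm against a 5x5 median."""
    img = img.astype(float)
    coarse = np.abs(img - median_filter(img, size=3)) > t1    # first, coarse decision
    refined = np.abs(img - median_filter(img, size=5)) > t2   # second, confirming decision
    return coarse & refined                                   # final impulse map

noisy = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)
mask = detect_impulses(noisy)
print("pixels flagged as impulses:", int(mask.sum()))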
1645 Processing the Medical Sensors Signals Using Fuzzy Inference System
Authors: S. Bouharati, I. Bouharati, C. Benzidane, F. Alleg, M. Belmahdi
Abstract:
Sensors measure several kinds of physical quantities. Whether they are devices that convert a sensed signal into an electrical signal, chemical sensors or biosensors, all of these sensors can be considered an interface between the physical world and electrical equipment. The problem is the analysis of the multitude of saved settings as input variables, since they do not all have the same level of influence on the outputs. It is therefore necessary to identify the most sensitive parameters, those that can guide users in gathering information in the field and in the process of model calibration, and to analyse the sensitivity to the effect of each change made. The mathematical models used for processing become very complex. In this paper, a fuzzy rule-based system is proposed as a solution to this problem. The system collects the available signal information from the sensors and allows the study of the influence of the various factors that take part in the decision system. Since its inception, fuzzy set theory has been regarded as a formalism suitable for dealing with the imprecision intrinsic to many problems; at the same time, fuzzy sets allow the use of symbolic models. In this study, an example is applied to resolving a variety of physiological parameters that define the state of human health, and the application was designed as an aid to medical diagnosis. The inputs are the signals expressing cardiovascular system parameters, blood pressure, respiratory system parameters, and so on; once the system is built, it is able to predict the state of the patient for any input values.
Keywords: Sensors, sensitivity, fuzzy logic, analysis, physiological parameters, medical diagnosis.
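A toy Mamdani-style sketch of the idea (the membership functions, physiological variables and rules are invented for illustration; the paper's actual rule base is not given in the abstract):

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def assess(systolic_bp, heart_rate):
    # fuzzification of two physiological inputs
    bp_high = tri(systolic_bp, 130, 170, 210)
    hr_high = tri(heart_rate, 90, 130, 170)
    bp_norm = tri(systolic_bp, 90, 115, 140)
    hr_norm = tri(heart_rate, 55, 75, 95)

    risk = np.linspace(0, 1, 101)                  # output universe of discourse
    high_out = np.minimum(min(bp_high, hr_high), tri(risk, 0.5, 1.0, 1.5))   # rule 1
    low_out = np.minimum(min(bp_norm, hr_norm), tri(risk, -0.5, 0.0, 0.5))   # rule 2
    agg = np.maximum(high_out, low_out)            # rule aggregation
    return float((risk * agg).sum() / (agg.sum() + 1e-9))   # centroid defuzzification

print(round(assess(systolic_bp=165, heart_rate=120), 2))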
1644 A System for Analyzing and Eliciting Public Grievances Using Cache Enabled Big Data
Authors: P. Kaladevi, N. Giridharan
Abstract:
The system for analyzing and eliciting public grievances serves its main purpose of receiving and processing all sorts of complaints from the public and responding to users. Because of the large number of complaints, the data become big data, which are difficult to store and process. The proposed system uses HDFS to store the big data and MapReduce to process them. The concept of a cache is applied in the system to provide immediate response and timely action using big data analytics; the cache-enabled big data improves the response time of the system. The unstructured data provided by the users are efficiently handled through the MapReduce algorithm. The processing of complaints takes place in the order of the hierarchy of authority. The drawbacks of the traditional database system used in the existing system are overcome by our system through the use of a cache-enabled Hadoop Distributed File System. MapReduce framework code can leak sensitive data through the computation process; we therefore propose a system that adds noise to the output of the reduce phase to avoid signaling the presence of sensitive data. If complaints are not processed within ample time, they are automatically forwarded to the higher authority, which ensures assured processing. A copy of the filed complaint is sent as a digitally signed PDF document to the user's e-mail id, which serves as proof. The system report serves as essential data when making important decisions based on legislation.
Keywords: Big data, Hadoop, HDFS, caching, MapReduce, web personalization, e-governance.
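A minimal in-memory sketch of the pipeline described above, with plain Python standing in for Hadoop/HDFS (the complaint records, departments and noise level are invented):

from collections import Counter
import random

complaints = [
    {"id": 1, "department": "water", "text": "no supply for 3 days"},
    {"id": 2, "department": "roads", "text": "pothole near school"},
    {"id": 3, "department": "water", "text": "contaminated supply"},
]

def map_phase(records):
    # map each complaint to a (department, 1) pair
    for rec in records:
        yield rec["department"], 1

def reduce_phase(pairs, noise_scale=1):
    # sum counts per department and add bounded random noise before release
    totals = Counter()
    for dept, count in pairs:
        totals[dept] += count
    return {d: c + random.randint(0, noise_scale) for d, c in totals.items()}

cache = {}
def query(department):
    if department not in cache:                 # cache-enabled lookup
        cache.update(reduce_phase(map_phase(complaints)))
    return cache.get(department, 0)

print(query("water"), query("water"))           # second call is served from the cache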
1643 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach
Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar
Abstract:
Uncontrolled growth of abnormal cells in the lung, in the form of a tumor, can be either benign (non-cancerous) or malignant (cancerous). Patients with Lung Cancer (LC) have an average life expectancy of five years; timely diagnosis, detection and prediction reduce the need for risky invasive surgery among the treatment options and increase the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI) are commonly used for earlier detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE) for image enhancement gives the best results without requiring further opinions. The lung cavities are extracted and the background other than the two lung cavities is completely removed, with the right and left lungs segmented separately. Region-property measurements of area, perimeter, diameter, centroid and eccentricity are made for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; feature extraction provides the Region of Interest (ROI) given as input to the classifier. Two levels of classification are employed: K-Nearest Neighbor (KNN) is used for determining the patient's condition as normal or abnormal, while an Artificial Neural Network (ANN) is used for identifying the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technology shows encouraging results for real-time information and online detection for future research.
Keywords: ANN, DWT, GLCM, KNN, ROI, artificial neural networks, discrete wavelet transform, gray-level co-occurrence matrix, k-nearest neighbor, region of interest.
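A small sketch of the feature-and-classification stage only (segmentation, GLCM texture and the ANN staging step are omitted; the images and labels are synthetic stand-ins for CT regions of interest):

import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

def dwt_features(roi):
    """Mean and standard deviation of each 2-D DWT sub-band of an ROI."""
    cA, (cH, cV, cD) = pywt.dwt2(roi, "db2")
    bands = (cA, cH, cV, cD)
    return [b.mean() for b in bands] + [b.std() for b in bands]

rng = np.random.default_rng(0)
rois = [rng.normal(loc=lbl * 0.8, size=(32, 32)) for lbl in (0, 1) for _ in range(20)]
labels = [lbl for lbl in (0, 1) for _ in range(20)]

X = np.array([dwt_features(r) for r in rois])
knn = KNeighborsClassifier(n_neighbors=3).fit(X[:-4], labels[:-4])
print(knn.predict(X[-4:]), labels[-4:])          # predictions on a small held-out set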
1642 An Experimentally Validated Thermo-Mechanical Finite Element Model for Friction Stir Welding in Carbon Steels
Authors: A. H. Kheireddine, A. A. Khalil, A. H. Ammouri, G. T. Kridli, R. F. Hamade
Abstract:
Solidification cracking and hydrogen cracking are some of the defects generated in the fusion welding of ultrahigh carbon steels. However, friction stir welding (FSW) of such steels, being a solid-state technique, has been demonstrated to alleviate the problems encountered in traditional welding. FSW involves different process parameters that must be carefully defined prior to processing. These parameters include, but are not restricted to, tool feed, tool RPM, tool geometry and tool tilt angle. They are a key factor in avoiding wormholes and voids behind the tool and in achieving a defect-free weld. More importantly, these parameters directly affect the microstructure of the weld and hence its final mechanical properties. For that purpose, a 3D thermo-mechanical finite element (FE) model was developed using DEFORM 3D to simulate FSW of carbon steel. At points of interest in the joint, the history of critical state variables such as temperature, stresses and strain rates is tracked. Typical results include the ability to simulate the different weld zones. Simulation predictions were successfully compared with experimental FSW tests. It is believed that such a numerical model can be used to optimize FSW processing parameters in favor of a desirable, defect-free weld with better mechanical properties.
Keywords: Carbon Steels, DEFORM 3D, FEM, Friction stir welding.
1641 A New Method to Enhance Contrast of Electron Micrograph of Rat Tissues Sections
Authors: Lise P. Labéjof, Raiza S. P. Bizerra, Galileu B. Costa, Thaísa B. dos Santos
Abstract:
This report presents an alternative technique for applying the contrast agent in vivo, i.e. before sampling. With this new method, electron micrographs of tissue sections have acceptable contrast compared to other methods and present no precipitation artifacts on the sections. Another advantage is that only a small amount of contrast agent is needed to obtain a good result, given that most contrast agents are expensive and extremely toxic.
Keywords: Image quality, microscopy research, staining technique, ultrathin section.
1640 Unstructured-Data Content Search Based on Optimized EEG Signal Processing and Multi-Objective Feature Extraction
Authors: Qais M. Yousef, Yasmeen A. Alshaer
Abstract:
Over the last few years, the amount of data available across the globe has increased rapidly. This came with the emergence of recent concepts, such as big data and the Internet of Things, which provide a suitable solution for the availability of data all over the world. However, managing this massive amount of data remains a challenge due to the large variety of types and the distribution of the data. Consequently, locating a required file, particularly on the first trial, turns out to be no easy task, because of the great similarity of names among different files distributed on the web, and the accuracy and speed of search are negatively affected. This work presents a method that uses electroencephalography signals to locate files based on their contents. Building on the concept of processing natural mind waves, this work analyses the mind-wave signals of different people, extracting their most appropriate features using a multi-objective metaheuristic algorithm and then classifying them using an artificial neural network to distinguish among files with similar names. The aim of this work is to provide the ability to find files based on their contents using human thoughts only. Implementing this approach and testing it on real people proved its ability to find the desired files accurately within a noticeably shorter time and to retrieve them as a first choice for the user.
Keywords: Artificial intelligence, data contents search, human active memory, mind wave, multi-objective optimization.
1639 Device for 3D Analysis of Basic Movements of the Lower Extremity
Authors: Jiménez Villanueva Mayra Alejandra, Ortíz Casallas Diana Carolina, Luengas Contreras Lely Adriana
Abstract:
This document details the process of developing a wireless device that captures the basic movements of the foot (plantar flexion, dorsal flexion, abduction, adduction) and the knee movement (flexion). It implements a motion capture system using hardware based on optical fiber sensors, because of their advantages in terms of range, noise immunity and speed of data transmission and reception. The operating principle used by this system is the detection and transmission of joint movement by mechanical elements and its measurement by optical ones (in this case infrared). Likewise, Visual Basic software is used for the reception, analysis and signal processing of the data acquired by the device, generating a 3D graphical representation of each movement in real time. The result is a boot in charge of capturing the movement, a transmission module (implementing Xbee technology) and a receiver module for receiving the information and sending it to the PC for processing. The main idea of this device is to help in areas such as bioengineering and medicine by helping to improve quality of life and movement analysis.
Keywords: Abduction, adduction, A/D converter, Autodesk 3DMax, infrared diode, driver, extension, flexion, infrared LEDs, interface, OpenGL modeling, optical fiber, USB CDC (Communications Device Class), virtual reality.
1638 Fault Detection and Diagnosis of Broken Bar Problem in Induction Motors Base Wavelet Analysis and EMD Method: Case Study of Mobarakeh Steel Company in Iran
Authors: M. Ahmadi, M. Kafil, H. Ebrahimi
Abstract:
Nowadays, induction motors play a significant role in industry. Condition monitoring (CM) of this equipment has gained remarkable importance in recent years due to huge production losses, substantial imposed costs and increases in vulnerability, risk and uncertainty levels. Motor current signature analysis (MCSA) is one of the most important techniques in CM, and it can be used for detecting broken rotor bars. Signal processing methods such as the Fast Fourier Transform (FFT), the wavelet transform and Empirical Mode Decomposition (EMD) are used for analyzing the MCSA output data. In this study, these signal processing methods are used for broken bar detection in the induction motors of the Mobarakeh Steel Company. Based on the wavelet transform method, an index for fault detection, CF, is introduced, defined as the variation of the maximum with respect to the mean of the wavelet transform coefficients. We find that, in the broken bar condition, the CF factor is greater than in the healthy condition. Based on the EMD method, the energy of the intrinsic mode functions (IMFs) is calculated, and we find that when motor bars become broken the energy of the IMFs increases.
Keywords: Broken bar, condition monitoring, diagnostics, empirical mode decomposition, Fourier transform, wavelet transform.
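A hedged sketch of the wavelet-based CF index described above (the exact decomposition settings are not given in the abstract, so the wavelet, level and simulated current signals below are assumptions; the EMD energy index is omitted):

import numpy as np
import pywt

def cf_index(current, wavelet="db4", level=4):
    """Maximum-to-mean variation of the wavelet decomposition coefficients."""
    coeffs = np.concatenate(pywt.wavedec(current, wavelet, level=level))
    mag = np.abs(coeffs)
    return mag.max() / mag.mean()

t = np.arange(0, 2, 1 / 2000)                       # 2 s of simulated stator current at 2 kHz
healthy = np.sin(2 * np.pi * 50 * t)
# a broken bar adds sidebands around the supply frequency (paper reports a larger CF then)
broken = healthy + 0.15 * np.sin(2 * np.pi * 46 * t) + 0.15 * np.sin(2 * np.pi * 54 * t)

print(round(cf_index(healthy), 2), round(cf_index(broken), 2))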
1637 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks
Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
Abstract:
Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is made manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm and by using a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on the data of a single patient retrieved from a publicly available EEG dataset.
Keywords: Artificial Neural Network, Data Mining, Electroencephalogram, Epilepsy, Feature Extraction, Seizure Detection, Signal Processing.
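A minimal sketch of the sliding-window feature extraction and neural network classification described above (the Training Builder tool, the real EEG data and the full feature set are not reproduced; the signals here are synthetic):

import numpy as np
from sklearn.neural_network import MLPClassifier

def window_features(sig, win=256, step=128):
    """Simple statistics per sliding window: mean, std, mean absolute slope, range."""
    feats = []
    for start in range(0, len(sig) - win + 1, step):
        w = sig[start:start + win]
        feats.append([w.mean(), w.std(), np.abs(np.diff(w)).mean(), w.max() - w.min()])
    return np.array(feats)

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, 10 * 256)
seizure = rng.normal(0, 1, 10 * 256) + 3 * np.sin(np.linspace(0, 120 * np.pi, 10 * 256))

Xn, Xs = window_features(normal), window_features(seizure)
X = np.vstack([Xn, Xs])
y = np.array([0] * len(Xn) + [1] * len(Xs))

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", round(clf.score(X, y), 3))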
1636 An Improved Design of Area Efficient Two Bit Comparator
Authors: Shashank Gautam, Pramod Sharma
Abstract:
In the present era of development of digital circuits, signal processors and other integrated circuits, magnitude comparators are challenged by large area and high power consumption. The comparator is the most basic circuit that performs comparison. This paper presents a technique to design a two-bit comparator that consumes less area and power. DSCH and MICROWIND version 3 are used to design the schematic and the layout of the schematic, respectively, and to observe the performance parameters at different nanometer technologies.
Keywords: Chip design, consumed power, layout area, two bit comparator.
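A behavioral sketch of a two-bit magnitude comparator checked against its full truth table (illustration only; the paper's transistor-level design in DSCH/MICROWIND is not reproduced):

def comparator_2bit(a1, a0, b1, b0):
    """Gate-level Boolean equations for the three comparator outputs."""
    eq1, eq0 = not (a1 ^ b1), not (a0 ^ b0)              # bitwise equality
    a_gt_b = (a1 and not b1) or (eq1 and a0 and not b0)
    a_lt_b = (b1 and not a1) or (eq1 and b0 and not a0)
    a_eq_b = eq1 and eq0
    return int(a_gt_b), int(a_eq_b), int(a_lt_b)

for a in range(4):
    for b in range(4):
        gt, eq, lt = comparator_2bit(a >> 1 & 1, a & 1, b >> 1 & 1, b & 1)
        assert (gt, eq, lt) == (int(a > b), int(a == b), int(a < b))
print("all 16 input combinations verified")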
1635 An Analysis of the Representation of the Translator and Translation Process into Brazilian Social Networking Groups
Authors: Érica Lima
Abstract:
In the digital era, in which we face an avalanche of information, it is not new that the Internet has brought new modes of communication and knowledge access. Characterized by a multiplicity of discourses, opinions, beliefs and cultures, the web is a space of political-ideological dimensions where people (who often do not know each other) interact and create representations, deconstruct stereotypes, and redefine identities. Currently, the translator needs to be able to deal with digital spaces ranging from specific software to social media, which inevitably impact his professional life. One of the most impactful ways of being seen in cyberspace is participation in social networking groups. In addition to their ability to disseminate information among participants, social networking groups allow significant personal and social exposure. Such exposure is due to the visibility of each participant, achieved not only on the personal profile page but also in each comment or post the person makes in the groups. The objective of this paper is to study the representations of translators and the translation process on the Internet, more specifically in publications in two Brazilian groups of great influence on Facebook: "Translators/Interpreters" and "Translators, Interpreters and Curious". These groups represent the changes the network has brought to the profession, including the way translators are seen and see themselves. The analyzed posts allowed a reading of what common sense seems to think about the translator as opposed to what translators seem to think about themselves as a professional class. The results of the analysis lead to the conclusion that these two positions are antagonistic and sometimes represent a conflict of interests: on the one hand, society in general considers the translator's work easy and therefore not deserving of good remuneration; on the other hand, translators know how complex a translation process is and how much it takes to be a good professional. The results also reveal that social networking sites such as Facebook provide more visibility, but that it takes a more active role from the translator to achieve a greater appreciation of the profession and more recognition of the role of the translator, especially in the face of the increasing development of automatic translation programs.
Keywords: Facebook, social representation, translation, translator.
1634 Automatic Fluid-Structure Interaction Modeling and Analysis of Butterfly Valve Using Python Script
Authors: N. Guru Prasath, Sangjin Ma, Chang-Wan Kim
Abstract:
A butterfly valve is a quarter-turn valve that is used to control the flow of a fluid through a section of pipe. Generally, butterfly valves are used in a wide range of applications such as water distribution, sewage, and oil and gas plants. In particular, butterfly valves with larger diameters find immense application in hydro power plants to control the fluid flow. Given the constraints in cost and size of running a laboratory setup, the analysis of large-diameter valves is mostly carried out by computational methods, which are the best and least expensive solution. For the fluid and structural analyses, CFD and FEM software are used, respectively, to perform the large-scale valve analyses. To perform the above analysis of a butterfly valve, the CAD model has to be recreated and meshed in conventional software for each set of valve dimensions, and the limitation of this approach is that it is a time-consuming process. To overcome that issue, a Python code was created to complete the pre-processing setup automatically in the Salome software. Applying the dimensions of the model directly in the Python code makes the running time comparatively lower and provides an easier way to perform the analysis of the valve. Hence, in this paper, an attempt was made to study the fluid-structure interaction (FSI) of butterfly valves by varying the valve angles and dimensions using Python code in the pre-processing software, and the results are presented.
Keywords: Butterfly valve, fluid-structure interaction, automatic CFD analysis, flow coefficient.
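A hedged sketch of the parametric pre-processing driver idea (the actual Salome model script and its command-line interface are not given in the abstract, so the preprocess_valve.py name and the "salome -t ... args:" batch call are assumptions, as are the dimensions):

import json
from itertools import product

diameters_m = [0.5, 1.0, 1.8]            # hypothetical valve diameters
opening_angles_deg = [20, 45, 70]        # hypothetical disc opening angles

for d, angle in product(diameters_m, opening_angles_deg):
    case = f"valve_D{d}_A{angle}"
    params = {"diameter": d, "opening_angle": angle, "mesh_size": d / 50}
    with open(f"{case}.json", "w") as fh:
        json.dump(params, fh, indent=2)  # parameter file read by the pre-processing script
    # command one would hand to subprocess.run() on a machine with Salome installed
    command = ["salome", "-t", "preprocess_valve.py", f"args:{case}.json"]
    print(" ".join(command))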
1633 A Multi-Signature Scheme based on Coding Theory
Authors: Mohammed Meziani, Pierre-Louis Cayrel
Abstract:
In this paper we propose the first two non-generic constructions of a multisignature scheme based on coding theory. The first system makes use of the CFS signature scheme and is secure in the random oracle model, while the second scheme is based on the KKS construction and is a few-times signature scheme. The security of our constructions relies on a difficult problem in coding theory: the Syndrome Decoding problem, which has been proved NP-complete [4].
Keywords: Post-quantum cryptography, code-based cryptography, digital signature, multisignature scheme.
1632 Transforming Health Information from Manual to Digital (Electronic) World–Reference and Guide
Authors: S. Karthikeyan, Naveen Bindra
Abstract:
Introduction: To update ourselves on, and understand, the latest electronic formats available to health care providers, and how they can be used and developed according to standards. The idea is to relate the manual keeping of patients' medical records to the maintenance of patients' electronic information in a health care setup, and, furthermore, to adopt the right technology for the organization and improve the quality and quantity of the health care we provide. Objective: To explain the terms Electronic Medical Record (EMR), Electronic Health Record (EHR) and Personal Health Record (PHR), and to select the best of the available electronic sources and software before implementation. The aim is to guide end users so that they can use the technology without doubts or difficulties, and to evaluate the uses and barriers of EMR, EHR and PHR. Aim and Scope: The target is to enable health care providers such as physicians, nurses, therapists, medical bill reimbursement staff, insurers and government to access patient information in an easy and systematic manner without diluting the confidentiality of the patient's information. Method: Health information technology can be implemented with the help of organizations that provide legal guidelines and support to the health care provider. The main objective is to select the correct, embedded and affordable database management software and to handle the generation of large-scale data. A parallel need is to know the latest software available in the market. Conclusion: The question lies in implementing the electronic information system with health care providers and organizations. Clinicians are the main users of the technology and lead us to "go paperless". The fact is that technology today is changing very rapidly, so staying up to date matters. Basically, the idea is to show how to store the data electronically in a safe and secure way. All three formats exemplify the fact that an electronic format has its own benefits as well as barriers.
Keywords: Medical records, digital records, health information, electronic record system.