Search results for: Digital Signal Processing (DSP)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3216

1836 Image Segmentation Using 2-D Histogram in RGB Color Space in Digital Libraries

Authors: El Asnaoui Khalid, Aksasse Brahim, Ouanan Mohammed

Abstract:

This paper presents an unsupervised color image segmentation method based on a hierarchical analysis of a 2-D histogram in RGB color space. This histogram reduces the storage space required for images and thus facilitates operations between them. The improved segmentation approach gives a better identification of objects in a color image while, at the same time, the system remains fast.
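As an illustration of the kind of 2-D histogram such a method builds on, the Python sketch below bins two RGB channels of an image into a joint histogram and keeps the densest bins as candidate colour clusters; the channel pair, bin count and density cut-off are illustrative assumptions, not the authors' hierarchical procedure.

```python
# Minimal sketch: joint 2-D histogram over two RGB channels, with the densest
# bins kept as candidate colour clusters. The (R, G) channel pair, 32 bins per
# axis and the 1% density cut-off are illustrative choices, not the paper's.
import numpy as np

def rgb_2d_histogram(image, bins=32):
    """image: H x W x 3 uint8 array; returns the joint histogram of R and G."""
    r = image[..., 0].ravel()
    g = image[..., 1].ravel()
    return np.histogram2d(r, g, bins=bins, range=[[0, 256], [0, 256]])[0]

def dense_bins(hist, fraction=0.01):
    """Indices of bins holding at least `fraction` of all pixels."""
    return np.argwhere(hist >= fraction * hist.sum())

img = np.random.randint(0, 256, size=(120, 160, 3), dtype=np.uint8)  # stand-in image
print(dense_bins(rgb_2d_histogram(img))[:5])
```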

Keywords: Image segmentation, hierarchical analysis, 2-D histogram, Classification.

1835 Taguchi Robust Design for Optimal Setting of Process Wastes Parameters in an Automotive Parts Manufacturing Company

Authors: Charles Chikwendu Okpala, Christopher Chukwutoo Ihueze

Abstract:

As a technique that reduces variation in a product by lessening the sensitivity of the design to sources of variation, rather than by controlling those sources, Taguchi Robust Design entails designing ideal goods by developing a product that has minimal variance in its characteristics while meeting the exact desired performance. This paper examined the concept of the manufacturing approach and its application to the brake pad product of an automotive parts manufacturing company. Although the firm claimed that defects, excess inventory, and over-production were the only wastes that grossly affect its productivity and profitability, a careful study and analysis of its manufacturing processes with the application of the Single Minute Exchange of Dies (SMED) tool showed that the waste of waiting is a fourth waste that bedevils the firm. The Taguchi L9 orthogonal array, selected on the basis of the four parameters and the three levels of variation for each parameter, revealed that, with a range of 2.17, waiting is the major waste that the company must reduce in order to remain viable. Also, to enhance the company's throughput and profitability, the wastes of over-production, excess inventory, and defects, with ranges of 2.01, 1.46, and 0.82, ranking second, third, and fourth respectively, must be reduced to the barest minimum. After proposing -33.84 as the highest optimum signal-to-noise ratio to be maintained for the waste of waiting, the paper advocated the adoption of the tools and techniques of the Lean Production System (LPS) and Continuous Improvement (CI), and concluded by recommending SMED in order to drastically reduce the setup time that leads to unnecessary waiting.
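To make the signal-to-noise and range figures concrete, the sketch below computes the smaller-the-better S/N ratio commonly used for wastes in Taguchi analysis and the range (delta) used to rank factors; the formula variant and the numbers are assumptions for illustration, not data from the study.

```python
# Sketch of a smaller-the-better Taguchi S/N ratio and the range (delta)
# statistic used to rank factors. The response values are invented.
import numpy as np

def sn_smaller_is_better(y):
    """S/N = -10 * log10(mean(y^2)) over replicate responses y."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

def factor_range(sn_by_level):
    """Range (delta) = max - min of the mean S/N over a factor's levels."""
    means = [np.mean(v) for v in sn_by_level]
    return max(means) - min(means)

# Hypothetical waiting times (minutes) observed at three levels of one factor:
waiting_by_level = [[55, 60], [48, 50], [44, 46]]
sn_by_level = [[sn_smaller_is_better(runs)] for runs in waiting_by_level]
print(round(factor_range(sn_by_level), 2))  # a larger range means a more influential factor
```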

Keywords: Taguchi Robust Design, signal to noise ratio, Single Minute Exchange of Dies, lean production system, waste.

1834 Classifying Turbomachinery Blade Mode Shapes Using Artificial Neural Networks

Authors: Ismail Abubakar, Hamid Mehrabi, Reg Morton

Abstract:

Currently, extensive signal analysis is performed in order to evaluate the structural health of turbomachinery blades. This approach is constrained by time and by the availability of qualified personnel, so new approaches to blade dynamics identification that provide faster and more accurate results are sought. Generally, modal analysis is employed to acquire the dynamic properties of a vibrating turbomachinery blade and is widely adopted in condition monitoring of blades. The analysis provides useful information on the different modes of vibration and the natural frequencies by exploring the different shapes that the blade can take up during vibration, since each mode shape has a corresponding natural frequency. Experimental modal testing and finite element analysis are the traditional methods used to evaluate mode shapes, but they have limited applicability to real-life scenarios and so cannot readily support a robust condition monitoring scheme. Real-time mode shape evaluation requires rapid evaluation and low computational cost, for which the traditional techniques are unsuitable. In this study, an artificial neural network is developed to evaluate the mode shape of a lab-scale rotating blade assembly, using results from finite element modal analysis as training data. The network performance evaluation shows that the artificial neural network (ANN) is capable of mapping the correlation between natural frequencies and mode shapes, without the need for extensive signal analysis. The approach offers the advantages that the network can classify mode shapes in real time, is simple to implement, and gives accurate predictions. The work paves the way for further development of a robust condition monitoring system that incorporates real-time mode shape evaluation.
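A minimal sketch of the idea, mapping natural-frequency features to mode-shape classes with a neural network, is given below using scikit-learn; the network size and the synthetic training data are placeholders for the finite element modal results described in the abstract.

```python
# Sketch: a small multilayer perceptron that classifies mode shapes from
# natural-frequency features. The synthetic data stand in for the finite
# element modal analysis results used as training data in the abstract.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_samples, n_freqs = 200, 5
X = rng.normal(loc=[120, 340, 560, 890, 1240], scale=15, size=(n_samples, n_freqs))
y = rng.integers(0, 3, size=n_samples)          # three hypothetical mode-shape classes

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)                                   # learn the frequency-to-mode-shape mapping
print(clf.predict(X[:3]))                       # predicted mode-shape classes
```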

Keywords: Modal analysis, artificial neural network, mode shape, natural frequencies, pattern recognition.

1833 Temperature-Based Detection of Initial Yielding Point in Loading of Tensile Specimens Made of Structural Steel

Authors: Aqsa Jamil, Hiroshi Tamura, Hiroshi Katsuchi, Jiaqi Wang

Abstract:

The yield point represents the upper limit of the forces which can be applied to a specimen without causing any permanent deformation. After yielding, the behavior of the specimen changes suddenly, including the possibility of cracking or buckling, so the accumulation of damage and the type of fracture change depending on this condition. As it is difficult to accurately detect the yield points of the several stress concentration points in structural steel specimens, an effort has been made in this research work to develop a convenient technique using thermography (temperature-based detection) during tensile tests for the precise detection of yield point initiation. To verify the applicability of the thermography camera, tests were conducted under different loading conditions, with the deformation measured by various strain gauges and the surface temperature monitored by the thermography camera. The yield point of the specimens was estimated with the help of the temperature dip which occurs due to the thermoelastic effect during plastic deformation. The scattering of the data was checked by performing a repeatability analysis. The effects of temperature imperfection and of the light source were checked by carrying out tests during the daytime as well as at midnight and by calculating the signal-to-noise ratio (SNR) of the noisy data from the infrared thermography camera; from these results it can be concluded that the camera is independent of the testing time and of the presence of a visible light source. Furthermore, a fully coupled thermal-stress analysis was performed using the Abaqus/Standard exact implementation technique to validate the temperature profiles obtained from the thermography camera and to check the feasibility of numerical simulation for predicting the results extracted with the help of the thermographic technique.
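The criterion described here, locating the temperature dip caused by the thermoelastic effect, can be sketched in a few lines of Python; the synthetic temperature trace and the smoothing window below are assumptions for illustration only.

```python
# Sketch: locate the thermoelastic temperature dip in a surface-temperature
# record and report its index as the estimated yield-point instant.
# The synthetic trace and the smoothing window are illustrative assumptions.
import numpy as np

def yield_index_from_temperature(temps, window=5):
    """Smooth with a moving average (valid region) and return the index of the minimum."""
    smoothed = np.convolve(temps, np.ones(window) / window, mode="valid")
    return int(np.argmin(smoothed)) + window // 2

steps = np.arange(200)
# Invented trace: cooling during elastic loading, heating once yielding starts.
temps = 20.0 - 0.01 * steps + 0.02 * np.maximum(steps - 120.0, 0.0)
print(yield_index_from_temperature(temps))  # index of the dip (about 120 here)
```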

Keywords: Signal to noise ratio, thermoelastic effect, thermography, yield point.

1832 Hot Workability of High Strength Low Alloy Steels

Authors: Seok Hong Min, Jung Ho Moon, Woo Young Jung, Tae Kwon Ha

Abstract:

The hot deformation behavior of high strength low alloy (HSLA) steels with different chemical compositions has been studied under hot working conditions, in the temperature range of 900 to 1100 °C and the strain rate range of 0.1 to 10 s-1, by performing a series of hot compression tests. The dynamic materials model has been employed to develop the processing maps, which show the variation of the efficiency of power dissipation with temperature and strain rate. Kumar's model has also been used to develop the instability map, which shows the variation of the plastic-deformation instability with temperature and strain rate. The efficiency of power dissipation increased with decreasing strain rate and increasing temperature in the steel with the higher Cr and Ti content. An efficiency of power dissipation over 20% was obtained at a finite strain level of 0.1 under conditions of strain rate lower than 1 s-1 and temperature higher than 1050 °C. Plastic instability was expected in the regime of temperatures lower than 1000 °C and strain rates lower than 0.3 s-1. The steel with the lower Cr and Ti contents showed high efficiency of power dissipation at higher strain rates and lower temperatures.
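For readers unfamiliar with the dynamic materials model behind the processing maps, the power dissipation efficiency is commonly written as η = 2m/(m + 1), where m is the strain rate sensitivity of the flow stress. The sketch below estimates m from flow stress data and computes η; the flow stress values are invented and the log-log linear fit is a simplification of the full procedure.

```python
# Sketch of the dynamic materials model efficiency eta = 2m / (m + 1), with the
# strain rate sensitivity m taken as the slope of log(flow stress) versus
# log(strain rate) at fixed strain and temperature. Stress values are invented.
import numpy as np

strain_rates = np.array([0.1, 1.0, 10.0])       # s^-1
flow_stress = np.array([80.0, 100.0, 125.0])    # MPa, hypothetical measurements

m = np.polyfit(np.log10(strain_rates), np.log10(flow_stress), 1)[0]
eta = 2.0 * m / (m + 1.0)
print(f"m = {m:.3f}, efficiency of power dissipation = {100 * eta:.1f} %")
```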

Keywords: High strength low alloy steels, hot workability, dynamic materials model, processing maps.

1831 Performance of an Improved Fluidized System for Processing Green Tea

Authors: Nickson Kipng’etich Lang’at, Thomas Thoruwa, John Abraham, John Wanyoko

Abstract:

Green tea is made from the top two leaves and buds of a shrub, Camellia sinensis, of the family Theaceae and the order Theales. The green tea leaves are picked and immediately sent to be dried or steamed to prevent fermentation. Fluid bed drying is a common method used for drying green tea because of its ease of design and construction and the fluidization of fine tea particles. The major problems of this method are a significant loss of the chemical content and green appearance of the leaf, retention of high moisture content in the leaves, and bed channeling and defluidization. The energy associated with the drying technology has been shown to be a vital factor in determining the quality of green tea. As part of the implementation, a prototype dryer was built that facilitated a sequence of operations involving steaming, cooling, pre-drying and final drying. The major findings of the project were in terms of the quality characteristics of the tea leaves and the energy consumption during processing. The optimal design achieved a moisture content of 4.2 ± 0.84%. At the optimum drying temperature of 100 ºC, the specific energy consumption was 1697.8 kJ.kg-1 and the evaporation rate was 4.272 x 10-4 kg.m-2.s-1. The energy consumption in a fluidized system can be further reduced by focusing on energy saving designs.
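As a pointer to how figures such as the specific energy consumption and evaporation rate relate to measured quantities, the short sketch below applies the usual definitions (energy supplied per unit mass of water evaporated, and mass of water evaporated per unit bed area per unit time); all input numbers are invented placeholders, not the trial data.

```python
# Sketch: specific energy consumption (kJ per kg of water evaporated) and
# evaporation rate (kg of water per m^2 per second) from drying-trial
# quantities. All input values are hypothetical placeholders.
energy_input_kj = 8500.0        # total drying energy supplied, kJ
water_evaporated_kg = 5.0       # mass of water removed from the leaf, kg
bed_area_m2 = 0.9               # fluidised bed area, m^2
drying_time_s = 13000.0         # drying duration, s

specific_energy = energy_input_kj / water_evaporated_kg                  # kJ/kg
evaporation_rate = water_evaporated_kg / (bed_area_m2 * drying_time_s)   # kg/(m^2.s)
print(f"{specific_energy:.1f} kJ/kg, {evaporation_rate:.3e} kg.m^-2.s^-1")
```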

Keywords: Evaporation rate, fluid bed dryer, maceration, specific energy consumption.

1830 A Study to Assess the Energy Saving Potential and Economic Analysis of an Agro Based Industry in Karnataka, India

Authors: Sangamesh G. Sakri, Akash N. Patil, Sadashivappa M. Kotli

Abstract:

Agro based industries in India are considered micro, small and medium enterprises (MSMEs). In India, MSMEs contribute approximately 8 percent of the country’s GDP, 42 percent of the manufacturing output and 40 percent of exports. Toor dal (scientific name Cajanus cajan, commonly known as yellow gram or pigeon pea) is the second largest pulse crop in India, accounting for about 20% of total pulse production. The toor dal milling industry is one of the major agro-processing industries in the country. Most of the dal mills are concentrated in pulse producing areas, which are spread all over the country. In Karnataka state, Gulbarga is a district where toor dal is the main crop and is grown extensively. There are more than 500 dal mills in and around the Gulbarga district to process dal. However, the majority of these dal milling units use traditional methods of processing which are energy and capital intensive. There exists a huge energy saving potential in these mills. An energy audit is conducted on a dal mill in Gulbarga to understand the energy consumption pattern and assess the energy saving potential, and an economic analysis is conducted to identify energy conservation opportunities.

Keywords: Conservation, demand side management, load curve, toor dal.

1829 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach

Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh

Abstract:

This study presented a hybrid pre-processing approach along with a conceptual model to enhance the accuracy of river discharge prediction. To achieve this goal, the Ensemble Empirical Mode Decomposition (EEMD) algorithm, the Discrete Wavelet Transform (DWT) and Mutual Information (MI) were employed as a hybrid pre-processing approach coupled to a Least Squares Support Vector Machine (LSSVM). A conceptual strategy, namely the multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein was capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI before they were employed in the multi-station model. In the proposed feature selection method, some useless sub-series were omitted to achieve better performance. The results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
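To make the pre-processing chain more concrete, the sketch below decomposes a discharge series with a discrete wavelet transform, ranks the reconstructed sub-series by mutual information with the one-step-ahead target, and feeds the selected ones to a support vector regressor. PyWavelets and scikit-learn are assumed, an epsilon-SVR stands in for the LSSVM, and the EEMD stage is omitted for brevity.

```python
# Sketch of the DWT + mutual-information feature selection + SVM chain.
# scikit-learn's SVR stands in for the LSSVM and the EEMD stage is omitted;
# the discharge series is synthetic.
import numpy as np
import pywt
from sklearn.feature_selection import mutual_info_regression
from sklearn.svm import SVR

rng = np.random.default_rng(1)
discharge = np.sin(np.linspace(0, 20, 512)) + 0.2 * rng.normal(size=512)

# Reconstruct one sub-series per decomposition level to use as features.
coeffs = pywt.wavedec(discharge, "db4", level=3)
subseries = []
for i in range(len(coeffs)):
    keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    subseries.append(pywt.waverec(keep, "db4")[: len(discharge)])

X = np.column_stack(subseries)[:-1]        # sub-series values at time t
y = discharge[1:]                          # target: discharge at time t + 1

# Keep the two sub-series carrying the most mutual information with the target.
mi = mutual_info_regression(X, y, random_state=1)
selected = X[:, np.argsort(mi)[-2:]]

model = SVR(kernel="rbf").fit(selected, y)
print(model.predict(selected[:3]))
```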

Keywords: River stage-discharge process, LSSVM, discrete wavelet transform (DWT), ensemble empirical mode decomposition (EEMD), multi-station modeling.

1828 Integration of Image and Patient Data, Software and International Coding Systems for Use in a Mammography Research Project

Authors: V. Balanica, W. I. D. Rae, M. Caramihai, S. Acho, C. P. Herbst

Abstract:

Mammographic images and data analysis to facilitate modelling or computer aided diagnostic (CAD) software development are best done using a common database that can handle various mammographic image file formats and relate these to other patient information. This would optimize the use of the data, as both primary reporting and enhanced information extraction for research could be performed from the single dataset. One desired improvement is the integration of DICOM file header information into the database, as an efficient and reliable source of supplementary patient information intrinsically available in the images. The purpose of this paper was to design a suitable database to link and integrate different types of image files and gather common information that can be further used for research purposes. An interface was developed for accessing, adding, updating, modifying and extracting data from the common database, enhancing the future application of the data in CAD processing. Technically, the future developments envisaged include the creation of an advanced search function to select image files based on descriptor combinations. The results can then be used for specific CAD processing and other research. A user-friendly configuration utility for importing the required fields from the DICOM files still needs to be designed.
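As a sketch of the DICOM header integration envisaged here, the Python fragment below reads a few standard header fields with pydicom and stores them, keyed by file path, in an SQLite table; the chosen fields, schema and file names are illustrative assumptions rather than the project's actual design.

```python
# Sketch: read standard DICOM header fields with pydicom and index them in a
# small SQLite table keyed by file path. Field choice and schema are
# illustrative assumptions, not the project's actual database design.
import sqlite3
import pydicom

def index_dicom_files(paths, db_path="mammo_index.db"):
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS images ("
        "path TEXT PRIMARY KEY, patient_id TEXT, study_date TEXT, modality TEXT)"
    )
    for path in paths:
        ds = pydicom.dcmread(path, stop_before_pixels=True)   # header only
        conn.execute(
            "INSERT OR REPLACE INTO images VALUES (?, ?, ?, ?)",
            (path, str(ds.get("PatientID", "")), str(ds.get("StudyDate", "")),
             str(ds.get("Modality", ""))),
        )
    conn.commit()
    conn.close()

# index_dicom_files(["case001.dcm", "case002.dcm"])   # hypothetical file names
```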

Keywords: Database Integration, Mammogram Classification, Tumour Classification, Computer Aided Diagnosis.

1827 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation

Authors: Aicha Majda, Abdelhamid El Hassani

Abstract:

Lung CT image segmentation is a prerequisite for lung CT image analysis. Most of the conventional methods need post-processing to deal with abnormal lung CT scans such as those containing lung nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm consists of directly comparing the pixel values of the two neighboring regions, which is not accurate because this kind of metric is extremely sensitive to minor perturbations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric. The boundary penalty term in the graph cut algorithm is defined based on the patch-based similarity measurement instead of the simple intensity measurement of the standard method. The weights between each pixel and its neighboring pixels are based on the obtained new term, and the graph is then created using these weights between its nodes. Finally, the segmentation is completed with the min-cut/max-flow algorithm. Experimental results show that the proposed method is accurate and efficient, and can directly provide explicit lung regions without any post-processing operations, in contrast to the standard method.
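The patch-based boundary penalty can be pictured with a few lines of NumPy: the weight between two neighbouring pixels is derived from the squared distance between the patches centred on them rather than from the two intensities alone. The Gaussian form, patch size and sigma are common choices assumed here, not necessarily the exact expression used by the authors.

```python
# Sketch: patch-based edge weight between two neighbouring pixels, using a
# Gaussian of the mean squared difference between their surrounding patches.
# Patch radius and sigma are illustrative choices, not the paper's settings.
import numpy as np

def patch(image, y, x, radius=2):
    """The (2*radius+1)^2 patch around (y, x); assumes it fits inside the image."""
    return image[y - radius: y + radius + 1, x - radius: x + radius + 1]

def patch_weight(image, p, q, radius=2, sigma=10.0):
    """Boundary weight between pixels p and q; larger means more similar patches."""
    d2 = np.mean((patch(image, *p, radius) - patch(image, *q, radius)) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

img = 255.0 * np.random.rand(64, 64)                  # stand-in CT slice
print(patch_weight(img, (10, 10), (10, 11)))          # weight of a horizontal neighbour pair
```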

Keywords: Graph cuts, lung CT scan, lung parenchyma segmentation, patch based similarity metric.

1826 Novel Linear Autozeroing Floating-gate Amplifier for Ultra Low-voltage Applications

Authors: Yngvar Berg, Mehdi Azadmehr

Abstract:

In this paper we present a linear autozeroing ultra low-voltage (ULV) amplifier. The autozeroing performed by all ULV circuits is important for reducing the impact of noise, and especially for avoiding power supply noise, in mixed-signal low-voltage CMOS circuits. The simulated data presented are relevant for a 90 nm TSMC CMOS process.

Keywords: Low-voltage, transconductance amplifier, linearity, floating-gate.

1825 Colour Stability of Wild Cactus Pear Juice

Authors: Kgatla T.E, Howard S.S., Hiss D.C.

Abstract:

Prickly pear (Opuntia spp.) fruit has received renewed interest for the production of juice since it contains a betalain pigment with an attractive purple colour. Prickly pear juice was prepared by homogenizing the fruit and treating the pulp with 48 g of pectinase from Aspergillus niger. Titratable acidity was determined by diluting 10 ml of prickly pear juice with 90 ml of deionized water and titrating to pH 8.2 with 0.1 N NaOH. Brix was measured using a refractometer, and the ascorbic acid content was assayed spectrophotometrically. Colour variation was determined colorimetrically (Hunter L.a.b.). Hunter L.a.b. analysis showed that the red-purple colour of prickly pear juice was affected by the juice treatments, as indicated by low lightness (CDML*), hue, CDMa* and CDMb* values. It was observed that non-treated prickly pear juice had a high colour-difference-meter lightness (CDML*) of 3.9 compared with the treated juices (range 3.29 to 2.14). The CDML* decreased significantly (p<0.05) as the juice was preserved. Spectrophotometric colour analysis showed that browning was low in all treated prickly pear juice samples, as indicated by high values at 540 nm and low values at 476 nm (browning index). The brightness of the prickly pear juice was affected by acidification compared with the other juice treatments. This study presents evidence that processing has a positive effect on the colour quality attribute, which offers a clear advantage for the production of red-purple prickly pear juice.

Keywords: Colour, Hunter L.a.b, Prickly pear juice, processing, physicochemical.

1824 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-

Authors: Nieto Bernal Wilson, Carmona Suarez Edgar

Abstract:

Organizations hold structured and unstructured information in different formats, sources, and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level these organizations show some deficiencies, partly because there is little interest in extracting knowledge from their data sources and partly because of the absence of the operational capabilities needed to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools which are of great interest for business intelligence, since they are the base repositories for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and they facilitate corporate decision making and research. This paper presents a structured, simple methodology inspired by agile development models such as Scrum, XP and AUP, together with object-relational and spatial data models and a baseline of data modeling under UML and Big Data. In this way it seeks to deliver an agile methodology for the development of data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for the generation of patterns and of models derived from the structured fact objects.

Keywords: Data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse.

1823 Potential of Salvia sclarea L. for Phytoremediation of Soils Contaminated with Heavy Metals

Authors: Violina R. Angelova, Radka V. Ivanova, Givko M. Todorov, Krasimir I. Ivanov

Abstract:

A field study was conducted to evaluate the efficacy of Salvia sclarea L. for the phytoremediation of contaminated soils. The experiment was performed on agricultural fields contaminated by the Non-Ferrous-Metal Works near Plovdiv, Bulgaria. The content of heavy metals in different parts of Salvia sclarea L. (roots, stems, leaves and inflorescences) was determined by ICP. The essential oil of Salvia sclarea L. was obtained by steam distillation under laboratory conditions, analyzed for heavy metals, and its chemical composition was determined. Salvia sclarea L. is a plant which is tolerant to heavy metals and can be grown on contaminated soils. Based on the obtained results and using the most common criteria, Salvia sclarea L. can be classified as a Pb hyperaccumulator and a Cd and Zn accumulator; therefore, this plant has suitable potential for the phytoremediation of heavy metal contaminated soils. Also favorable is the fact that heavy metals influence neither the development of Salvia sclarea L. nor the quality and quantity of the essential oil. For clary sage oil obtained from the processing of clary sage grown on highly contaminated soils, the key odour-determining ingredients meet the quality requirements of the European Pharmacopoeia and BS ISO 7609 regarding Bulgarian clary sage oil and/or have values that are close to the limits of these standards. The possibility of further industrial processing will make Salvia sclarea L. an economically interesting crop for farmers applying phytoextraction technology.

Keywords: Clary sage, heavy metals, phytoremediation, polluted soils.

1822 An Analysis of the Representation of the Translator and Translation Process into Brazilian Social Networking Groups

Authors: Érica Lima

Abstract:

In the digital era, in which we face an avalanche of information, it is not new that the Internet has brought new modes of communication and knowledge access. Characterized by a multiplicity of discourses, opinions, beliefs and cultures, the web is a space of political-ideological dimensions where people (who often do not know each other) interact and create representations, deconstruct stereotypes, and redefine identities. Currently, translators need to be able to deal with digital spaces ranging from specific software to social media, which inevitably impact their professional life. One of the most impactful ways of being seen in cyberspace is participation in social networking groups. In addition to their ability to disseminate information among participants, social networking groups allow significant personal and social exposure. Such exposure is due to the visibility each participant achieves not only on their personal profile page, but also in each comment or post the person makes in the groups. The objective of this paper is to study the representations of translators and of the translation process on the Internet, more specifically in publications in two Brazilian groups of great influence on Facebook: "Translators/Interpreters" and "Translators, Interpreters and Curious". These groups represent the changes the network has brought to the profession, including the way translators are seen and see themselves. The analyzed posts allowed a reading of what common sense seems to think about the translator as opposed to what translators seem to think about themselves as a professional class. The results of the analysis lead to the conclusion that these two positions are antagonistic and sometimes represent a conflict of interests: on the one hand, society in general considers the translator’s work to be easy and therefore not in need of good remuneration; on the other hand, translators know how complex a translation process is and how much it takes to be a good professional. The results also reveal that social networking sites such as Facebook provide more visibility, but that a more active role is required from translators to achieve a greater appreciation of the profession and more recognition of the role of the translator, especially in the face of the increasing development of automatic translation programs.

Keywords: Facebook, social representation, translation, translator.

1821 A Multi-Signature Scheme based on Coding Theory

Authors: Mohammed Meziani, Pierre-Louis Cayrel

Abstract:

In this paper we propose the first two non-generic constructions of multisignature schemes based on coding theory. The first system makes use of the CFS signature scheme and is secure in the random oracle model, while the second scheme is based on the KKS construction. The security of our constructions relies on a difficult problem in coding theory: the syndrome decoding problem, which has been proved NP-complete [4].

Keywords: Post-quantum cryptography, Coding-based cryptography, Digital signature, Multisignature scheme.

1820 Isolation and Classification of Red Blood Cells in Anemic Microscopic Images

Authors: Jameela Ali Alkrimi, Loay E. George, Azizah Suliman, Abdul Rahim Ahmad, Karim Al-Jashamy

Abstract:

Red blood cells (RBCs) are among the most commonly and intensively studied types of blood cells in cell biology. Anemia, a lack of RBCs, is characterized by a hemoglobin level below the normal level. In this study, an image processing based methodology was developed to localize and extract RBCs from microscopic images, and a machine learning approach was adopted to classify the localized anemic RBC images. Several textural and geometrical features were calculated for each extracted RBC. The training set of features was analyzed using principal component analysis (PCA). With the proposed method, RBCs were isolated in 4.3 seconds from an image containing 18 to 27 cells. The reasons for using PCA are its low computational complexity and its suitability for finding the most discriminating features, which can lead to accurate classification decisions. Our classifier yielded accuracy rates of 100%, 99.99%, and 96.50% for the K-nearest neighbor (K-NN) algorithm, the support vector machine (SVM), and the radial basis function neural network (RBFNN), respectively. Classification was evaluated in terms of sensitivity, specificity, and the kappa statistic. In conclusion, the classification results were obtained within a short time period, and the results improved when PCA was used.
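A hedged sketch of the PCA-plus-classifier stage follows, using scikit-learn's PCA and a k-nearest-neighbour classifier on synthetic feature vectors; the feature dimensionality, number of components and labels are placeholders for the textural and geometrical RBC features described above.

```python
# Sketch: reduce RBC feature vectors with PCA, then classify with K-NN.
# The synthetic features and labels are placeholders for the textural and
# geometrical features and anemia classes described in the abstract.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 20))                 # 300 cells x 20 features (placeholder)
y = rng.integers(0, 2, size=300)               # 0 = normal, 1 = anemic (placeholder)

model = make_pipeline(PCA(n_components=5), KNeighborsClassifier(n_neighbors=3))
model.fit(X, y)
print(model.predict(X[:5]))
```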

Keywords: Red blood cells, pre-processing image algorithms, classification algorithms, principal component analysis PCA, confusion matrix, kappa statistical parameters, ROC.

1819 Design of Compliant Mechanism Based Microgripper with Three Finger Using Topology Optimization

Authors: R. Bharanidaran, B. T. Ramesh

Abstract:

High precision in motion is required to manipulate micro objects in precision industries for micro assembly, cell manipulation, etc. Precision manipulation is achieved through the appropriate mechanism design of micro devices such as microgrippers, and a compliant mechanism based design is the better option for achieving highly precise and controlled motion. This research article highlights a method of designing a compliant three-fingered microgripper suitable for holding asymmetric objects. Topology optimization, a systematic technique, is implemented in this research work to arrive at a topologically optimized design of the mechanism needed to perform the required micro motion of the gripper. The optimization technique has the drawback of generating senseless regions such as node-to-node connectivity and a staircase effect at the boundaries; hence, post-processing of the design is required to make it manufacturable. To reduce the effect of the post-processing stage and to preserve the edges of the image, a cubic spline interpolation technique is introduced in the MATLAB program. The structural performance of the topologically developed mechanism design is tested using finite element method (FEM) software. Further, the microgripper structure is examined to find its fatigue life and vibration characteristics.
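The cubic spline post-processing step can be pictured with the SciPy sketch below, which fits periodic cubic splines through a sampled, staircase-like boundary contour to smooth its edges; the contour points are invented and SciPy is used here in place of the MATLAB routine mentioned in the abstract.

```python
# Sketch: smooth a staircase-like closed boundary from a topology-optimised
# layout by fitting periodic cubic splines to its x and y coordinates.
# SciPy stands in for the MATLAB implementation; the points are invented.
import numpy as np
from scipy.interpolate import CubicSpline

# A jagged closed contour; the first point is repeated at the end for periodicity.
x = np.array([0, 1, 1, 2, 2, 3, 3, 2, 2, 1, 1, 0, 0], dtype=float)
y = np.array([0, 0, 1, 1, 0, 0, 2, 2, 3, 3, 2, 2, 0], dtype=float)
t = np.arange(len(x), dtype=float)

sx = CubicSpline(t, x, bc_type="periodic")
sy = CubicSpline(t, y, bc_type="periodic")

t_fine = np.linspace(0, t[-1], 200)
smooth_boundary = np.column_stack([sx(t_fine), sy(t_fine)])   # smoothed edge points
print(smooth_boundary[:3])
```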

Keywords: Compliant mechanism, Cubic spline interpolation, FEM, Topology optimization.

1818 Transforming Health Information from Manual to Digital (Electronic) World–Reference and Guide

Authors: S. Karthikeyan, Naveen Bindra

Abstract:

Introduction: The aim is to update ourselves on, and understand, the latest electronic formats available to health care providers and how they can be used and developed according to standards. The idea is to relate the manual keeping and maintenance of patients’ medical records to the handling of patients’ electronic information in a health care setup, and furthermore to adopt the right technology for the organization and improve the quality and quantity of the health care we provide. Objective: The concept is to explain the terms Electronic Medical Record (EMR), Electronic Health Record (EHR) and Personal Health Record (PHR), to select the best of the available electronic sources and software before implementation, to guide end users and make sure the technology is used without doubts or difficulties, and to evaluate the uses and barriers of EMR, EHR and PHR. Aim and Scope: The target is to enable health care providers such as physicians, nurses and therapists, as well as medical bill reimbursement, insurance and government bodies, to access patient information in an easy and systematic manner without diluting the confidentiality of that information. Method: Health information technology can be implemented with the help of organizations that provide legal guidelines and stand by the health care provider. The main objective is to select the correct, embedded and affordable database management software for generating large-scale data; in parallel, there is a need to know the latest software available on the market. Conclusion: The question lies in implementing the electronic information system with health care providers and organizations. Clinicians are the main users of the technology and lead us to “go paperless”. The fact is that technology is changing very rapidly, and the basic idea is to show how to store data electronically in a safe and secure way. All three formats exemplify the fact that an electronic format has its own benefits as well as barriers.

Keywords: Medical records, digital records, health information, electronic record system.

1817 Histogram Slicing to Better Reveal Special Thermal Objects

Authors: S. Ratna Sulistiyanti, Adhi Susanto, Thomas Sri Widodo, Gede Bayu Suparta

Abstract:

In this paper, an experiment to enhance the visibility of hot objects in a thermal image acquired with an ordinary digital camera is reported, after the application of lowpass and median filters to suppress distracting granular noise. The common thresholding and slicing techniques were used on the histogram at different gray levels, followed by a subjective comparative evaluation. The best result was obtained with a threshold level of 115 and three slices.
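The reported best setting (threshold level 115 with three slices) can be sketched directly with NumPy and SciPy; the median filter size and the synthetic stand-in image are simplifications of the experiment.

```python
# Sketch: median-filter a grayscale thermal image, zero out pixels below the
# threshold of 115 and quantise the rest into 3 slices, mirroring the reported
# best setting. The input image and filter size are stand-ins.
import numpy as np
from scipy.ndimage import median_filter

img = np.random.randint(0, 256, size=(100, 100)).astype(np.uint8)   # stand-in image
filtered = median_filter(img, size=3)                # suppress granular noise

threshold, n_slices = 115, 3
edges = np.linspace(threshold, 255, n_slices + 1)    # slice boundaries above the threshold
sliced = np.zeros_like(filtered)
mask = filtered >= threshold
sliced[mask] = np.digitize(filtered[mask], edges[1:-1]) + 1   # labels 1..3 for hot pixels
print(np.unique(sliced))                             # 0 = background, 1-3 = hot-object slices
```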

Keywords: Enhancement, thermal image, thresholding and slicing techniques, granular noise, hot objects.

1816 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences

Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng

Abstract:

Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during the interaction, a Kalman filter was used to retain a complete trajectory for each human object. Finally, the motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved classification accuracies of 80% for interaction events and 95% for non-interaction events. In summary, we have explored the idea of a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated in an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatching, fighting, etc.).
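A bare-bones constant-velocity Kalman filter of the kind used to keep a trajectory continuous through occlusions is sketched below in NumPy; the state model, noise covariances and measurements are illustrative assumptions rather than the system's tuned values.

```python
# Sketch: 2-D constant-velocity Kalman filter that bridges a human object's
# trajectory through a brief occlusion. State = [x, y, vx, vy]; the noise
# covariances and the measurements are illustrative assumptions.
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)    # only position is observed
Q, R = 0.01 * np.eye(4), 4.0 * np.eye(2)             # process and measurement noise

def kf_step(x, P, z=None):
    """One predict/update cycle; pass z=None while the object is occluded."""
    x, P = F @ x, F @ P @ F.T + Q                    # predict
    if z is not None:                                # update only when a detection exists
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x, P = x + K @ (z - H @ x), (np.eye(4) - K @ H) @ P
    return x, P

x, P = np.zeros(4), 100.0 * np.eye(4)
for z in [np.array([10.0, 5.0]), np.array([11.0, 5.5]), None, np.array([13.2, 6.4])]:
    x, P = kf_step(x, P, z)
print(x[:2])    # estimated position after bridging the occluded frame
```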

Keywords: Motion detection, motion tracking, trajectory analysis, video surveillance.

1815 Estimating the Traffic Impacts of Green Light Optimal Speed Advisory Systems Using Microsimulation

Authors: C. B. Masera, M. Imprialou, L. Budd, C. Morton

Abstract:

Even though signalised intersections are necessary for urban road traffic management, they can act as bottlenecks and disrupt traffic operations. Interrupted traffic flow causes congestion, delays, stop-and-go conditions (i.e. excessive acceleration/deceleration) and longer journey times. Vehicle and infrastructure connectivity offers the potential to provide improved new services with additional functions that assist drivers. This paper focuses on one application of vehicle-to-infrastructure communication, namely Green Light Optimal Speed Advisory (GLOSA). To assess the effectiveness of GLOSA in the urban road network, an integrated microscopic traffic simulation framework is built in the VISSIM software. Vehicle movements and vehicle-infrastructure communications are simulated through the interface of the External Driver Model. A control algorithm is developed for recommending an optimal speed that is continuously updated in every time step for all vehicles approaching a signal-controlled point. This algorithm allows vehicles to pass a traffic signal without stopping or to minimise stopping times at a red phase. This study is performed with all connected vehicles at a 100% penetration rate. Conventional vehicles are also simulated in the same network as a reference. A straight road segment composed of two opposite directions with two traffic lights per lane is studied. The simulation is implemented under traffic volumes of 150 and 200 vehicles per hour to identify how different traffic densities influence the benefits of GLOSA. The results indicate that traffic flow is improved by the application of GLOSA. According to this study, vehicles passed through the traffic lights more smoothly, and waiting times were reduced by up to 28 seconds. Average delays decreased for the entire network by 86.46% and 83.84% under traffic densities of 150 vehicles per hour per lane and 200 vehicles per hour per lane, respectively.
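The core of a GLOSA advisory, recommending a speed that lets the vehicle reach the stop line during a green window, can be sketched in a few lines; the fixed-cycle signal description, speed bounds and example numbers are assumptions and not the control algorithm implemented in the VISSIM External Driver Model.

```python
# Sketch: recommend a speed so a vehicle reaches the stop line inside a green
# window of a fixed-cycle signal. The cycle description, the speed bounds and
# the example values are illustrative assumptions, not the paper's algorithm.
def glosa_advice(distance_m, t_in_cycle, green_start, green_end, cycle,
                 v_min=5.0, v_max=13.9):
    """Advisory speed in m/s (13.9 m/s is roughly 50 km/h)."""
    elapsed = t_in_cycle % cycle
    if green_start <= elapsed < green_end:           # the signal is currently green
        t_open, t_close = 0.0, green_end - elapsed
    else:                                            # wait for the next green window
        t_open = (green_start - elapsed) % cycle
        t_close = t_open + (green_end - green_start)

    if distance_m / v_max <= t_close:                # window reachable at a legal speed
        return min(max(distance_m / max(t_open, 1e-6), v_min), v_max)
    # Window not reachable: aim for the same window one cycle later (simplified).
    return min(max(distance_m / (t_open + cycle), v_min), v_max)

# 150 m from the line, 10 s into a 60 s cycle whose green runs from 30 s to 50 s:
print(round(glosa_advice(150, 10, 30, 50, 60), 1))   # 7.5 m/s, i.e. 27 km/h
```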

Keywords: Connected vehicles, GLOSA, intelligent transportation systems, infrastructure-to-vehicle communication.

1814 Modified Fuzzy PID Control for Networked Control Systems with Random Delays

Authors: Yong-can Cao, Wei-dong Zhang

Abstract:

To deal with random delays in Networked Control Systems (NCS), a modified fuzzy PID controller is introduced in this paper to implement real-time control adaptively. By adjusting the control signal dynamically, the system performance is improved. The design process and the resulting simulation results are presented, and examples with corresponding comparisons demonstrate the significance of the method.

Keywords: Fuzzy Control, Networked Control System, PID, Random Delays

1813 A System for Analyzing and Eliciting Public Grievances Using Cache Enabled Big Data

Authors: P. Kaladevi, N. Giridharan

Abstract:

The system for analyzing and eliciting public grievances serves its main purpose of receiving and processing all sorts of complaints from the public and responding to users. Because of the large number of complaints, the data become big data, which is difficult to store and process. The proposed system uses HDFS to store the big data and MapReduce to process it. The concept of a cache is applied in the system to provide immediate response and timely action using big data analytics; cache-enabled big data improves the response time of the system. The unstructured data provided by users are efficiently handled through the MapReduce algorithm. The processing of complaints takes place according to the hierarchy of the authority. The drawbacks of the traditional database system used in the existing system are overcome in our system by using a cache-enabled Hadoop Distributed File System. MapReduce framework code can leak sensitive data through the computation process, so we propose a system that adds noise to the output of the reduce phase to avoid signaling the presence of sensitive data. If a complaint is not processed within ample time, it is automatically forwarded to the higher authority, which ensures assured processing. A copy of the filed complaint is sent as a digitally signed PDF document to the user’s mail id, which serves as proof. The system report serves as essential data when making important decisions based on legislation.
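The idea of perturbing the reduce-phase output so that it does not signal the presence of sensitive records is sketched below as a toy map/reduce over complaint categories with Laplace noise added to the aggregated counts; the noise mechanism, its scale and the data are assumptions standing in for the system's actual MapReduce job.

```python
# Sketch: a toy map/reduce over complaint categories, with Laplace noise added
# to the reduced counts so that the output does not reveal individual sensitive
# records. The noise mechanism, its scale and the data are assumptions.
from collections import defaultdict
import numpy as np

def map_phase(complaints):
    for complaint in complaints:
        yield complaint["category"], 1                  # emit (key, 1) pairs

def reduce_phase(pairs, noise_scale=1.0, seed=0):
    rng = np.random.default_rng(seed)
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value                            # aggregate per category
    return {k: v + rng.laplace(0.0, noise_scale) for k, v in counts.items()}

complaints = [{"category": "water"}, {"category": "roads"}, {"category": "water"}]
print(reduce_phase(map_phase(complaints)))              # noisy per-category counts
```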

Keywords: Big Data, Hadoop, HDFS, Caching, MapReduce, web personalization, e-governance.

1812 Objects Extraction by Cooperating Optical Flow, Edge Detection and Region Growing Procedures

Authors: C. Lodato, S. Lopes

Abstract:

The image segmentation method described in this paper has been developed as a pre-processing stage to be used in methodologies and tools for video/image indexing and retrieval by content. This method solves the problem of extracting whole objects from the background, producing images of single complete objects from videos or photos. The extracted images are used for calculating the object visual features necessary for both the indexing and retrieval processes. The segmentation algorithm is based on the cooperation among an optical flow evaluation method, edge detection and region growing procedures. The optical flow estimator belongs to the class of differential methods. It permits the detection of motions ranging from a fraction of a pixel to a few pixels per frame, achieves good results in the presence of noise without the need for a filtering pre-processing stage, and includes a specialised model for moving object detection. The first task of the presented method exploits cues from motion analysis to detect moving areas. Objects and background are then refined using edge detection and seeded region growing procedures, respectively. All the tasks are performed iteratively until objects and background are completely resolved. The method has been applied to a variety of indoor and outdoor scenes where objects of different types and shapes are represented on variously textured backgrounds.
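A compressed view of the motion-analysis stage, estimating dense optical flow between two frames and thresholding its magnitude into a moving-area mask, is given below with OpenCV's Farnebäck estimator; the estimator choice, the parameters and the 1-pixel threshold are stand-ins for the differential method and its cooperation with edge detection and region growing described in the abstract.

```python
# Sketch: dense optical flow between two consecutive frames, thresholded into a
# moving-area mask that could seed the region-growing stage. The Farneback
# estimator, its parameters and the threshold are illustrative stand-ins.
import numpy as np
import cv2

prev = np.random.randint(0, 256, (240, 320), dtype=np.uint8)   # stand-in frame t
curr = np.roll(prev, 3, axis=1)                                # frame t+1 with simulated motion

flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
magnitude = np.linalg.norm(flow, axis=2)       # per-pixel motion in pixels per frame
moving_mask = magnitude > 1.0                  # candidate moving-object areas
print(moving_mask.mean())                      # fraction of the frame flagged as moving
```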

Keywords: Image Segmentation, Motion Detection, Object Extraction, Optical Flow

1811 Determination of Electromagnetic Properties of Human Tissues

Authors: Iliana Marinova, Valentin Mateev

Abstract:

In this paper a computer system for measuring electromagnetic properties is designed. The system employs an Agilent 4294A precision impedance analyzer to measure the amplitude and phase of a signal applied across a tested biological tissue sample. Data measured by the developed computer system can be used for tissue characterization over a wide frequency range from 40 Hz to 110 MHz. The computer system can interface with output devices, providing a flexible testing process.

Keywords: Electromagnetic properties, human tissue, bioimpedance, measurement system.

1810 An Experimentally Validated Thermo- Mechanical Finite Element Model for Friction Stir Welding in Carbon Steels

Authors: A. H. Kheireddine, A. A. Khalil, A. H. Ammouri, G. T. Kridli, R. F. Hamade

Abstract:

Solidification cracking and hydrogen cracking are defects generated in the fusion welding of ultrahigh carbon steels. Friction stir welding (FSW) of such steels, being a solid-state technique, has been demonstrated to alleviate the problems encountered in traditional welding. FSW involves different process parameters that must be carefully defined prior to processing. These parameters include, but are not restricted to, tool feed, tool RPM, tool geometry, and tool tilt angle. These parameters are a key factor in avoiding wormholes and voids behind the tool and in achieving a defect-free weld. More importantly, they directly affect the microstructure of the weld and hence its final mechanical properties. For that purpose, a 3D finite element (FE) thermo-mechanical model was developed using DEFORM 3D to simulate the FSW of carbon steel. At points of interest in the joint, the history of critical state variables such as temperature, stresses, and strain rates is tracked. Typical results include the ability to simulate the different weld zones. Simulation predictions were successfully compared with experimental FSW tests. It is believed that such a numerical model can be used to optimize FSW processing parameters in favor of a desirable defect-free weld with better mechanical properties.

Keywords: Carbon Steels, DEFORM 3D, FEM, Friction stir welding.

1809 Managers’ Capacity Building for Institutional Sustainability Performance

Authors: Analiza Acuña-Villacorte

Abstract:

The Institutional Sustainability Performance (ISP) of State Universities and Colleges (SUCs) in the Philippines reveals the level of compliance and fidelity of the latter to the mandates of the state. This performance evaluation procedure aims to perpetually monitor and sustain the quality of services provided by the state institutions in the country. Importantly, the SUC level rating is one of the key indicators of the merit system adopted by the state to give incentives to government institutions. Given the crucial role of the ISP and the SUC level in the performance of an institution and in sustaining quality assurance, this study theorized that the top managers’ capacity to influence is the critical factor in meeting the expectations of the state, and it therefore assessed the top managers’ capacity to influence. The hypothesis of this study proved that the leadership style of top managers has a significant relationship with the managers’ capacity to influence institutional sustainability performance. Thus, the subjects of this study were restricted to the State Universities and Colleges that qualified in the top 20 in Institutional Sustainability Performance, in digital governance performance, and in SUC leveling status nationwide. The top managers, and their subordinates holding doctorates, of Bulacan State University and Bataan Peninsula State University, whose programs have been consistently submitted for accreditation and were ranked Levels III and IV, participated in the study.

Keywords: Capacity to Influence, Descriptive Design, Institutional Sustainability Performance, Management.

1808 “Post-Industrial” Journalism as a Creative Industry

Authors: Lynette Sheridan Burns, Benjamin J. Matthews

Abstract:

The context of post-industrial journalism is one in which the material circumstances of mechanical publication have been displaced by digital technologies, increasing the distance between the orthodoxy of the newsroom and the culture of journalistic writing. Content is, with growing frequency, created for delivery via the internet, publication on web-based ‘platforms’ and consumption on screen media. In this environment, the question is not ‘who is a journalist?’ but ‘what is journalism?’ today. The changes bring into sharp relief new distinctions between journalistic work and journalistic labor, providing a key insight into the current transition between the industrial journalism of the 20th century, and the post-industrial journalism of the present. In the 20th century, the work of journalists and journalistic labor went hand-in-hand as most journalists were employees of news organizations, whilst in the 21st century evidence of a decoupling of ‘acts of journalism’ (work) and journalistic employment (labor) is beginning to appear. This 'decoupling' of the work and labor that underpins journalism practice is far reaching in its implications, not least for institutional structures. Under these conditions we are witnessing the emergence of expanded ‘entrepreneurial’ journalism, based on smaller, more independent and agile - if less stable - enterprise constructs that are a feature of creative industries. Entrepreneurial journalism is realized in a range of organizational forms from social enterprise, through to profit driven start-ups and hybrids of the two. In all instances, however, the primary motif of the organization is an ideological definition of journalism. An example is the Scoop Foundation for Public Interest Journalism in New Zealand, which owns and operates Scoop Publishing Limited, a not for profit company and social enterprise that publishes an independent news site that claims to have over 500,000 monthly users. Our paper demonstrates that this journalistic work meets the ideological definition of journalism; conducted within the creative industries using an innovative organizational structure that offers a new, viable post-industrial future for journalism.

Keywords: Creative industries, digital communication, journalism, post-industrial.

1807 Pulse Generator with Constant Pulse Width

Authors: Hanif Che Lah, Wee Leong Son, Rozita Borhan

Abstract:

This paper is about a method to produce a stable and accurate constant output pulse width regardless of variations in the amplitude, period and pulse width of the input signal source. The generated pulse is typically used in numerous applications as the reference input source for other circuits in the system. Therefore, it is crucial to produce a clean and constant pulse width to make sure the system works accurately as expected.

Keywords: Amplitude, Constant Pulse Width, Frequency Divider, Pulse Generator.
