Search results for: Successive Approximation Register Analog-to-Digital Converter
1176 Modeling and Visualizing Seismic Wave Propagation in Elastic Medium Using Multi-Dimension Wave Digital Filtering Approach
Authors: Jason Chien-Hsun Tseng, Nguyen Dong-Thai Dao, Chong-Ching Chang
Abstract:
A novel PDE solver using the multidimensional wave digital filtering (MDWDF) technique to obtain the solution of a 2D seismic wave system is presented. In essence, the continuous physical system, modeled by a linear Kirchhoff circuit, is transformed into an equivalent discrete dynamic system implemented by a MDWDF circuit. This amounts to numerically approximating the differential equations that describe the elements of a MD passive electronic circuit by grid-based difference equations implemented through the so-called state quantities within the passive MDWDF circuit, so that the digital model can track the wave field on a dense 3D grid of points. Details of how to transform the continuous system into the desired discrete passive system are addressed. In addition, initial and boundary conditions are properly embedded into the MDWDF circuit in terms of state quantities. Graphic results clearly demonstrate physical effects of seismic wave (P-wave and S-wave) propagation, including radiation, reflection, and refraction from and across hard boundaries. A comparison between the MDWDF technique and the finite difference time domain (FDTD) approach is also made in terms of computational efficiency.
Keywords: Seismic wave propagation, multi-dimension wave digital filters, partial differential equations.
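As a point of reference for the FDTD comparison mentioned in the abstract, the sketch below shows a minimal second-order finite-difference update for the 2D scalar wave equation. This is the baseline scheme, not the authors' MDWDF circuit, and the grid size, wave speed and time step are illustrative assumptions.

```python
import numpy as np

# Minimal 2D scalar-wave FDTD update (leapfrog, 2nd order in space and time).
# Illustrative stand-in for the FDTD baseline; NOT the MDWDF circuit itself.
nx, ny, nt = 200, 200, 500          # grid and step counts (assumed)
c, dx, dt = 3000.0, 10.0, 1e-3      # wave speed [m/s], spacing [m], step [s]
assert c * dt / dx <= 1 / np.sqrt(2), "CFL stability condition violated"

u_prev = np.zeros((nx, ny))         # wavefield at t - dt
u_curr = np.zeros((nx, ny))         # wavefield at t
u_curr[nx // 2, ny // 2] = 1.0      # point source as initial condition

r2 = (c * dt / dx) ** 2
for _ in range(nt):
    lap = (np.roll(u_curr, 1, 0) + np.roll(u_curr, -1, 0) +
           np.roll(u_curr, 1, 1) + np.roll(u_curr, -1, 1) - 4 * u_curr)
    u_next = 2 * u_curr - u_prev + r2 * lap
    # Hard (Dirichlet) boundaries produce the reflections noted in the paper.
    u_next[0, :] = u_next[-1, :] = u_next[:, 0] = u_next[:, -1] = 0.0
    u_prev, u_curr = u_curr, u_next
```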
1175 Secure Cryptographic Operations on SIM Card for Mobile Financial Services
Authors: Kerem Ok, Serafettin Senturk, Serdar Aktas, Cem Cevikbas
Abstract:
Mobile technology is very popular nowadays, providing a digital world in which users can experience many value-added services. Service providers are eager to offer diverse value-added services such as digital identity and mobile financial services. In this context, the security of data storage in smartphones and the security of communication between the smartphone and the service provider are critical for the success of these services. The SIM card is one suitable platform for providing the required security functions: since SIM cards include a Secure Element, they are able to store sensitive data, create cryptographically secure keys, and encrypt and decrypt data. In this paper, we design and implement a SIM and smartphone framework that uses the SIM card for secure key generation, key storage, data encryption, data decryption and digital signing for mobile financial services. Our framework shows that the SIM card can be used as a controlled Secure Element to provide the required security functions for popular e-services such as mobile financial services.
Keywords: SIM card, mobile financial services, cryptography, secure data storage.
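For illustration, the four cryptographic operations the framework delegates to the SIM card (key generation, digital signing, encryption, decryption) can be sketched host-side with Python's `cryptography` package. The algorithm choices (ECDSA P-256, AES-GCM) and payloads are assumptions; in the paper these operations run inside the SIM's Secure Element over APDUs, which is not modeled here.

```python
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

# Host-side sketch of the operations the paper runs on the SIM card; in the
# real framework the keys are generated and kept inside the Secure Element.
signing_key = ec.generate_private_key(ec.SECP256R1())      # key generation
signature = signing_key.sign(b"transfer:100EUR:acct42",    # digital signing
                             ec.ECDSA(hashes.SHA256()))

session_key = AESGCM.generate_key(bit_length=128)          # symmetric key
aesgcm, nonce = AESGCM(session_key), os.urandom(12)
ct = aesgcm.encrypt(nonce, b"PIN=1234", None)              # data encryption
pt = aesgcm.decrypt(nonce, ct, None)                       # data decryption
assert pt == b"PIN=1234"
```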
1174 Use of Smartphone in Practical Classes to Facilitate Teaching and Learning of Microscopic Analysis and Interpretation of Tissues Sections
Authors: Lise P. Labéjof, Krisnayne S. Ribeiro, Jackson A. Santos, Nicolle P. dos Santos
Abstract:
A previously unreported experiment on the use of the smartphone as a tool in practical histology classes is presented in this paper. The behavior and learning of university science students were analyzed and compared, along with the mode of teaching of this discipline and the students' appreciation of it, when either digital photographs taken by phone or drawings were used to record, analyze and interpret microscopic observations of histological sections of human or animal tissues.
Keywords: Cell phone, digital micrographs, learning of sciences, teaching practices.
1173 Robust Semi-Blind Digital Image Watermarking Technique in DT-CWT Domain
Authors: Samira Mabtoul, Elhassan Ibn Elhaj, Driss Aboutajdine
Abstract:
In this paper, a new robust digital image watermarking algorithm based on the complex wavelet transform is proposed. This technique embeds different parts of a watermark into different blocks of an image in the complex wavelet domain. To increase the security of the method, two chaotic maps are employed: one map determines the blocks of the host image used for watermark embedding, and the other encrypts the watermark image. Simulation results are presented to demonstrate the effectiveness of the proposed algorithm.
Keywords: Image watermarking, chaotic map, DT-CWT.
1172 Digital Transformation of Lean Production: Systematic Approach for the Determination of Digitally Pervasive Value Chains
Authors: Peter Burggräf, Matthias Dannapfel, Hanno Voet, Patrick-Benjamin Bök, Jérôme Uelpenich, Julian Hoppe
Abstract:
The increasing digitalization of value chains can help companies to handle rising complexity in their processes and thereby reduce the steadily increasing planning and control effort in order to raise performance limits. Due to technological advances, companies face the challenge of smart value chains for the purpose of improving productivity, handling increasing time and cost pressure, and meeting the need for individualized production. Therefore, companies need to ensure quick and flexible decisions to create self-optimizing processes and, consequently, to make their production more efficient. Lean production, as the most commonly used paradigm for complexity reduction, reaches its limits when it comes to variant-flexible production and constantly changing market and environmental conditions. To lift the performance limits inbuilt in current value chains, new methods and tools must be applied. Digitalization provides the potential to derive these new methods and tools. However, companies lack the experience to harmonize different digital technologies, and there is no practicable framework that guides the transformation of current value chains into digitally pervasive value chains. Current research shows that a connection between lean production and digitalization exists, based on factors such as people, technology and organization. In this paper, the introduced method for the determination of digitally pervasive value chains takes the factors people, technology and organization into account and extends existing approaches by a new dimension. It is the first systematic approach for the digital transformation of lean production and consists of four steps: The first step, 'target definition', describes the target situation and defines the depth of the analysis with regard to the inspection area and the level of detail. The second step, 'analysis of the value chain', verifies the lean-ability of processes and places a special focus on the integration capacity of digital technologies in order to raise the limits of lean production. Furthermore, the 'digital evaluation process' ensures the usefulness of digital adaptions regarding their practicability and their integrability into the existing production system. Finally, the method defines actions to be performed based on the evaluation process and in accordance with the target situation. The validation and optimization of the proposed method in a German company from the electronics industry shows that the digital transformation of current value chains based on lean production raises their inbuilt performance limits.
Keywords: Digitalization, digital transformation, lean production, Industrie 4.0, value chain.
1171 Education in Technology for Sustainable Development Applied to School Gardens
Authors: Sara Blanc, José V. Benlloch-Dualde, Laura Grindei, Ana C. Torres, Angélica Monteiro
Abstract:
This paper presents a study of an experience introducing digital learning in a case study focused on primary and secondary school garden-based education. The approach represents an example of interaction among education and research agents in different countries and at different levels, such as universities, public and private research institutions, and schools, working together to implement education for sustainable development that makes students more sensitive to the natural environment, more responsible in their consumption, more aware of waste reduction and recycling, more conscious of the sustainable use of natural resources and, at the same time, more 'digitally competent'. The experience was designed according to the European digital education context and OECD (Organisation for Economic Co-operation and Development) directives on transversal skills education. The paper presents the methodology of the study as well as the outcomes obtained from the experience.
Keywords: School gardens, primary education, secondary education, science technology and innovation in education, digital learning, sustainable development goals, university, knowledge transference.
1170 Bridging Quantitative and Qualitative of Glaucoma Detection
Authors: Noor Elaiza Abdul Khalid, Noorhayati Mohamed Noor, Zamalia Mahmud, Saadiah Yahya, and Norharyati Md Ariff
Abstract:
Glaucoma diagnosis involves extracting three features of the fundus image: the optic cup, the optic disc and the vasculature. Manual diagnosis is expensive, tedious and time consuming, and a number of studies have been conducted to automate the process. However, the variability between the diagnostic capability of an automated system and that of an ophthalmologist has yet to be established. This paper discusses the efficiency of, and variability between, ophthalmologist opinion and a digital thresholding technique. The efficiency and variability measures are based on image quality grading: poor, satisfactory or good. The images are separated into four channels: gray, red, green and blue. Three ophthalmologists graded the images based on image quality; the images were then thresholded using multithresholding and graded in the same way, and the two sets of grades were compared. The results show a small variability between the grades of the ophthalmologists and those of the digital thresholding.
Keywords: Digital fundus image, glaucoma detection, multithresholding, segmentation.
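A minimal sketch of the channel-wise multithresholding step, using scikit-image's multi-Otsu implementation; the three-class setup and the choice of multi-Otsu are assumptions standing in for the paper's unspecified thresholding rule.

```python
import numpy as np
from skimage.filters import threshold_multiotsu

def grade_channels(rgb):
    """Multithreshold each channel of a fundus image (gray, red, green,
    blue), as in the comparison against ophthalmologist grading.
    The class count is illustrative."""
    gray = rgb.mean(axis=2).astype(np.uint8)
    channels = {"gray": gray, "red": rgb[..., 0],
                "green": rgb[..., 1], "blue": rgb[..., 2]}
    graded = {}
    for name, ch in channels.items():
        thresholds = threshold_multiotsu(ch, classes=3)   # multithresholding
        graded[name] = np.digitize(ch, bins=thresholds)   # 0..2 region labels
    return graded
```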
1169 An Evaluation on Fixed Wing and Multi-Rotor UAV Images Using Photogrammetric Image Processing
Authors: Khairul Nizam Tahar, Anuar Ahmad
Abstract:
This paper introduces slope photogrammetric mapping using unmanned aerial vehicles. Two UAVs were used in this study: a fixed wing and a multi-rotor. Both UAVs were used to capture images of the study area, with a consumer digital camera mounted vertically at the bottom of each UAV capturing images at altitude. The objectives of this study are to obtain three-dimensional coordinates of the slope area and to determine the accuracy of the photogrammetric products produced by both UAVs. Several control points and checkpoints were established in the study area using Real Time Kinematic Global Positioning System (RTK-GPS). All acquired images from both UAVs went through the full photogrammetric workflow, including interior orientation, exterior orientation, aerial triangulation and bundle adjustment, using photogrammetric software. Two primary products were produced in this study: a digital elevation model and a digital orthophoto. Based on the results, the UAV system can be used to map slope areas, especially for projects with limited budgets and time constraints.
Keywords: Slope mapping, 3D, DEM, UAV, photogrammetry, image processing.
1168 MIMO-OFDM Coded for Digital Terrestrial Television Broadcasting Systems
Authors: El Miloud A.R. Reyouchi, Kamal Ghoumid, Koutaiba Amezian, Otman Mrabet
Abstract:
This paper proposes and analyses a wireless telecommunication system with multiple antennas at emission and reception (multiple input multiple output, MIMO) with space diversity in an OFDM context. In particular, it analyses the performance of a DTT (Digital Terrestrial Television) broadcasting system that includes MIMO-OFDM techniques. Different propagation channel models and configurations are considered for each diversity scheme. This study has been carried out in the context of the development of the next generation DVB-T/H and WRAN.
Keywords: MIMO, MISO, OFDM, DVB-T/H/T2, WRAN.
1167 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)
Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton
Abstract:
Cold-start is a notoriously difficult problem in recommendation systems, arising when there is insufficient information to draw inferences about users or items. To address this challenge, a contextual bandit algorithm, the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST), is proposed. It is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), whose computational cost grows more slowly with data size but which requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, FAB-COST is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented, and it is systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
Keywords: Cold-start, expectation propagation, multi-armed bandits, Thompson sampling, variational inference.
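The ADF ingredient can be sketched compactly. Below is a minimal numpy implementation of one assumed-density-filtering update for a Bayesian logistic model, using Gauss-Hermite moment matching; the multi-pass EP stage and the exact switching rule are only summarized in the closing comment, and the quadrature order is an assumption.

```python
import numpy as np

def adf_update(m, S, x, y, n_quad=32):
    """One ADF step for Bayesian logistic regression: after observing
    (x, y in {-1,+1}), project the posterior back to a Gaussian by matching
    moments of the 1-D activation a = w.x via Gauss-Hermite quadrature."""
    Sx = S @ x
    mu, var = x @ m, x @ Sx                     # prior moments of a
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_quad)
    a = mu + np.sqrt(var) * nodes               # quadrature points for N(mu, var)
    lik = 1.0 / (1.0 + np.exp(-y * a))          # logistic likelihood at nodes
    Z = weights @ lik                           # normalizer (constant cancels)
    mu_new = (weights @ (a * lik)) / Z          # matched posterior moments of a
    var_new = (weights @ (a ** 2 * lik)) / Z - mu_new ** 2
    # Map the 1-D moment change back to the weight posterior (rank-1 update).
    m_new = m + Sx * (mu_new - mu) / var
    S_new = S + np.outer(Sx, Sx) * (var_new - var) / var ** 2
    return m_new, S_new

# FAB-COST idea (sketch): refine with multi-pass EP while data are scarce,
# then switch to single-pass ADF updates like the one above once the stream
# grows, trading a little accuracy for near-constant per-impression cost.
```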
1166 Data Embedding Based on Better Use of Bits in Image Pixels
Authors: Rehab H. Alwan, Fadhil J. Kadhim, Ahmad T. Al-Taani
Abstract:
In this study, a novel approach to image embedding is introduced. The proposed method consists of three main steps. First, the edges of the image are detected using Sobel mask filters. Second, the least significant bit (LSB) of each pixel is used. Finally, gray-level connectivity is applied using a fuzzy approach, and the ASCII code is used for information hiding. The bit prior to the LSB represents the edge image after gray-level connectivity, and the remaining six bits represent the original image with very little difference in contrast. The proposed method embeds three images in one image and includes, as a special case of data embedding, the hiding, identification and authentication of text embedded within digital images. Image embedding is also an effective compression method in terms of reserving memory space, and information hiding within a digital image can be used for secure information transfer. The creation and extraction of the three embedded images, and the hiding of text information, are discussed and illustrated in the following sections.
Keywords: Image embedding, edge detection, gray level connectivity, information hiding, digital image compression.
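A minimal sketch of the Sobel-plus-LSB step only; the fuzzy gray-level connectivity and ASCII text stages of the paper are omitted, and the edge threshold is an assumption.

```python
import numpy as np
from scipy import ndimage

def embed_edge_in_lsb(img):
    """Detect edges with Sobel masks and store the binary edge map in each
    pixel's least significant bit; the higher bits keep the cover image."""
    img = img.astype(np.uint8)
    gx = ndimage.sobel(img.astype(float), axis=0)
    gy = ndimage.sobel(img.astype(float), axis=1)
    edge_bit = (np.hypot(gx, gy) > 128).astype(np.uint8)   # threshold assumed
    return (img & 0xFE) | edge_bit

def extract_edge_map(stego):
    """Recover the embedded edge image from the LSB plane."""
    return stego & 0x01
```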
1165 A Blind Digital Watermark in Hadamard Domain
Authors: Saeid Saryazdi, Hossein Nezamabadi-pour
Abstract:
A new blind gray-level watermarking scheme is described. In the proposed method, the host image is first divided into 4×4 non-overlapping blocks. For each block, the first two AC coefficients of its Hadamard transform are estimated using the DC coefficients of its neighbor blocks, and a gray-level watermark is added into the estimated values. Since embedding the watermark does not change the DC coefficients, watermark extraction can be done by estimating the AC coefficients and comparing them with their actual values. Several experiments were made, and the results suggest the robustness of the proposed algorithm.
Keywords: Digital watermarking, image watermarking, information hiding, steganography.
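A hedged sketch of the blind Hadamard rule described above: transform 4×4 blocks, estimate an AC coefficient from neighbor DC coefficients, and add the gray-level watermark to the estimate. The specific neighbor-DC estimator and the strength `alpha` are assumptions, and interior blocks (with both horizontal neighbors) are assumed.

```python
import numpy as np
from scipy.linalg import hadamard

H4 = hadamard(4) / 2.0                       # orthonormal 4x4 Hadamard matrix

def block_dc(img, bi, bj):
    """DC Hadamard coefficient of the 4x4 block at block index (bi, bj)."""
    blk = img[4*bi:4*bi+4, 4*bj:4*bj+4].astype(float)
    return (H4 @ blk @ H4)[0, 0]

def embed(img, bi, bj, w, alpha=4.0):
    """Estimate the first AC coefficient from the horizontal neighbors' DCs
    (estimator assumed), then add the watermark value w at that position.
    The DC coefficient is untouched, which is what makes extraction blind."""
    est_ac = (block_dc(img, bi, bj - 1) - block_dc(img, bi, bj + 1)) / 2.0
    C = H4 @ img[4*bi:4*bi+4, 4*bj:4*bj+4].astype(float) @ H4
    C[0, 1] = est_ac + alpha * w
    return H4 @ C @ H4                       # inverse transform of the block

def extract(marked, bi, bj, alpha=4.0):
    """Recover w without the original image: re-estimate and compare."""
    est_ac = (block_dc(marked, bi, bj - 1) - block_dc(marked, bi, bj + 1)) / 2.0
    C = H4 @ marked[4*bi:4*bi+4, 4*bj:4*bj+4].astype(float) @ H4
    return (C[0, 1] - est_ac) / alpha
```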
1164 Image Adaptive Watermarking with Visual Model in Orthogonal Polynomials based Transformation Domain
Authors: Krishnamoorthi R., Sheba Kezia Malarchelvi P. D.
Abstract:
In this paper, an image-adaptive, invisible digital watermarking algorithm with Orthogonal Polynomials based Transformation (OPT) is proposed for copyright protection of digital images. The proposed algorithm utilizes a visual model to determine the watermarking strength necessary to invisibly embed the watermark in the mid-frequency AC coefficients of the cover image, chosen with a secret key. The visual model is designed to generate a Just Noticeable Distortion (JND) mask by analyzing low-level image characteristics such as textures, edges and luminance of the cover image in the orthogonal polynomials based transformation domain. Since the secret key is required for both embedding and extraction of the watermark, it is not possible for an unauthorized user to extract the embedded watermark. The proposed scheme is robust to common image processing distortions such as filtering, JPEG compression and additive noise. Experimental results show that the quality of OPT domain watermarked images is better than that of their DCT counterparts.
Keywords: Orthogonal Polynomials based Transformation, digital watermarking, copyright protection, visual model.
1163 Application of Digital Image Correlation Technique on Vacuum Assisted Resin Transfer Molding Process and Performance Evaluation of the Produced Materials
Authors: Dingding Chen, Kazuo Arakawa, Masakazu Uchino, Changheng Xu
Abstract:
Vacuum assisted resin transfer moulding (VARTM) is a promising manufacturing process for making large and complex fiber reinforced composite structures. However, the complexity of the resin flow in the infusion stage usually leads to a non-uniform property distribution in the produced composite part. In order to control the flow of the resin, the flow situation must be understood, and for the safe use of the produced composite in practice, an understanding of the property distribution is essential. In this paper, we report trials on monitoring the resin infusion stage and on evaluating the fiber volume fraction distribution of the VARTM-produced composite using digital image correlation methods. The results show that 3D-DIC is valid for monitoring the resin infusion stage and that it is possible to use 2D-DIC to estimate the distribution of the fiber volume fraction on an FRP plate.
Keywords: Digital image correlation, VARTM, FRP, fiber volume fraction.
1162 Variational EM Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
In this paper, we propose a variational EM inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. The algorithm derives simultaneously both the posterior distribution of a latent function and estimators of the hyper-parameters in a multiclass Gaussian process classification model. It is based on the Laplace approximation (LA) technique and the variational EM framework, and is performed in two steps, called the expectation and maximization steps. First, in the expectation step, using the Bayesian formula and the LA technique, we derive approximately the posterior distribution of the latent function indicating the possibility that each observation belongs to a certain class. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimator for the hyper-parameters of the covariance matrix necessary to define the prior distribution of the latent function. These two steps repeat iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to a human action classification problem using a public database, namely, the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.
Keywords: Bayesian rule, Gaussian process classification model with multiclass, Gaussian process prior, human action classification, Laplace approximation, variational EM algorithm.
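A binary simplification of the E-step and M-step can be sketched as follows; it follows the standard Laplace recipe for GP classification (Rasmussen and Williams, Alg. 3.1) rather than the authors' multiclass derivation, and the grid M-step stands in for gradient-based hyperparameter updates.

```python
import numpy as np

def laplace_mode(K, y, iters=20):
    """E-step (binary simplification): Newton iterations to the Laplace mode
    of the latent posterior under a logistic likelihood, with y in {-1,+1}."""
    n = len(y)
    t = (y + 1) / 2.0                              # labels remapped to {0, 1}
    f = np.zeros(n)
    for _ in range(iters):
        pi = 1.0 / (1.0 + np.exp(-f))              # class-1 probability
        W = pi * (1.0 - pi)                        # likelihood curvature
        B = np.eye(n) + np.sqrt(W)[:, None] * K * np.sqrt(W)[None, :]
        b = W * f + (t - pi)
        a = b - np.sqrt(W) * np.linalg.solve(B, np.sqrt(W) * (K @ b))
        f = K @ a                                  # new mode estimate
    logZ = (-0.5 * a @ f + np.log(1.0 / (1.0 + np.exp(-y * f))).sum()
            - 0.5 * np.linalg.slogdet(B)[1])       # Laplace evidence
    return f, logZ

def m_step(X, y, lengthscales=(0.3, 1.0, 3.0)):
    """M-step sketch: choose the RBF lengthscale maximizing the Laplace
    evidence (a grid search stands in for gradient-based updates)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return max(lengthscales,
               key=lambda l: laplace_mode(np.exp(-d2 / (2 * l * l)), y)[1])
```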
1161 A Digital Twin Approach for Sustainable Territories Planning: A Case Study on District Heating
Authors: A. Amrani, O. Allali, A. Ben Hamida, F. Defrance, S. Morland, E. Pineau, T. Lacroix
Abstract:
The energy planning process is a very complex task that involves several stakeholders and requires the consideration of several local and global factors and constraints. In order to optimize and simplify this process, we propose a tool-based iterative approach applied to district heating planning. We built the tool in collaboration with a French territory, using actual district data and implementing the European incentives. We set up an iterative process including data visualization and analysis, identification and extraction of information related to the area concerned by the operation, design of sustainable planning scenarios leveraging local renewable and recoverable energy sources, and, finally, evaluation of the scenarios. The last step is performed by a dynamic digital twin replica of the city. The territory's energy experts confirm that the tool provides them with valuable support towards sustainable energy planning.
Keywords: Climate change, data management, decision support, digital twin, district heating, energy planning, renewables, smart city.
1160 Self Watermarking based on Visual Cryptography
Authors: Mahmoud A. Hassan, Mohammed A. Khalili
Abstract:
We propose a simple watermarking method based on visual cryptography. The method is based on the selection of specific pixels from the original image, instead of the random selection of pixels as in Hwang's method [1]. Verification information is generated and later used to verify the ownership of the image, without the need to embed the watermark pattern into the original digital data. Experimental results show that the proposed method can recover the watermark pattern from the marked data even if some changes are made to the original digital data.
Keywords: Watermarking, visual cryptography, visual threshold.
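A sketch of the central idea that verification information, not an embedded mark, carries the watermark. This is a (2,2) secret-sharing simplification of the visual-cryptography construction, and the evenly spaced pixel-selection grid is an assumption standing in for the paper's specific-pixel rule.

```python
import numpy as np

def select_pixels(image, h, w):
    """Evenly spaced 'specific pixel' grid; a stand-in assumption for the
    paper's deterministic selection rule."""
    ys = (np.arange(h) * image.shape[0] // h)[:, None].repeat(w, axis=1)
    xs = (np.arange(w) * image.shape[1] // w)[None, :].repeat(h, axis=0)
    return ys, xs

def make_verification_share(image, watermark):
    """Build verification information so that nothing is embedded in the
    image: master share XOR verification share recovers the watermark."""
    ys, xs = select_pixels(image, *watermark.shape)
    master = (image[ys, xs] > np.median(image)).astype(np.uint8)
    return master ^ watermark.astype(np.uint8)

def verify(image, share, watermark):
    """Recompute the master share; agreement drops if the image was altered."""
    ys, xs = select_pixels(image, *watermark.shape)
    master = (image[ys, xs] > np.median(image)).astype(np.uint8)
    return float(np.mean((master ^ share) == watermark))
```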
1159 Analysis of Poverty Reduction Strategies as Mechanism for Development in Nigeria from 1999-2014
Authors: Ahmed Usman Egye, Hamza Muhammad
Abstract:
Poverty alleviation is one of the most difficult challenges facing third world countries in their development efforts. Evidence in Nigeria shows that the number of those in poverty has continued to increase. This paper analyzes the performance of poverty alleviation measures undertaken by successive administrations in Nigeria with a view to addressing the problem. The study identifies the whole gamut of factors that served as stumbling blocks to the implementation of each of the strategies, and recommends the involvement of local people in the identification and design of projects so that sufficient participation can be achieved.
Keywords: Poverty, development, strategies, Nigeria.
1158 Enhanced Data Access Control of Cooperative Environment used for DMU Based Design
Authors: Wei Lifan, Zhang Huaiyu, Yang Yunbin, Li Jia
Abstract:
Analysis of the digital design process based on the digital mockup (DMU) indicates that a distributed cooperative supporting environment is a foundational condition for adopting a DMU-based design approach. Data access authorization is the first concern because of the value and sensitivity of the engineering data for the enterprise. Access control for administrators is often rather weak compared to that for business users, so the authors established an enhanced system to prevent administrators from accessing the engineering data through potential approaches and without authorization. Thus the data security is improved.
Keywords: Access control, DMU, PLM, virtual prototype.
1157 Improving Digital Image Edge Detection by Fuzzy Systems
Authors: Moslem Begol, Keivan Maghooli
Abstract:
Image edge detection is one of the most important parts of image processing. In this paper, a fuzzy technique is used to improve digital image edge detection. In this method, a 3×3 mask is employed to process each pixel by means of its vicinity. Each pixel is considered a fuzzy input and, by examining fuzzy rules in its vicinity, edge pixels are identified; by utilizing calculation algorithms in image processing, edges are then displayed more clearly. This method shows significant improvement compared to established edge detection methods (e.g. Sobel, Canny).
Keywords: Fuzzy systems, edge detection, fuzzy edge detection.
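A minimal sketch of the 3×3-vicinity fuzzy idea: fuzzify each pixel's maximum neighbor difference with a ramp membership function, mimicking a rule like "IF a neighbor differs strongly THEN edge". The membership breakpoints are assumptions.

```python
import numpy as np

def fuzzy_edges(img, lo=10.0, hi=60.0):
    """Fuzzy edge sketch: each pixel's max absolute difference to its 8
    neighbors is fuzzified by a ramp membership ('edge-ness') and then
    defuzzified to an 8-bit edge map. lo/hi are illustrative breakpoints."""
    img = img.astype(float)
    pad = np.pad(img, 1, mode="edge")
    diff = np.zeros_like(img)
    for dy in (-1, 0, 1):                    # scan the 3x3 vicinity
        for dx in (-1, 0, 1):
            if dy or dx:
                shifted = pad[1+dy:1+dy+img.shape[0], 1+dx:1+dx+img.shape[1]]
                diff = np.maximum(diff, np.abs(img - shifted))
    membership = np.clip((diff - lo) / (hi - lo), 0.0, 1.0)  # fuzzy degree
    return (membership * 255).astype(np.uint8)               # defuzzified map
```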
1156 Hybrid Approach for Country’s Performance Evaluation
Authors: C. Slim
Abstract:
This paper presents an integrated model, hybridizing data envelopment analysis (DEA) and support vector machines (SVM), to classify countries according to their efficiency and performance. The model takes into account multi-dimensional indicators, the decision-making hierarchy and the relativity of measurement. Starting from a set of performance indicators as exhaustive as possible, a process of successive aggregations is developed to attain an overall evaluation of a country's competitiveness.
Keywords: Artificial neural networks, support vector machine, data envelopment analysis, aggregations, indicators of performance.
1155 A Process of Forming a Single Competitive Factor in the Digital Camera Industry
Authors: Kiyohiro Yamazaki
Abstract:
This paper considers the process by which a single competitive factor forms in the digital camera industry, from the viewpoint of the product platform. To make product development easier and to increase product introduction ratios, companies concentrate development efforts on improving and strengthening certain product attributes, and in this process the product platform is formed continuously. The formation of a product platform raises the product development efficiency of individual companies, but it involves a trade-off: it tends to unify competitive factors across the whole industry. This research analyzes product specification data collected from the web pages of digital camera companies. Specifically, all product specifications released in Japan from 1995 to 2003 were collected; the composition of image sensors and optical lenses was analyzed; and product platforms shared by multiple products were identified and their application discussed. As a result, this research found that product platforms arose in the development of standard products for the major market segments. Every major company built product platforms of image sensors and optical lenses and, as a result, competitive factors became unified across the entire industry through platform formation. In other words, platform formation brought product development efficiency to individual firms, but it also caused the industry's competitive factors to become unified.
Keywords: Digital camera industry, product evolution trajectory, product platform, unification of competitive factors.
1154 An Efficient Clustering Technique for Copy-Paste Attack Detection
Authors: N. Chaitawittanun, M. Munlin
Abstract:
Due to the rapid advancement of powerful image processing software, digital images are easy for ordinary people to manipulate and modify. Many digital images are edited for a specific purpose and are difficult to distinguish from their originals. We propose a clustering method to detect copy-move image forgery in JPEG, BMP, TIFF, and PNG images. The process starts with reducing the color depth of the photos. Then, the clustering technique is used to partition the measured data using the Hausdorff distance. The results show that the proposed method is capable of inspecting an image file and correctly identifying the forgery.
Keywords: Image detection, forgery image, copy-paste.
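A simplified sketch of the pipeline: color reduction followed by grouping of duplicated blocks. Exact matching of quantized blocks stands in here for the paper's Hausdorff-distance clustering, and the block size, stride and quantization levels are assumptions.

```python
import numpy as np
from collections import defaultdict

def copy_move_candidates(gray, block=8, levels=16):
    """Sketch: reduce color depth (as in the paper's first step), group
    identical quantized blocks, and rank the translation offsets shared by
    the most duplicate pairs; those offsets are copy-move candidates."""
    q = (gray // (256 // levels)).astype(np.uint8)      # color reduction
    buckets = defaultdict(list)
    for i in range(0, q.shape[0] - block, 4):           # stride 4 for speed
        for j in range(0, q.shape[1] - block, 4):
            buckets[q[i:i+block, j:j+block].tobytes()].append((i, j))
    offsets = defaultdict(int)
    for locs in buckets.values():
        for (i1, j1), (i2, j2) in zip(locs, locs[1:]):  # pair up duplicates
            if abs(i1 - i2) + abs(j1 - j2) > block:     # skip near-overlaps
                offsets[(i2 - i1, j2 - j1)] += 1
    return sorted(offsets.items(), key=lambda kv: -kv[1])[:5]
```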
1153 Influence of the Paint Coating Thickness in Digital Image Correlation Experiments
Authors: Jesús A. Pérez, Sam Coppieters, Dimitri Debruyne
Abstract:
In the past decade, the use of digital image correlation (DIC) techniques has increased significantly in the area of experimental mechanics, especially for characterizing material behavior. This non-contact tool enables full-field displacement and strain measurements over a complete region of interest. The DIC algorithm requires a random contrast pattern on the surface of the specimen in order to perform properly. To create this pattern, the specimen is usually first coated with a white matt paint; next, a black random speckle pattern is applied using any suitable method. If the applied paint coating is too thick, its top surface may not be able to exactly follow the deformation of the specimen, and consequently the strain measurement might be underestimated. In the present article, the influence of the paint thickness on this strain underestimation is studied for different strain levels, and the results are compared to typical paint coating thicknesses applied by experienced DIC users. A slight strain underestimation was observed for paint coatings thicker than about 30 μm. On the other hand, this value was found to be uncommonly high compared to the coating thicknesses actually applied by DIC users.
Keywords: Digital image correlation, paint coating thickness, strain.
1152 Limits of Phase Modulated Frequency Shifted Holographic Vibrometry at Low Amplitudes of Vibrations
Authors: Pavel Psota, Vít Lédl, Jan Václavík, Roman Doleček, Pavel Mokrý, Petr Vojtíšek
Abstract:
This paper presents advanced time-average digital holography by means of frequency shifting and phase modulation. This technique can measure vibration amplitudes over an extreme dynamic range, while the amplitude distribution is evaluated independently in every pixel. The main focus of the paper is to gain insight into the behavior of the method at low amplitudes of vibration. To this end, a set of experiments was performed. Results of the experiments, together with novel noise suppression, show the limit of the method to be below 0.1 nm.
Keywords: Acousto-optical modulator, digital holography, low amplitudes, vibrometry.
1151 Arriving at an Optimum Value of Tolerance Factor for Compressing Medical Images
Authors: Sumathi Poobal, G. Ravindran
Abstract:
Medical imaging takes advantage of digital technology in imaging and teleradiology. In teleradiology systems, large amounts of data are acquired, stored and transmitted, and a major technology that may help to solve the problems associated with massive data storage and data transfer capacity is data compression and decompression. The many available image compression methods are classified as lossless or lossy; in lossy compression, the decompressed image contains some distortion. Fractal image compression (FIC) is a lossy compression method in which an image is coded as a set of contractive transformations in a complete metric space; this set of contractive transformations is guaranteed to produce an approximation to the original image. In this paper, FIC is achieved by partitioned iterated function systems (PIFS) using quadtree partitioning. PIFS is applied to different modalities of images: ultrasound, CT scan, angiogram, X-ray and mammogram. In each modality, approximately twenty images are considered and the average values of compression ratio and PSNR are reported. In this method of fractal encoding, the tolerance factor Tmax is varied from 1 to 10, keeping the other standard parameters constant. For all modalities, the compression ratio and Peak Signal to Noise Ratio (PSNR) are computed and studied, with the quality of the decompressed image assessed by the PSNR values. The results show that the compression ratio increases with the tolerance factor, and mammograms have the highest compression ratio. Because of the properties of fractal compression, the image quality is not degraded up to an optimum tolerance factor of Tmax = 8.
Keywords: Fractal image compression, IFS, PIFS, PSNR, quadtree partitioning.
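How the tolerance factor Tmax drives quadtree partitioning can be sketched as follows: a range block is kept as a leaf if some decimated domain block matches it within Tmax under the best affine (contrast/brightness) map, otherwise it is split into four. The domain pool, the search stride and power-of-two block sizes are simplified assumptions.

```python
import numpy as np

def best_affine_error(rng_blk, dom_blk):
    """Least-squares contrast s and brightness o so that s*dom + o ~ range;
    returns the RMS residual of that affine match."""
    d, r = dom_blk.ravel(), rng_blk.ravel()
    s, o = np.polyfit(d, r, 1) if d.std() > 0 else (0.0, r.mean())
    return np.sqrt(np.mean((s * d + o - r) ** 2))

def quadtree_encode(img, x, y, size, t_max, min_size=4, out=None):
    """PIFS quadtree sketch: accept the block if some 2:1-decimated domain
    block matches within tolerance t_max, else split into four quadrants."""
    out = [] if out is None else out
    rng_blk = img[y:y+size, x:x+size].astype(float)
    step = 2 * size                            # domain blocks are twice as big
    errs = []
    for dy in range(0, img.shape[0] - step + 1, step):
        for dx in range(0, img.shape[1] - step + 1, step):
            dom = img[dy:dy+step, dx:dx+step].astype(float)
            dom = dom.reshape(size, 2, size, 2).mean(axis=(1, 3))  # decimate
            errs.append((best_affine_error(rng_blk, dom), dy, dx))
    err, dy, dx = min(errs)
    if err <= t_max or size <= min_size:
        out.append((x, y, size, dy, dx))       # store the mapping (leaf)
    else:
        h = size // 2                          # tolerance exceeded: split
        for ox, oy in ((0, 0), (h, 0), (0, h), (h, h)):
            quadtree_encode(img, x + ox, y + oy, h, t_max, min_size, out)
    return out
```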
1150 Digital Preservation in Nigeria Universities Libraries: A Comparison between University of Nigeria Nsukka and Ahmadu Bello University Zaria
Authors: Suleiman Musa, Shuaibu Sidi Safiyanu
Abstract:
This study examined digital preservation in Nigerian university libraries, comparing the University of Nigeria, Nsukka (UNN) and Ahmadu Bello University, Zaria (ABU, Zaria). The study utilized primary data obtained from librarians of the two selected institutions. Findings revealed varying results in terms of the skills acquired by librarians before and after digitization at the two institutions. The study reports that journal publications, textbooks, CD-ROMs, conference papers and proceedings, theses, dissertations and seminar papers are among the information resources available for digitization. The study further documents that copyright issues, power failure, and the unavailability of needed materials are among the challenges facing the digitization of the institutions' libraries. On the basis of the findings, the study concluded that digitization enhances efficiency in the organization and retrieval of information services. The study therefore recommended that software be upgraded with backups, that librarians be trained in the digital process, that antivirus software be installed, and that technical collaboration between the library and MIS be enhanced.
Keywords: Digitization, preservation, libraries, comparison.
1149 Digital Automatic Gain Control Integrated on WLAN Platform
Authors: Emilija Miletic, Milos Krstic, Maxim Piz, Michael Methfessel
Abstract:
In this work we present a solution for DAGC (Digital Automatic Gain Control) in WLAN receivers compatible with the IEEE 802.11a/g standards. These standards define communication in the 5/2.4 GHz bands using the Orthogonal Frequency Division Multiplexing (OFDM) modulation scheme. The WLAN transceiver that we used enables gain control over a Low Noise Amplifier (LNA) and a Variable Gain Amplifier (VGA); the control of these signals is performed in our digital baseband processor by a dedicated hardware block, the DAGC. The DAGC automatically controls the VGA and LNA in order to achieve a better signal-to-noise ratio, decrease the FER (Frame Error Rate), and hold the average power of the baseband signal close to a desired set point. The DAGC function in the baseband processor is done in a few steps: measuring the power levels of baseband samples of the RF signal, accumulating the differences between the measured power level and the set point, adjusting a gain factor by the accumulation, and applying the adjusted gain factor to the baseband values. Based on measurements of the RSSI signal's dependence on input power, we concluded that this digital AGC can be implemented by applying a simple linearization of the RSSI. This solution is very simple but effective, and it reduces the complexity and power consumption of the DAGC. The DAGC was implemented and tested both in FPGA and in ASIC as part of our WLAN baseband processor. Finally, we integrated the circuit into a compact WLAN PCMCIA board based on MAC and baseband ASIC chips that we designed.
Keywords: WLAN, AGC, RSSI, baseband processor.
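The enumerated DAGC steps map directly onto a short control loop; the set point, loop gain and frame length below are illustrative assumptions, and the RSSI linearization is not modeled.

```python
import numpy as np

def dagc(samples, set_point_db=-12.0, loop_gain=0.05, frame=64):
    """Sketch of the DAGC steps on complex baseband samples:
    (1) measure frame power, (2) accumulate the difference to the set point,
    (3) scale it by a loop gain, (4) apply the resulting gain factor.
    Set point, loop gain and frame length are illustrative assumptions."""
    gain_db, out = 0.0, []
    for k in range(0, len(samples) - frame + 1, frame):
        blk = samples[k:k+frame] * 10 ** (gain_db / 20)       # apply gain
        power_db = 10 * np.log10(np.mean(np.abs(blk) ** 2))   # measure
        gain_db += loop_gain * (set_point_db - power_db)      # accumulate+scale
        out.append(blk)
    return np.concatenate(out), gain_db
```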
1148 Upgraded Rough Clustering and Outlier Detection Method on Yeast Dataset by Entropy Rough K-Means Method
Authors: P. Ashok, G. M. Kadhar Nawaz
Abstract:
Rough set theory handles uncertainty and incomplete information by applying two approximation sets, the lower approximation and the upper approximation. In this paper, rough clustering algorithms are improved by adopting similarity, dissimilarity-similarity and entropy based initial centroid selection methods in three clustering algorithms, namely Entropy based Rough K-Means (ERKM), Similarity based Rough K-Means (SRKM) and Dissimilarity-Similarity based Rough K-Means (DSRKM), which were developed and executed on the yeast dataset. The rough clustering algorithms are validated by cluster validity indexes, namely the Rand and Adjusted Rand indexes. Experimental results show that the ERKM clustering algorithm performs effectively and delivers better results than the other clustering methods. Outlier detection is an important task in data mining, outliers being very different from the rest of the objects in the clusters. The Entropy based Rough Outlier Factor (EROF) method is shown to detect outliers effectively in the yeast dataset. In the rough K-Means method, tuning the epsilon (ε) value from 0.8 to 1.08 detects outliers in the boundary region, and the RKM algorithm delivers better results when the value of epsilon is chosen in this range. Experiments show that the EROF method performed very well and is suitable for detecting outliers effectively in all datasets. Further, the experimental readings show that the ERKM clustering method outperformed the other methods.
Keywords: Clustering, Entropy, Outlier, Rough K-Means, validity index.
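A minimal sketch of the rough K-Means core (Lingras-style lower and upper approximations) with the epsilon threshold the abstract tunes; interpreting epsilon as a distance ratio and the choice of region weights are assumptions.

```python
import numpy as np

def rough_kmeans(X, k=3, eps=1.05, w_low=0.7, iters=20, seed=0):
    """Rough K-Means sketch: a point whose second-best centroid is within
    ratio eps of the best joins both upper approximations (boundary region);
    otherwise it joins one lower approximation. Centroids mix the two
    regions with weights w_low / (1 - w_low). Reading epsilon as this ratio
    (matching the abstract's tuning range around 1) is an assumption."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        D = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2)
        order = np.argsort(D, axis=1)
        best, second = order[:, 0], order[:, 1]
        idx = np.arange(len(X))
        boundary = D[idx, second] <= eps * D[idx, best]
        for j in range(k):
            low = X[(best == j) & ~boundary]                 # lower approx.
            up = X[boundary & ((best == j) | (second == j))] # boundary region
            if len(low) and len(up):
                C[j] = w_low * low.mean(0) + (1 - w_low) * up.mean(0)
            elif len(low):
                C[j] = low.mean(0)
            elif len(up):
                C[j] = up.mean(0)
    return C, best, boundary
```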
1147 Performance Analysis of Digital Signal Processors Using SMV Benchmark
Authors: Erh-Wen Hu, Cyril S. Ku, Andrew T. Russo, Bogong Su, Jian Wang
Abstract:
Unlike general-purpose processors, digital signal processors (DSP processors) are strongly application-dependent. To meet the needs of diverse applications, a wide variety of DSP processors based on different architectures, ranging from the traditional to VLIW, have been introduced to the market over the years. The functionality, performance, and cost of these processors vary over a wide range. In order to select a processor that meets the design criteria for an application, processor performance is usually the major concern for digital signal processing (DSP) application developers. Performance data are also essential for the designers of DSP processors to improve their designs. Consequently, several DSP performance benchmarks have been proposed over the past decade or so; however, none of these benchmarks includes recent new DSP applications. In this paper, we use a new benchmark that we recently developed to compare the performance of popular DSP processors from Texas Instruments and StarCore. The new benchmark is based on the Selectable Mode Vocoder (SMV), a speech-coding program from recent third generation (3G) wireless voice applications. All benchmark kernels are compiled by the compilers of the respective DSP processors and run on their simulators. The weighted arithmetic mean of clock cycles and the arithmetic mean of code size are used to compare the performance of five DSP processors. In addition, we studied how the performance of a processor is affected by code structure, features of the processor architecture, and compiler optimization. The extensive experimental data gathered, analyzed, and presented in this paper should be helpful for DSP processor and compiler designers in meeting their specific design goals.
Keywords: Digital signal processors, DSP benchmark, instruction level parallelism, modified cyclomatic complexity, performance analysis.
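The two summary metrics reduce to a few lines; the kernel names, weights and counts below are illustrative assumptions, not figures from the paper.

```python
import numpy as np

# Sketch of the two summary metrics used to rank the processors. Kernel
# names, weights (e.g. profiled execution frequency) and counts are
# illustrative assumptions.
kernels = ["fft", "lsp_quant", "pitch_search", "codebook"]
weights = np.array([0.15, 0.20, 0.40, 0.25])     # must sum to 1
cycles  = np.array([12000, 8500, 31000, 19000])  # simulator clock cycles
code_sz = np.array([640, 410, 1290, 870])        # bytes of compiled code

weighted_mean_cycles = float(weights @ cycles)   # weighted arithmetic mean
mean_code_size = float(code_sz.mean())           # arithmetic mean of size
print(weighted_mean_cycles, mean_code_size)
```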