Search results for: elliptic curve digital signature algorithm
7167 Robust Data Image Watermarking for Data Security
Authors: Harsh Vikram Singh, Ankur Rai, Anand Mohan
Abstract:
In this paper, we propose a secure and robust data hiding algorithm based on the DCT, the Arnold transform, and a chaotic sequence. The watermark image is scrambled by the Arnold cat map to increase its security, and a chaotic map is then used to spread the watermark signal over the middle band of DCT coefficients of the cover image. The chaotic map serves as a pseudo-random generator for digital data hiding, increasing security and robustness. Performance evaluation of the proposed algorithm for robustness and imperceptibility has been carried out using the bit error rate (BER), normalized correlation (NC), and peak signal-to-noise ratio (PSNR) for different watermark and cover images (such as the Lena, Girl, and Tank images) and gain factors. We use a binary logo image and a text image as watermarks. The experimental results demonstrate that the proposed algorithm achieves higher security and robustness against JPEG compression, as well as other attacks such as noise addition, low-pass filtering, and cropping, compared to other existing DCT-based algorithms. Moreover, the proposed algorithm does not need the original cover image to recover the watermark.
Keywords: data hiding, watermarking, DCT, chaotic sequence, Arnold transform
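As a rough illustration of the scrambling step described in this abstract (not the authors' exact implementation, and with an arbitrary image size), the Arnold cat map permutes the pixels of a square image and is exactly invertible:

```python
import numpy as np

def arnold_scramble(img, iterations=1):
    """Scramble a square N x N image with the Arnold cat map:
    the pixel at (x, y) moves to ((x + y) mod N, (x + 2y) mod N).
    The map is a bijection, so it can be undone exactly."""
    n = img.shape[0]
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    for _ in range(iterations):
        out = np.empty_like(img)
        out[(x + y) % n, (x + 2 * y) % n] = img
        img = out
    return img

def arnold_unscramble(img, iterations=1):
    """Apply the inverse map: (x, y) -> ((2x - y) mod N, (y - x) mod N)."""
    n = img.shape[0]
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    for _ in range(iterations):
        out = np.empty_like(img)
        out[(2 * x - y) % n, (y - x) % n] = img
        img = out
    return img

# tiny demo "image"
img = np.arange(16).reshape(4, 4)
scrambled = arnold_scramble(img, iterations=2)
restored = arnold_unscramble(scrambled, iterations=2)
```

Because the map is periodic, repeated scrambling eventually returns the original image; the iteration count can therefore act as a secret key.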
Procedia PDF Downloads 515
7166 Evaluating Key Attributes of Effective Digital Games in Tertiary Education
Authors: Roopali Kulkarni, Yuliya Khrypko
Abstract:
A major problem in educational digital game design is that game developers are often focused on maintaining the fun and playability of an educational game, whereas educators are more concerned with the learning aspect of the game than with its entertaining characteristics. There is a clear need to understand which key aspects of digital learning games make them an effective learning medium in tertiary education. Through a systematic literature review and content analysis, this paper identifies, evaluates, and summarizes twenty-three key attributes of digital games used in tertiary education and presents a summary digital game-based learning (DGBL) model for designing and evaluating an educational digital game of any genre that promotes effective learning in tertiary education. The proposed solution overcomes limitations of previously designed models for digital game evaluation, such as a small number of game attributes considered or applicability only to a specific genre of digital games. The proposed DGBL model can be used to assist game designers and educators with creating effective and engaging educational digital games for the tertiary education curriculum.
Keywords: DGBL model, digital games, educational games, game-based learning, tertiary education
Procedia PDF Downloads 283
7165 Acoustic Echo Cancellation Using Different Adaptive Algorithms
Authors: Hamid Sharif, Nazish Saleem Abbas, Muhammad Haris Jamil
Abstract:
An adaptive filter is a filter that self-adjusts its transfer function according to an optimization algorithm driven by an error signal. Because of the complexity of the optimization algorithms, most adaptive filters are digital filters. Adaptive filtering constitutes one of the core technologies in digital signal processing and finds numerous application areas in science as well as in industry, including adaptive noise cancellation and echo cancellation. Acoustic echo is a common occurrence in today's telecommunication systems; the signal interference it causes is distracting to both users and reduces the quality of the communication. In this paper, we review different adaptive filtering techniques for reducing this unwanted echo and compare the behavior of the Least Mean Square (LMS), Normalized Least Mean Square (NLMS), Variable Step-Size Least Mean Square (VSLMS), Variable Step-Size Normalized Least Mean Square (VSNLMS), New Varying Step-Size LMS (NVSSLMS), and Recursive Least Square (RLS) algorithms in improving communication quality.
Keywords: adaptive acoustic, echo cancellation, LMS algorithm, adaptive filter, normalized least mean square (NLMS), variable step-size least mean square (VSLMS)
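A minimal sketch of one of the reviewed techniques, the NLMS update (the signals, filter length, and step size here are illustrative, not taken from the paper): the adaptive FIR filter learns the echo path from the far-end reference and subtracts its echo estimate from the microphone signal:

```python
import numpy as np

def nlms_echo_cancel(far_end, mic, taps=16, mu=0.5, eps=1e-8):
    """NLMS adaptive filter: w estimates the echo path, e is the
    echo-cancelled output. The LMS step is normalised by the input
    power (x @ x) so convergence is insensitive to signal level."""
    w = np.zeros(taps)
    e = np.zeros(len(mic))
    for n in range(taps - 1, len(mic)):
        x = far_end[n - taps + 1:n + 1][::-1]  # newest sample first
        e[n] = mic[n] - w @ x                  # subtract estimated echo
        w += mu * e[n] * x / (x @ x + eps)     # normalised LMS update
    return e, w

# simulated echo: the microphone hears the far-end through a 4-tap path
rng = np.random.default_rng(1)
far = rng.standard_normal(4000)
path = np.array([0.5, -0.3, 0.2, 0.1])
mic = np.convolve(far, path)[:len(far)]
e, w = nlms_echo_cancel(far, mic)
```

After convergence the leading filter weights approximate the echo path and the residual output energy drops toward zero; the variable-step variants in the paper adapt `mu` over time instead of keeping it fixed.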
Procedia PDF Downloads 80
7164 Role of Digital Economy in the Emerging Countries Like Nigeria
Authors: Aminu Fagge Muhammad
Abstract:
The digital economy is fast becoming the most innovative and widest-reaching economy in the world, especially in developing countries. This paper examines the role of the digital economy in emerging countries like Nigeria. The methodology used in the study is the business model perspective: lying between the process and structural perspectives, it brings in the idea of the new business models being enabled, e.g., e-business or e-commerce. The paper concludes with the policy objectives and measures, and the processes and structures, necessary to enhance digital economy growth and its contribution to socio-economic development. The findings reveal that digital infrastructure is in part incomplete, costly, and poorly performing in emerging economies like Nigeria, and that the wider digital ecosystem suffers from a shortfall in human capabilities, weak financing, and poor governance. It is also found that growth in the digital economy is exacerbating digital exclusion, inequality, adverse incorporation, and other digital harms. It is recommended that government, in partnership with the private sector, should build strong local infrastructure to enable broadband availability and accessibility and create an enabling environment for strong competition in the telecom and technology ecosystem.
Keywords: digital economy, emerging countries, business model, Nigeria
Procedia PDF Downloads 127
7163 Refuge(e)s in Digital Diaspora: Reimagining and Reimaging ‘Ethnically Cleansed’ Villages as ‘Cyber Villages’
Authors: Hariz Halilovich
Abstract:
Based on conventional and digital ethnography, this paper discusses the ways Bosnian refugees utilise digital technologies and new media to recreate, synchronise and sustain their identities and memories in the aftermath of ‘ethnic cleansing’ and genocide and in the contexts of their new emplacements and home-making practices in diaspora. In addition to discussing representations of displacement and emplacement in the ‘digital age’, the paper also aims to make a contribution to the understanding and application of digital ethnography as an emerging method of inquiry in anthropology and related social science disciplines. While some researchers see digital ethnography as an exclusively online-based research method, the author of this paper argues that it is critical to understand the online world in the context of the real world: one made of real people, places, and social relations.
Keywords: Bosnia, cyber villages, digital diaspora, refugees
Procedia PDF Downloads 242
7162 Quantifying the Second-Level Digital Divide on Sub-National Level with a Composite Index
Authors: Vladimir Korovkin, Albert Park, Evgeny Kaganer
Abstract:
The paper studies the second-level digital divide (the divide defined by how digital technology is used in everyday life) between regions of the Russian Federation. The paper offers a systematic review of the literature on measuring the digital divide; based upon this, it suggests a composite Digital Life Index that captures the complex, multi-dimensional character of the phenomenon. The index model studies digital supply and demand separately across seven independent dimensions, providing 14 subindices. The index is based on Internet-borne data, a distinction from traditional research approaches that rely on official statistics or surveys. Regression analysis is used to determine the relative importance of factors like income, human capital, and policy in determining the digital divide. The result of the analysis suggests that the digital divide is driven more by differences in demand (defined by consumer competencies) than in supply; the role of income is insignificant, and the quality of human capital is the key determinant of the divide. The paper advances the existing methodological literature on the issue and can also inform practical decision-making regarding the strategies of national and regional digital development.
Keywords: digital transformation, second-level digital divide, composite index, digital policy, regional development, Russia
Procedia PDF Downloads 186
7161 The Family Resemblance in the Handwriting of Painters: Jacek and Rafał Malczewski’s Case
Authors: Olivia Rybak-Karkosz
Abstract:
This paper aims to present the results of scientific research on family resemblance in the handwriting of painters. Such a problem is known in handwriting analysis, but it has never been a research subject in the scope of painters' signatures on works of art. For this research, the author chose Jacek and Rafał Malczewski (father and son), as many of their paintings are in museums and most of them are signed. The aim was to create a catalogue of similar traits in the handwriting of both artists. Such data could be helpful for the expert's opinion in the decision-making process of establishing whether a signature is authentic and, if so, whether it was made by the artist in question and not by another family member. There are known examples of relatives of artists signing their works, many of whom were artists themselves. For instance, Andrzej Wróblewski's mother, Krystyna, was a printmaker; to save his legacy, she signed many of her son's works after his death using his name. The research methodology consisted of collecting representative samples of signatures of both artists in selected Polish museums. A catalogue of traits was then created using a forensic handwriting graphic-comparative method (graphic method). The paper concludes that such a catalogue could be one of the elements of an expert's analysis of the authenticity of signatures on paintings.
Keywords: artist’s signatures, authenticity of an artwork, forensic handwriting analysis, graphic-comparative method
Procedia PDF Downloads 83
7160 Particle Swarm Optimization Algorithm vs. Genetic Algorithm for Image Watermarking Based Discrete Wavelet Transform
Authors: Omaima N. Ahmad AL-Allaf
Abstract:
Over communication networks, images can be easily copied and distributed in an illegal way, so copyright protection for authors and owners is necessary. Digital watermarking techniques therefore play an important role as a valid solution to authority problems: watermarks are hidden in images to achieve copyright protection and prevent illegal copying, and they need to be robust to attacks while maintaining data quality. In this paper, we discuss two approaches to image watermarking, the first based on Particle Swarm Optimization (PSO) and the second based on a Genetic Algorithm (GA). The discrete wavelet transform (DWT) is used with each approach separately in the embedding process for the cover image transformation. Both PSO and GA use the correlation coefficient to detect the high-energy coefficients of the original image in which to hide the watermark bits. Many experiments were conducted for the two approaches with different values of the PSO and GA parameters. In the experiments, the PSO approach obtained better results, with a PSNR of 53 and an MSE of 0.0039, whereas the GA approach obtained a PSNR of 50.5 and an MSE of 0.0048 when using a population size of 100, 150 iterations, and 3×3 blocks. According to the results, a small block size can affect the quality of PSO/GA-based image watermarking because it increases the search area of the watermarking image. The best PSO results were obtained with a swarm size of 100.
Keywords: image watermarking, genetic algorithm, particle swarm optimization, discrete wavelet transform
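For readers unfamiliar with PSO itself, here is a generic minimal sketch of the particle swarm search loop, run on a toy objective rather than the paper's watermark-correlation fitness (all parameter values are illustrative):

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, seed=0,
                 w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Standard PSO loop: each particle is pulled toward its own best
    position (pbest) and the swarm's global best (g), with inertia w."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros_like(x)                          # velocities
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()        # global best position
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

# toy objective: squared distance to the point (1, 1, 1)
best, best_val = pso_minimize(lambda p: float(np.sum((p - 1.0) ** 2)), dim=3)
```

In the watermarking setting of the abstract, `f` would instead score a candidate embedding (e.g. via correlation or PSNR) rather than a geometric distance.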
Procedia PDF Downloads 226
7159 Quadrature Mirror Filter Bank Design Using Population Based Stochastic Optimization
Authors: Ju-Hong Lee, Ding-Chen Chung
Abstract:
The paper deals with the optimal design of two-channel linear-phase (LP) quadrature mirror filter (QMF) banks using a metaheuristic based optimization technique. Based on the theory of two-channel QMF banks using two recursive digital all-pass filters (DAFs), the design problem is appropriately formulated to result in an objective function which is a weighted sum of the group delay error of the designed QMF bank and the magnitude response error of the designed low-pass analysis filter. Through a frequency sampling and a weighted least squares approach, the optimization problem of the objective function can be solved by utilizing a particle swarm optimization algorithm. The resulting two-channel QMF banks can possess approximately LP response without magnitude distortion. Simulation results are presented for illustration and comparison.
Keywords: quadrature mirror filter bank, digital all-pass filter, weighted least squares algorithm, particle swarm optimization
Procedia PDF Downloads 521
7158 A Benchmark System for Testing Medium Voltage Direct Current (MVDC-CB) Robustness Utilizing Real Time Digital Simulation and Hardware-In-Loop Theory
Authors: Ali Kadivar, Kaveh Niayesh
Abstract:
The integration of green energy resources is a major focus, and the role of Medium Voltage Direct Current (MVDC) systems is expanding rapidly. However, the protection of MVDC systems against DC faults is a challenge with consequences for reliable and safe grid operation. This challenge reveals the need for MVDC circuit breakers (MVDC CBs), which are still in the infancy of their development; consequently, MVDC CB standards, including thresholds for acceptable power losses and operation speed, are lacking. To establish a baseline for comparison purposes, a benchmark system for testing future MVDC CBs is vital. The literature gives only the timing sequence of each switch and emphasizes the topology, without an in-depth study of the DCCB control algorithm, as circuit breaker control systems are not yet systematic. A digital testing benchmark is designed for the proof of concept of simulation studies using software models; it can validate studies based on real-time digital simulators and Transient Network Analyzer (TNA) models. The proposed experimental setup uses data acquisition from accurate sensors installed on the tested MVDC CB, through general-purpose input/outputs (GPIO) from the microcontroller and PC. Prototype studies in laboratory-based models, utilizing Hardware-in-the-Loop (HIL) equipment connected to real-time digital simulators, are achieved. The improved circuit breaker control algorithm can reduce the peak fault current and avoid arc reignition, helping the coordination of DCCBs in relay protection. Moreover, several research gaps are identified regarding case studies and evaluation approaches.
Keywords: DC circuit breaker, hardware-in-the-loop, real time digital simulation, testing benchmark
Procedia PDF Downloads 79
7157 Two-Dimensional Symmetric Half-Plane Recursive Doubly Complementary Digital Lattice Filters
Authors: Ju-Hong Lee, Chong-Jia Ciou, Yuan-Hau Yang
Abstract:
This paper deals with the problem of two-dimensional (2-D) recursive doubly complementary (DC) digital filter design. We present a structure of 2-D recursive DC filters by using 2-D symmetric half-plane (SHP) recursive digital all-pass lattice filters (DALFs). The novelty of using 2-D SHP recursive DALFs to construct a 2-D recursive DC digital lattice filter is that the resulting 2-D SHP recursive DC digital lattice filter provides better performance than the existing 2-D SHP recursive DC digital filter. Moreover, the proposed structure possesses a favorable 2-D DC half-band (DC-HB) property that allows about half of the 2-D SHP recursive DALF’s coefficients to be zero. This leads to considerable savings in computational burden for implementation. To ensure the stability of a designed 2-D SHP recursive DC digital lattice filter, some necessary constraints on the phase of the 2-D SHP recursive DALF during the design process are presented. Design of a 2-D diamond-shape decimation/interpolation filter is presented for illustration and comparison.
Keywords: all-pass digital filter, doubly complementary, lattice structure, symmetric half-plane digital filter, sampling rate conversion
Procedia PDF Downloads 437
7156 Finite Element and Split Bregman Methods for Solving a Family of Optimal Control Problem with Partial Differential Equation Constraint
Authors: Mahmoud Lot
Abstract:
In this article, we discuss the solution of an elliptic optimal control problem. First, by using the finite element method, we obtain the discrete form of the problem. The resulting discrete problem is a large-scale constrained optimization problem, and solving it with traditional methods is difficult, requiring a lot of CPU time and memory. The split Bregman method, however, converts the constrained problem into an unconstrained one, and hence saves time and memory. We therefore use the split Bregman method to solve this problem; examples show the speed and accuracy of the split Bregman method for these types of problems. We also solve the examples with the SQP method and compare it with the split Bregman method.
Keywords: split Bregman method, optimal control with elliptic partial differential equation constraint, finite element method
Procedia PDF Downloads 152
7155 Automatic Registration of Rail Profile Based Local Maximum Curvature Entropy
Authors: Hao Wang, Shengchun Wang, Weidong Wang
Abstract:
To counter the influence of train vibration and environmental noise on the measurement of track wear, we propose a method for the automatic extraction of the circular arcs on the inner and outer sides of the rail waist and achieve high-precision registration of the rail profile. First, a polynomial fitting method based on a truncated residual histogram is proposed to find the optimal fitting curve of the profile and reduce the influence of noise on profile curve fitting. Then, based on the curvature distribution characteristics of the fitted curve, an interval search algorithm based on the maximum curvature entropy of a dynamic window is proposed to realize the automatic segmentation of the small circular arcs. Finally, we fit the two circle centres of the small arcs on both sides as matching reference points and align the measured profile to the standard designed profile. Static experiments show that the mean and standard deviation of the method are controlled within 0.01 mm, with small measurement errors and high repeatability. A dynamic test also verified the repeatability of the method in the train-running environment; the dynamic measurement deviation of rail wear is within 0.2 mm with high repeatability.
Keywords: curvature entropy, profile registration, rail wear, structured light, train-running
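The noise-resistant fitting idea can be sketched as follows (a simplified stand-in for the truncated-residual-histogram fitting in the abstract, on made-up profile data): fit a polynomial, discard the points with the largest residuals, and refit:

```python
import numpy as np

def robust_polyfit(x, y, degree=3, n_rounds=5, keep=0.8):
    """Fit a polynomial, then repeatedly refit using only the `keep`
    fraction of points with the smallest residuals -- a simplified
    stand-in for truncated-residual-histogram fitting."""
    mask = np.ones(len(x), dtype=bool)
    for _ in range(n_rounds):
        coeffs = np.polyfit(x[mask], y[mask], degree)
        resid = np.abs(np.polyval(coeffs, x) - y)
        mask = resid <= np.quantile(resid, keep)  # truncate large residuals
    return coeffs

# synthetic profile: cubic curve, small noise, a few gross outliers
rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 200)
y = x**3 - 0.5 * x + 0.01 * rng.standard_normal(200)
y[::20] += 2.0                       # spikes, e.g. from vibration
coeffs = robust_polyfit(x, y)
```

A plain least-squares fit is dragged toward the spikes, while the truncated refit recovers the underlying curve to within the noise level.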
Procedia PDF Downloads 260
7154 Mapping the Potential and Development Strategy of Digital Economy in Indonesia
Authors: Jordan Putra Cahyono, Tiara Ayu Kusumaningtyas, Mohtar Rasyid
Abstract:
This article aims to map the potential and strategy of digital economy development in Indonesia by using a literature study and secondary data analysis. In the Indonesian context, the digital economy is attracting attention, especially amid the COVID-19 pandemic, which has brought substantial changes in economic activities. This research aims to provide new insights into the potential and development strategies of the digital economy in Indonesia. The article also evaluates the effectiveness and efficiency of the digital economy development strategies implemented in Indonesia. A literature review concluded that Indonesia has great potential to develop the digital economy, with favorable conditions including a large population, improved ICT infrastructure, and relatively liberalized regulations. Using qualitative and quantitative approaches, this research covers the potential of and strategies for developing a digital economy in Indonesia. The article presents the research results, which are then discussed in the context of the potential and strategy of digital economy development in Indonesia. It is expected to contribute to the understanding of Indonesia's digital economy and to stimulate further discussion towards formulating a robust development strategy and an appropriate regulatory framework.
Keywords: Indonesia's digital economy, ICT infrastructure, development strategy, potential
Procedia PDF Downloads 60
7153 Facial Pose Classification Using Hilbert Space Filling Curve and Multidimensional Scaling
Authors: Mekamı Hayet, Bounoua Nacer, Benabderrahmane Sidahmed, Taleb Ahmed
Abstract:
Pose estimation is an important task in computer vision. Though the majority of the existing solutions provide good accuracy, they are often overly complex and computationally expensive. In this perspective, we propose the use of dimensionality reduction techniques to address the problem of facial pose estimation. First, a face image is converted into a one-dimensional time series using the Hilbert space-filling curve; the approach then converts these time series data to a symbolic representation. Furthermore, a distance matrix is calculated between the symbolic series of an input learning dataset of images to generate classifiers of frontal vs. profile face pose. The proposed method is evaluated with three public datasets. Experimental results have shown that our approach is able to achieve a correct classification rate exceeding 97% with the K-NN algorithm.
Keywords: machine learning, pattern recognition, facial pose classification, time series
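The first step, mapping 2-D pixel coordinates to a 1-D position along the Hilbert curve, can be sketched with the standard bitwise conversion (illustrative only; the paper's pipeline additionally includes the symbolic representation and distance-matrix stages):

```python
def xy_to_hilbert(n, x, y):
    """Index of grid cell (x, y) along the Hilbert curve filling an
    n x n grid (n a power of two). Nearby pixels map to nearby
    positions in the resulting 1-D series, preserving locality."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate/flip the quadrant
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s //= 2
    return d

# flatten a 4 x 4 "image" into Hilbert order
n = 4
curve = sorted(((xy_to_hilbert(n, x, y), (x, y))
                for x in range(n) for y in range(n)))
```

Reading the pixel values in this order yields the one-dimensional series that the classifier works on; consecutive entries of the series are always neighbouring pixels.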
Procedia PDF Downloads 350
7152 Mastering Digitization: A Quality-Adapted Digital Transformation Model
Authors: Franziska Schaefer, Marlene Kuhn, Heiner Otten
Abstract:
In the very near future, digitization will be the main challenge a company has to master to survive in a highly competitive market. Developing the right transformation strategy by considering all relevant aspects determines the success or failure of a company. Especially the digital focus on the customer plays a key role in creating sustainable competitive advantages, also leading to new tasks within the quality management. Therefore, quality management needs to be particularly addressed to support the upcoming digital change. In this paper, we present an analysis of existing digital transformation approaches and derive a transformation strategy from a quality management perspective. We identify and classify different transformation dimensions and assess their relevance to quality management tasks, resulting in a quality-adapted digital transformation model. Furthermore, we introduce applicable and customized quality management methods to support the presented digital transformation tasks. With our developed model we provide a digital transformation guideline from a quality perspective to master future disruptive changes.
Keywords: digital transformation, digitization, quality management, strategy
Procedia PDF Downloads 478
7151 Segmenting 3D Optical Coherence Tomography Images Using a Kalman Filter
Authors: Deniz Guven, Wil Ward, Jinming Duan, Li Bai
Abstract:
Over the past two decades or so, Optical Coherence Tomography (OCT) has been used to diagnose retina and optic nerve diseases. The retinal nerve fibre layer, for example, is a powerful diagnostic marker for detecting and staging glaucoma. With the advances in optical imaging hardware, the adoption of OCT is now commonplace in clinics. More and more OCT images are being generated, and for these OCT images to have clinical applicability, accurate automated OCT image segmentation software is needed. OCT image segmentation is still an active research area, as OCT images are inherently noisy due to multiplicative speckle noise. Simple edge detection algorithms are unsuitable for detecting retinal layer boundaries in OCT images. Intensity fluctuation, motion artefacts, and the presence of blood vessels further decrease OCT image quality. In this paper, we introduce a new method for segmenting three-dimensional (3D) OCT images, involving a Kalman filter, which is commonly used in computer vision for object tracking. The Kalman filter is applied to the 3D OCT image volume to track the retinal layer boundaries through the slices within the volume and thus segment the 3D image. Specifically, after some pre-processing of the OCT images, points on the retinal layer boundaries in the first image are identified, and curve fitting is applied to them so that the layer boundaries can be represented by the coefficients of the curve equations. These coefficients then form the state space for the Kalman filter, which produces an optimal estimate of the current state of the system by updating its previous state using the available measurements in a feedback control loop. The results show that the algorithm can be used to segment the retinal layers in OCT images.
One limitation of the current algorithm is that the curve representation of the retinal layer boundary does not work well where the boundary splits into two, e.g., at the optic nerve. This may be resolved by using a different representation of the boundaries, such as B-splines or level sets. The use of a Kalman filter shows promise for developing accurate and effective 3D OCT segmentation methods.
Keywords: optical coherence tomography, image segmentation, Kalman filter, object tracking
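The slice-to-slice tracking idea can be reduced to a scalar sketch (the drift and noise values are hypothetical, and the paper tracks whole vectors of curve coefficients rather than a single number): each boundary-curve coefficient is predicted from the previous slice and corrected by the new slice's measurement:

```python
import numpy as np

def kalman_track(measurements, q=1e-4, r=0.05, x0=0.0, p0=1.0):
    """Scalar Kalman filter tracking one boundary-curve coefficient
    across OCT slices. q = process noise (how fast the boundary may
    drift slice to slice), r = measurement noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                   # predict: coefficient drifts slowly
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # correct with this slice's measurement
        p *= (1.0 - k)           # updated estimate uncertainty
        estimates.append(x)
    return np.array(estimates)

# simulated slices: a slowly drifting coefficient, noisy per-slice fits
rng = np.random.default_rng(3)
true = 2.0 + 0.001 * np.arange(300)
noisy = true + rng.normal(0.0, np.sqrt(0.05), 300)
est = kalman_track(noisy, x0=noisy[0])
```

The filtered track follows the drift while suppressing most of the per-slice measurement noise, which is why the tracked boundaries are smoother than per-slice fits alone.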
Procedia PDF Downloads 482
7150 Using Signature Assignments and Rubrics in Assessing Institutional Learning Outcomes and Student Learning
Authors: Leigh Ann Wilson, Melanie Borrego
Abstract:
The purpose of institutional learning outcomes (ILOs) is to assess what students across the university know and what they do not. The issue is gathering this information in a systematic and usable way. This presentation will explain how one institution has engineered this process for both student success and maximum faculty curriculum and course design input. At Brandman University, there are three levels of learning outcomes: course, program, and institutional. Institutional Learning Outcomes (ILOs) are mapped to specific courses. Faculty course developers write the signature assignments (SAs) in alignment with the Institutional Learning Outcomes for each course. These SAs use a specific rubric that is applied consistently by every section and every instructor. Each year, the 12-member General Education Team (GET), as a part of their work, conducts the calibration and assessment of the university-wide SAs and the related rubrics for one or two of the five ILOs. GET members, who are senior faculty and administrators who represent each of the university's schools, lead the calibration meetings. Specifically, calibration is a process designed to ensure the accuracy and reliability of evaluating signature assignments by working with peer faculty to interpret rubrics and compare scoring. These calibration meetings include the full time and adjunct faculty members who teach the course to ensure consensus on the application of the rubric. Each calibration session is chaired by a GET representative as well as the course custodian/contact where the ILO signature assignment resides. The overall calibration process GET follows includes multiple steps, such as: contacting and inviting relevant faculty members to participate; organizing and hosting calibration sessions; and reviewing and discussing at least 10 samples of student work from class sections during the previous academic year, for each applicable signature assignment. 
In terms of commitment, calibration teams attend two virtual meetings lasting up to three hours each. The first meeting focuses on interpreting the rubric, and the second involves comparing scores for sample work and sharing feedback about the rubric and assignment. Participants are expected to follow all directions provided, participate actively, and respond to scheduling requests and other emails within 72 hours. The virtual meetings are recorded for future institutional use. Adjunct faculty are paid a small stipend after participating in both calibration meetings. Full-time faculty can use this work on their annual faculty report for "internal service" credit.
Keywords: assessment, assurance of learning, course design, institutional learning outcomes, rubrics, signature assignments
Procedia PDF Downloads 280
7149 Digital Activism and the Individual: A Utilitarian Perspective
Authors: Tania Mitra
Abstract:
Digital Activism, or Cyber Activism, uses digital media as a means to disseminate information and mobilize masses towards a specific goal. When digital activism was first born in the early 1990s, it was primarily used by groups of organized political activists. With the advent of social media, however, online activism has filtered down to the individual: one who does not necessarily belong to or identify with an agenda, group, or political party. A large part of digital activism today stems from the individual's notion of what is right and wrong. This gives rise to a discourse around descriptive ethics and the implications of the independent digital activist. Although digital activism has paved the way for and bolstered support for causes like the MeToo movement and Black Lives Matter, the lack of a unified, organized body has led to counterintuitive progressions and suspicions regarding the movements. The paper introduces the ideas of 'clout' culture, clickbait, and clicktivism (the phenomenon where activism is reduced to a blind following of online trends) to discuss the impacts of exclusively digital activism. Using Jeremy Bentham's utilitarian approach to ethics, which places emphasis on the best possible outcome for a society, the paper shows how individual online activism reaching for a larger, more common end can sometimes undermine that end, not only in the online space but also in how it manifests in the real world.
Keywords: digital activism, ethics, independent digital activist, utilitarianism
Procedia PDF Downloads 125
7148 Kinoform Optimisation Using Gerchberg-Saxton Iterative Algorithm
Authors: M. Al-Shamery, R. Young, P. Birch, C. Chatwin
Abstract:
Computer Generated Holography (CGH) is employed to create digitally defined coherent wavefronts. A CGH can be created by different techniques, such as the detour-phase technique or direct phase modulation to create a kinoform. The detour-phase technique was one of the first techniques used to generate holograms digitally. Its disadvantage is that the reconstructed image often has poor quality due to the limited dynamic range that can be recorded using a medium with reasonable spatial resolution. The kinoform (phase-only hologram) is an alternative technique in which the phase of the original wavefront is recorded but the amplitude is constrained to be constant. The original object does not need to exist physically, so the kinoform can be used to reconstruct an almost arbitrary wavefront. However, the image reconstructed by this technique contains high levels of noise and is not identical to the reference image. To improve the reconstruction quality of the kinoform, iterative techniques such as the Gerchberg-Saxton (GS) algorithm are employed. In this paper, the GS algorithm is described for the optimisation of a kinoform used for the reconstruction of a complex wavefront. Iterations of the GS algorithm are applied to determine the phase at a plane (with known amplitude distribution, often taken as uniform) that satisfies given phase and amplitude constraints in a corresponding Fourier plane; in this way the GS algorithm enhances the reconstruction quality of the kinoform. Different images are employed as the reference object, and their kinoforms are synthesised using the GS algorithm. The quality of the reconstructed images is quantified to demonstrate the enhanced reconstruction quality achieved by this method.
Keywords: computer generated holography, digital holography, Gerchberg-Saxton algorithm, kinoform
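A minimal numerical sketch of the GS loop for a kinoform (uniform hologram-plane amplitude, a toy square target rather than the images used in the paper, and an FFT standing in for optical propagation):

```python
import numpy as np

def gerchberg_saxton(target_amp, iters=50, seed=4):
    """GS iteration for a kinoform: in each plane, keep the computed
    phase while re-imposing the known amplitude constraint (unit
    amplitude in the hologram plane, the target amplitude in the
    Fourier/image plane)."""
    rng = np.random.default_rng(seed)
    field = np.exp(1j * rng.uniform(0, 2 * np.pi, target_amp.shape))
    for _ in range(iters):
        img = np.fft.fft2(field)                       # to image plane
        img = target_amp * np.exp(1j * np.angle(img))  # impose target amplitude
        field = np.fft.ifft2(img)                      # back to hologram plane
        field = np.exp(1j * np.angle(field))           # phase-only constraint
    kinoform = np.angle(field)
    recon = np.abs(np.fft.fft2(field))
    return kinoform, recon

# toy target: a bright square on a dark background
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
kinoform, recon = gerchberg_saxton(target)
```

After a few tens of iterations, the amplitude of the kinoform's Fourier transform correlates strongly with the target, which is the error-reduction behaviour the abstract relies on.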
Procedia PDF Downloads 533
7147 Development of a Tesla Music Coil from Signal Processing
Authors: Samaniego Campoverde José Enrique, Rosero Muñoz Jorge Enrique, Luzcando Narea Lorena Elizabeth
Abstract:
This paper presents a practical and theoretical model for the operation of the Tesla coil using digital signal processing. The research is based on the analysis of ten scientific papers exploring the development and operation of the Tesla coil. Starting from the basic Tesla coil, several modifications were carried out with the aim of amplifying the digital signal by means of digital signal processing. To achieve this, an amplifier with a transistor and digital filters provided by MATLAB software were used, chosen according to the characteristics of the signals in question. Keywords: tesla coil, digital signal processing, equalizer, graphical environment
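As an illustration of the kind of filtering stage described above, the sketch below builds a windowed-sinc band-pass FIR filter in NumPy and applies it to two test tones. The sample rate, band edges and tap count are illustrative assumptions, not values from the paper (which used MATLAB's filter tools).

```python
import numpy as np

def bandpass_fir(num_taps, f_low, f_high, fs):
    """Windowed-sinc band-pass FIR (frequencies in Hz, fs = sample rate).
    Built as the difference of two low-pass windowed-sinc filters."""
    n = np.arange(num_taps) - (num_taps - 1) / 2
    h = (2 * f_high / fs) * np.sinc(2 * f_high / fs * n) \
      - (2 * f_low / fs) * np.sinc(2 * f_low / fs * n)
    return h * np.hamming(num_taps)

fs = 8000                                    # assumed sample rate
taps = bandpass_fir(101, 300, 1000, fs)      # assumed passband 300-1000 Hz
t = np.arange(0, 0.1, 1 / fs)
x_pass = np.sin(2 * np.pi * 600 * t)         # tone inside the passband
x_stop = np.sin(2 * np.pi * 2500 * t)        # tone well into the stopband
y_pass = np.convolve(x_pass, taps, mode="same")
y_stop = np.convolve(x_stop, taps, mode="same")
```

After filtering, the in-band tone passes through nearly unchanged while the out-of-band tone is strongly attenuated, which is the behaviour an equalizer stage relies on.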
Procedia PDF Downloads 117
7146 Improving the Performance of Back-Propagation Training Algorithm by Using ANN
Authors: Vishnu Pratap Singh Kirar
Abstract:
An Artificial Neural Network (ANN) can be trained using backpropagation (BP), the most widely used algorithm for supervised learning with multi-layered feed-forward networks. Efficient learning by the BP algorithm is required for many practical applications. The BP algorithm calculates the weight changes of the network, and a common approach is to use a two-term update consisting of a learning rate (LR) and a momentum factor (MF). The major drawbacks of the two-term BP learning algorithm are the problems of local minima and slow convergence speed, which limit its scope for real-time applications. Recently the addition of an extra term, called a proportional factor (PF), to the two-term BP algorithm was proposed. The third term increases the speed of the BP algorithm; however, the PF term can also reduce the convergence of the BP algorithm, and criteria for evaluating convergence are required to facilitate the application of the three-term BP algorithm. Although speed and convergence appear closely related, as described later, we summarize various improvements proposed to overcome these drawbacks and compare the convergence behaviour of the different variants of the three-term BP algorithm. Keywords: neural network, backpropagation, local minima, fast convergence rate
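A minimal sketch of the three-term update on a single linear neuron, assuming the commonly cited form Δw(t) = −α·∇E + β·Δw(t−1) + γ·e(t), where α is the learning rate, β the momentum factor and γ the proportional factor applied to the output error e(t). The hyperparameter values and the way the error term is broadcast onto the weights are illustrative assumptions; the paper's own formulation may differ.

```python
import numpy as np

def train_three_term(x, y, lr=0.05, momentum=0.9, prop=0.01, epochs=300):
    """Three-term BP update on a single linear neuron:
    delta_w = -lr * grad + momentum * prev_delta + prop * mean_error."""
    rng = np.random.default_rng(1)
    w = rng.normal(size=x.shape[1])
    delta_prev = np.zeros_like(w)
    for _ in range(epochs):
        err = y - x @ w                    # per-sample output error e(t)
        grad = -x.T @ err / len(y)         # dE/dw for E = 0.5 * mean(err**2)
        # learning-rate term + momentum term + proportional-factor term
        delta = -lr * grad + momentum * delta_prev \
              + prop * err.mean() * np.ones_like(w)
        w += delta
        delta_prev = delta
    return w

# Fit y = 2*x1 - x2 from noiseless samples
rng = np.random.default_rng(0)
x = rng.standard_normal((200, 2))
w = train_three_term(x, 2 * x[:, 0] - x[:, 1])
```

The momentum term smooths successive updates, while the proportional term vanishes as the output error goes to zero, so the fixed point is unchanged.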
Procedia PDF Downloads 498
7145 Digital Innovation and Business Transformation
Authors: Bisola Stella Sonde
Abstract:
Digital innovation has emerged as a pivotal driver of business transformation in the contemporary landscape. This case study research explores the dynamic interplay between digital innovation and the profound metamorphosis of businesses across industries. It delves into the multifaceted dimensions of digital innovation, elucidating its impact on organizational structures, customer experiences, and operational paradigms. The study investigates real-world instances of businesses harnessing digital technologies to enhance their competitiveness, agility, and sustainability. It scrutinizes the strategic adoption of digital platforms, data analytics, artificial intelligence, and emerging technologies as catalysts for transformative change. The cases encompass a diverse spectrum of industries, spanning from traditional enterprises to disruptive startups, offering insights into the universal relevance of digital innovation. Moreover, the research examines the challenges and opportunities posed by the digital era, shedding light on the intricacies of managing cultural shifts, data privacy, and cybersecurity concerns in the pursuit of innovation. It unveils the strategies that organizations employ to adapt, thrive, and lead in the era of digital disruption. In summary, this case study research underscores the imperative of embracing digital innovation as a cornerstone of business transformation, and offers a comprehensive exploration of the contemporary digital landscape with valuable lessons for organizations striving to navigate the ever-evolving terrain of the digital age. Keywords: business transformation, digital innovation, emerging technologies, organizational structures
Procedia PDF Downloads 60
7144 Prospective Validation of the FibroTest Score in Assessing Liver Fibrosis in Hepatitis C Infection with Genotype 4
Authors: G. Shiha, S. Seif, W. Samir, K. Zalata
Abstract:
FibroTest (FT) is a non-invasive score of liver fibrosis that combines the quantitative results of five serum biochemical markers (alpha-2-macroglobulin, haptoglobin, apolipoprotein A1, gamma-glutamyl transpeptidase (GGT) and bilirubin), adjusted for the patient's age and sex in a patented algorithm, to generate a measure of fibrosis. FT has been validated in patients with chronic hepatitis C (CHC) (Halfon et al., Gastroenterol Clin Biol (2008), 32 (6 Suppl 1), 22-39). The validation of FT in genotype 4, however, is not well studied. Our aim was to evaluate the performance of FibroTest in an independent prospective cohort of hepatitis C patients with genotype 4. Subjects were 122 patients with CHC. All liver biopsies were scored using the METAVIR system. FT scores were measured, and the performance of the cut-off score was evaluated using a ROC curve. Among patients with advanced fibrosis, the FT matched the liver biopsy identically in 18.6% of cases, overestimated the stage of fibrosis in 44.2% and underestimated it in 37.7%. In patients with no/mild fibrosis, identical matching was detected in 39.2% of cases, with overestimation in 48.1% and underestimation in 12.7%. Overall, the test showed identical matching, overestimation and underestimation in 32%, 46.7% and 21.3% of cases respectively. Using the ROC curve, it was found that FT at a cut-off of 0.555 could discriminate early from advanced stages of fibrosis with an area under the ROC curve (AUC) of 0.72, sensitivity of 65%, specificity of 69%, PPV of 68%, NPV of 66% and accuracy of 67%. As the FibroTest score overestimates the stage of advanced fibrosis, it should not be considered a reliable surrogate for liver biopsy in hepatitis C infection with genotype 4. Keywords: fibrotest, chronic hepatitis C, genotype 4, liver biopsy
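The cut-off analysis reported above (sensitivity, specificity and AUC at a chosen threshold) can be reproduced generically. The sketch below uses synthetic scores rather than the study's data, and implements the rank-based AUC directly rather than relying on any particular statistical package.

```python
import numpy as np

def roc_auc(scores, labels):
    """Rank-based AUC: probability that a random positive case
    scores higher than a random negative case (ties count half)."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

def sens_spec(scores, labels, cutoff):
    """Sensitivity and specificity of the rule 'positive if score >= cutoff'."""
    pred = scores >= cutoff
    sens = pred[labels == 1].mean()      # true-positive rate
    spec = (~pred)[labels == 0].mean()   # true-negative rate
    return sens, spec

# Synthetic example: three early-stage (0) and three advanced (1) cases
labels = np.array([0, 0, 0, 1, 1, 1])
scores = np.array([0.2, 0.3, 0.6, 0.4, 0.7, 0.8])
auc = roc_auc(scores, labels)
sens, spec = sens_spec(scores, labels, cutoff=0.5)
```

Sweeping `cutoff` over the observed scores traces the full ROC curve, from which an operating point such as the paper's 0.555 can be selected.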
Procedia PDF Downloads 415
7143 Tabu Random Algorithm for Guiding Mobile Robots
Authors: Kevin Worrall, Euan McGookin
Abstract:
The use of optimization algorithms is common across a large number of diverse fields. This work presents a hybrid optimization algorithm applied to a mobile robot tasked with searching an unknown environment. The algorithm is then applied to the multiple-robot case, which reduces the time taken to carry out the search. The hybrid algorithm is a random search algorithm fused with a tabu mechanism. The work shows that the algorithm locates the desired points more quickly than a brute-force search. The Tabu Random algorithm is shown to work within a simulated environment using a validated mathematical model. The simulation was run using three different environments with varying numbers of targets. As an algorithm, the Tabu Random is small, clear and can be implemented with minimal resources. Its strengths are the speed at which it locates points of interest and its robustness to the number of robots involved: the number of robots can vary with no changes to the algorithm, making it flexible. Keywords: algorithms, control, multi-agent, search and rescue
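As a rough sketch of how a random search can be fused with a tabu mechanism, the toy below samples grid cells at random while excluding recently visited cells from re-evaluation. The grid representation, tabu-list length and scoring function are all illustrative assumptions; the authors' actual mechanism and its multi-robot extension may differ.

```python
import random

def tabu_random_search(score, grid_size, n_steps=500, tabu_len=50, seed=0):
    """Random search over a grid with a tabu list of recently visited cells."""
    rng = random.Random(seed)
    tabu = []                           # recently visited cells, skipped on re-draw
    best, best_val = None, float("-inf")
    for _ in range(n_steps):
        cell = (rng.randrange(grid_size), rng.randrange(grid_size))
        if cell in tabu:
            continue                    # tabu: don't waste a score evaluation
        tabu.append(cell)
        if len(tabu) > tabu_len:
            tabu.pop(0)                 # oldest entries become admissible again
        val = score(cell)
        if val > best_val:
            best, best_val = cell, val
    return best, best_val

# Hypothetical target at (3, 4): score is negative Manhattan distance to it
target_score = lambda cell: -(abs(cell[0] - 3) + abs(cell[1] - 4))
best, best_val = tabu_random_search(target_score, grid_size=10)
```

The tabu list is the only state shared between draws, which is what keeps the algorithm small and lets the number of searching agents vary without changes.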
Procedia PDF Downloads 239
7142 Lifelong Learning and Digital Literacies in Language Learning
Authors: Selma Karabinar
Abstract:
Lifelong learning can be described as a system in which learning takes place over the course of a person's lifespan and comprises formal, non-formal and informal learning, with the aim of achieving the maximum possible improvement in personal, social, and vocational life. The 21st century is marked by digital technologies, and people need to learn and adapt to new literacies as part of their lifelong learning. Our current knowledge gap brings to mind several questions: Do people with digital mindsets have different assumptions about the affordances of digital technologies? How do digital mindsets lead language learners to use digital technologies within and beyond classrooms? Do digital literacies have different significance for different learners? The presentation is based on a study that attempted to answer these questions and show the relationship between lifelong learning and digital literacies. The study was conducted with learners of English at a state university in Istanbul. Quantitative data on participants' perception of lifelong learning was collected from 150 students through a lifelong learning scale. Five students with high and five with low lifelong learning perception were then interviewed. They were questioned about their personal sense of agency in lifelong learning and how they use digital technologies in their language learning. The qualitative data was therefore analyzed in terms of their knowledge of digital literacies and their actual use in personal and educational life. The results of the study suggest why teaching new literacies is important for lifelong learning, and also suggest implications for language teacher education and language pedagogy. Keywords: digital mindsets, language learning, lifelong learning, new literacies
Procedia PDF Downloads 381
7141 A Closer Look on Economic and Fiscal Incentives for Digital TV Industry
Authors: Yunita Anwar, Maya Safira Dewi
Abstract:
With the increasing importance of the digital TV industry, several incentives must be given to support the growth of the industry. Prior research has found mixed effects of economic and fiscal incentives on economic growth, meaning that such incentives do not necessarily boost economic growth while providing support to a particular industry. Focusing on the setting of the digital TV transition in Indonesia, this research conducts a document analysis of the incentives that have been given in other countries and the incentives currently available in Indonesia. Our results recommend that VAT exemption and local tax incentives be considered as additions to the list of incentives available for the digital TV industry. Keywords: digital TV transition, economic incentives, fiscal incentives, policy
Procedia PDF Downloads 324
7140 Digital Literacy Landscape of Islamic Boarding Schools in Indonesia
Authors: Zainuddin Abuhamid Muhammad Ghozali, Andrew Whitworth
Abstract:
An Islamic boarding school, or pesantren, is a distinctive education institution in Indonesia focusing on religious teachings. Its stance of restricting access to the internet raises a question about its students' development of digital literacy. Inspired by Luckin's Ecology of Resources model, this study aims to map out the digital literacy situation of the institution based on the availability of learning resources, such as digital facilities, digital accessibility, and digital competence. The study was carried out through a survey involving 50 teachers from pesantrens across the nation. The results show that pesantrens have provided students with digital facilities at a moderate level, yet access to them is still limited. They have also incorporated digital competencies into their curricula, with an emphasis on digital ethics. The study also identifies different patterns in pesantrens' behavior based on type and educational level, where certain school types and educational levels tend to impose stricter policies than others, or vice versa. The restriction of digital resources in pesantren indicates that they have carried out a filtration process in designing their learning environment. The filtration was mainly motivated by sociocultural factors, in particular concern about the negative impact of the internet. Notably, this restriction also contributes to students' poor development of digital literacy. Keywords: digital literacy, ecology of resources, Indonesia, Islamic boarding school
Procedia PDF Downloads 71
7139 Digital Recording System Identification Based on Audio File
Authors: Michel Kulhandjian, Dimitris A. Pados
Abstract:
The objective of this work is to develop a theoretical framework for reliable digital recording system identification from digital audio files alone, for forensic purposes. A digital recording system consists of a microphone and a digital sound processing card. We view the cascade as a system with an unknown transfer function. We expect microphone-sound card combinations of the same manufacturer and model to have very similar, near-identical transfer functions, barring any unique manufacturing defect. Input voice (or other) signals are modeled as non-stationary processes. The technical problem under consideration becomes blind deconvolution with non-stationary inputs, as it manifests itself in the specific application of digital audio recording equipment classification. Keywords: blind system identification, audio fingerprinting, blind deconvolution, blind dereverberation
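One simple way to illustrate the classification idea — though far short of true blind deconvolution — is a long-term average log-spectrum, which suppresses the non-stationary source content and retains an estimate dominated by the fixed channel response. The FIR "channels", frame size and correlation matcher below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def device_fingerprint(audio, n_fft=512):
    """Long-term average log-magnitude spectrum of a recording.
    Averaging over many frames suppresses the (non-stationary) source
    signal, leaving an estimate dominated by the recording chain's
    fixed transfer function."""
    win = np.hanning(n_fft)
    n_frames = len(audio) // n_fft
    spec = np.zeros(n_fft // 2 + 1)
    for i in range(n_frames):
        frame = audio[i * n_fft:(i + 1) * n_fft] * win
        spec += np.log(np.abs(np.fft.rfft(frame)) + 1e-9)
    return spec / n_frames

def similarity(fp_a, fp_b):
    """Pearson correlation between two fingerprints."""
    return np.corrcoef(fp_a, fp_b)[0, 1]

# Two hypothetical recording chains modeled as short FIR filters
rng = np.random.default_rng(0)
chain_a = np.array([1.0, 0.6, 0.2])
chain_b = np.array([1.0, -0.7, 0.3])
rec = lambda chain: np.convolve(rng.standard_normal(40_000), chain)
fp_a1 = device_fingerprint(rec(chain_a))
fp_a2 = device_fingerprint(rec(chain_a))
fp_b1 = device_fingerprint(rec(chain_b))
```

Two recordings made through the same chain produce closely matching fingerprints, while recordings from a differently shaped chain do not, which is the discriminative behaviour a classifier would exploit.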
Procedia PDF Downloads 304
7138 Filling the Gap of Extraction of Digital Evidence from Emerging Platforms Without Forensics Tools
Authors: Yi Anson Lam, Siu Ming Yiu, Kam Pui Chow
Abstract:
Digital evidence has been tendered to courts at an exponential rate in recent years. As an industrial practice, most digital evidence is extracted and preserved using specialized, well-accepted forensics tools. On the other hand, advances in technology have enabled the creation of quite a few emerging platforms, such as Telegram and Signal. Existing (well-accepted) forensics tools were not designed to extract evidence from these emerging platforms. While new forensics tools require a significant amount of time and effort to be developed and verified, this paper addresses how to fill this gap using quick-fix alternative methods for digital evidence collection (e.g., based on APIs provided by the apps) and discusses issues related to the admissibility of such evidence in court, with support from international courts' stances and the circumstances of accepting digital evidence obtained using these proposed alternatives. Keywords: extraction, digital evidence, laws, investigation
Procedia PDF Downloads 67