Search results for: elliptic curve digital signature algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7092

6912 Fault Location Identification in High Voltage Transmission Lines

Authors: Khaled M. El Naggar

Abstract:

This paper introduces a digital method for fault section identification in transmission lines. The method uses digitized samples of the measured short-circuit current to locate faults in electrical power systems. The digitized current is used to construct an overdetermined system of equations, which is then solved using the proposed digital optimization technique to find the fault distance. The proposed optimization methodology is an application of the simulated annealing technique. The method is evaluated using a practical case study. The accurate results obtained show that the algorithm can be used as a powerful tool in the area of power system protection.
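
As an illustration of the optimization step described above, the sketch below minimises a least-squares residual over the fault distance with a generic simulated annealing loop; the short-circuit current model, cooling schedule, and parameter values are hypothetical placeholders rather than the paper's formulation.

```python
# Illustrative sketch only: generic simulated annealing over a least-squares
# residual. The current model f(t, d) below is hypothetical, not the paper's.
import math
import random

def residual(d, samples, model):
    """Sum of squared errors between measured samples and a model of the
    short-circuit current parameterised by the fault distance d."""
    return sum((y - model(t, d)) ** 2 for t, y in samples)

def anneal_fault_distance(samples, model, d0=50.0, t0=1.0, alpha=0.995, steps=5000):
    d_best = d_cur = d0
    e_best = e_cur = residual(d_cur, samples, model)
    temp = t0
    for _ in range(steps):
        d_new = d_cur + random.gauss(0.0, 1.0)          # propose a nearby distance
        e_new = residual(d_new, samples, model)
        # accept better solutions, or worse ones with Boltzmann probability
        if e_new < e_cur or random.random() < math.exp((e_cur - e_new) / temp):
            d_cur, e_cur = d_new, e_new
            if e_cur < e_best:
                d_best, e_best = d_cur, e_cur
        temp *= alpha                                    # geometric cooling schedule
    return d_best

# toy usage with a made-up current model whose amplitude decays with distance
true_d = 37.5
model = lambda t, d: math.exp(-0.01 * d) * math.sin(2 * math.pi * 50 * t)
samples = [(k / 1000.0, model(k / 1000.0, true_d)) for k in range(200)]
print(anneal_fault_distance(samples, model))
```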

Keywords: optimization, estimation, faults, measurement, high voltage, simulated annealing

Procedia PDF Downloads 369
6911 A Method to Enhance the Accuracy of Digital Forensic in the Absence of Sufficient Evidence in Saudi Arabia

Authors: Fahad Alanazi, Andrew Jones

Abstract:

Digital forensics seeks to achieve the successful investigation of digital crimes by obtaining acceptable evidence from digital devices that can be presented in a court of law. Thus, a digital forensics investigation is normally performed through a number of phases in order to achieve the required level of accuracy. Since 1984, a number of models and frameworks have been developed to support the digital investigation process. In this paper, we review a number of the investigation process models that have been produced over the years and introduce a proposed digital forensic model based on the scope of the Saudi Arabian investigation process. The proposed model integrates existing investigation process models and adds a new phase to deal with situations where there is initially insufficient evidence.

Keywords: digital forensics, process, metadata, traceback, Saudi Arabia

Procedia PDF Downloads 317
6910 Establishing Digital Forensics Capability and Capacity among Malaysia's Law Enforcement Agencies: Issues, Challenges and Recommendations

Authors: Sarah Taylor, Nor Zarina Zainal Abidin, Mohd Zabri Adil Talib

Abstract:

Although cybercrime is on the rise, many law enforcement agencies in Malaysia face difficulty in establishing their own digital forensics capability and capacity, mainly because of the high cost and the difficulty of convincing their management. A survey was conducted among Malaysia's law enforcement agencies that own a digital forensics laboratory to understand their history of building digital forensics capacity and capability, the challenges faced, and the impact of having their own laboratory on case investigation. The results of the study can be used by other law enforcement agencies to justify to their management the establishment of their own digital forensics capability and capacity.

Keywords: digital forensics, digital forensics capacity and capability, laboratory, law enforcement agency

Procedia PDF Downloads 201
6909 Infinite Impulse Response Digital Filters Design

Authors: Phuoc Si Nguyen

Abstract:

Infinite impulse response (IIR) filters can be designed from an analogue low-pass prototype by using frequency transformation in the s-domain and the bilinear z-transformation with frequency pre-warping; this method is known as frequency transformation from the s-domain to the z-domain. This paper introduces a new method to transform an IIR digital filter into another type of IIR digital filter (low pass, high pass, band pass, band stop, or narrow band) using a technique based on the inverse bilinear z-transformation and inverse matrices. First, a matrix equation is derived from the inverse bilinear z-transformation and Pascal's triangle. This Low Pass Digital to Digital Filter Pascal Matrix Equation is used to transform a low-pass digital filter into other digital filter types. From this equation and the inverse matrix, a Digital to Digital Filter Pascal Matrix Equation can be derived that is able to transform any IIR digital filter. The paper also introduces some specific matrices to replace the inverse matrix, which is difficult to determine due to the large size of the matrix in the current method. This makes computing and hand calculation easier when transforming from one IIR digital filter to another in the digital domain.
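
For reference, the sketch below (assuming NumPy and SciPy are available) shows the classical route the paper takes as its starting point: an analogue Butterworth prototype mapped through the bilinear z-transformation with frequency pre-warping, together with a lower-triangular matrix built from Pascal's triangle of the kind the Pascal-matrix formulation relies on. The paper's own matrix equations are not reproduced, and the sampling and cut-off frequencies are example values.

```python
# Illustrative sketch: classical bilinear design with pre-warping, plus a
# lower-triangular Pascal matrix for reference (not the paper's equations).
import numpy as np
from scipy import signal

fs = 8000.0          # sampling frequency, Hz (example value)
fc = 1000.0          # desired digital cut-off, Hz (example value)

# pre-warp the cut-off so the bilinear mapping lands it at the right place
wc_analog = 2 * fs * np.tan(np.pi * fc / fs)

# 4th-order analogue Butterworth low-pass prototype at the pre-warped frequency
b_s, a_s = signal.butter(4, wc_analog, btype='low', analog=True)

# bilinear z-transformation s -> 2*fs*(z-1)/(z+1)
b_z, a_z = signal.bilinear(b_s, a_s, fs=fs)
print("digital numerator:", b_z)
print("digital denominator:", a_z)

# an n x n lower-triangular Pascal matrix (binomial coefficients)
def pascal_lower(n):
    p = np.zeros((n, n), dtype=int)
    for i in range(n):
        p[i, 0] = 1
        for j in range(1, i + 1):
            p[i, j] = p[i - 1, j - 1] + p[i - 1, j]
    return p

print(pascal_lower(5))
```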

Keywords: bilinear z-transformation, frequency transformation, inverse bilinear z-transformation, IIR digital filters

Procedia PDF Downloads 386
6908 Digital Economy as an Alternative for Post-Pandemic Recovery in Latin America: A Literature Review

Authors: Armijos-Orellana Ana, González-Calle María, Maldonado-Matute Juan, Guerrero-Maxi Pedro

Abstract:

Nowadays, the digital economy represents a fundamental element in guaranteeing economic and social development, and its importance increased significantly with the arrival of the COVID-19 pandemic. However, despite the benefits it offers, it can also be detrimental to developing countries characterized by a wide digital divide. For this reason, the objective of this research was to identify and describe the main characteristics, benefits, and obstacles of the digital economy for Latin American countries. Through a bibliographic review covering the period 1995-2021, using the analytical-synthetic method, it was determined that the digital economy could give rise to structural changes, reduce inequality, and promote processes of social inclusion, as well as foster the construction and participatory development of organizational structures and institutional capacities in Latin American countries. However, the results showed that the digital economy is still incipient in the region and that at least three factors are needed to establish it: joint work between academia, the business sector, and the State; greater emphasis on learning and applying digital transformation; and the creation of policies that encourage the formation of digital organizations.

Keywords: developing countries, digital divide, digital economy, digital literacy, digital transformation

Procedia PDF Downloads 102
6907 A Coordinate-Based Heuristic Route Search Algorithm for Delivery Truck Routing Problem

Authors: Ahmed Tarek, Ahmed Alveed

Abstract:

The vehicle routing problem is a well-known research avenue in computing. Modern vehicle routing is increasingly focused on GPS-based coordinate systems, as state-of-the-art vehicle and trucking systems are equipped with digital navigation. In this paper, a new two-dimensional coordinate-based algorithm for addressing the vehicle routing problem for a supply chain network is proposed and explored, and the algorithm is compared with other available and recently devised heuristics. For the algorithms discussed, including the proposed coordinate-based search heuristic, the advantages and disadvantages associated with each heuristic are explored. The proposed algorithm is studied from the standpoint of a small supermarket chain delivery network that supplies its stores in four different states around the East Coast area and is trying to optimize its trucking delivery cost. Minimizing the delivery cost for the supply network of a supermarket chain is important to ensure its business success.
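
As a point of comparison only, the sketch below implements a generic coordinate-based heuristic (nearest neighbour over 2-D store coordinates); it is not the paper's proposed algorithm, and the depot and store coordinates are made up.

```python
# A minimal sketch of a generic coordinate-based route heuristic (nearest
# neighbour on 2-D store coordinates); the coordinates below are made up.
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbour_route(depot, stores):
    """Greedy route: always drive to the closest unvisited store next."""
    route, current = [depot], depot
    remaining = list(stores)
    while remaining:
        nxt = min(remaining, key=lambda s: dist(current, s))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    route.append(depot)                  # return to the depot
    total = sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
    return route, total

depot = (0.0, 0.0)
stores = [(2.0, 3.0), (5.0, 1.0), (1.0, 7.0), (6.0, 6.0)]
route, total = nearest_neighbour_route(depot, stores)
print(route, round(total, 2))
```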

Keywords: coordinate-based optimal routing, Hamiltonian Circuit, heuristic algorithm, traveling salesman problem, vehicle routing problem

Procedia PDF Downloads 115
6906 Factors Underlying the Digital Divide for Disabled People: Focus on a Korean Case Study

Authors: Soungwan Kim

Abstract:

This study identifies factors underlying the digital divide faced by disabled people. The analysis showed that the digital divide in PC use is affected by age, number of years of education, employment status, and household income of more than KRW 3 million. The digital divide in smart device use is affected by sex, age, number of years of education, time of onset of disability, and household income of more than KRW 3 million. Based on these results, this study proposes methods for bridging the digital divide faced by disabled people.

Keywords: digital divide, digital divide for the disabled, information accessibility for PCs and smart devices, information accessibility

Procedia PDF Downloads 225
6905 An Image Segmentation Algorithm for Gradient Target Based on Mean-Shift and Dictionary Learning

Authors: Yanwen Li, Shuguo Xie

Abstract:

In electromagnetic imaging, because the system is diffraction limited, pixel values change slowly near the edges of image targets and also vary with location within the same target. Using traditional digital image segmentation methods to segment electromagnetic gradient images can therefore produce many errors. To address this issue, this paper proposes a novel image segmentation and extraction algorithm based on Mean-Shift and dictionary learning. First, the preliminary segmentation results from an adaptive-bandwidth Mean-Shift algorithm are expanded, merged, and extracted. Then the overlap rate of the extracted image blocks is checked before determining a segmentation region containing a single complete target. Finally, the gradient edges of the extracted targets are recovered and reconstructed using a dictionary-learning algorithm, yielding final segmentation results that are very close to the gradient targets in the original image. Both the experimental and simulated results show that the segmentation results are very accurate; the Dice coefficients are improved by 70% to 80% compared with the Mean-Shift-only method.
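
The sketch below illustrates only the first stage, an adaptive-bandwidth Mean-Shift clustering over pixel intensity and position using scikit-learn; the synthetic gradient image and the intensity weighting are assumptions, and the dictionary-learning edge reconstruction is not shown.

```python
# Rough sketch of the preliminary stage only: adaptive-bandwidth Mean-Shift on
# (intensity, x, y) features of a synthetic gradient image.
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

# synthetic "gradient target" image: intensity changes slowly near the edges
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
image = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / 400.0)

# cluster on (intensity, x, y) so spatially coherent regions emerge
features = np.column_stack([image.ravel() * 50, xx.ravel(), yy.ravel()]).astype(float)
bandwidth = estimate_bandwidth(features, quantile=0.2, n_samples=500)
labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(features)

segmentation = labels.reshape(h, w)
print("number of preliminary segments:", len(np.unique(segmentation)))
```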

Keywords: gradient image, segmentation and extraction, mean-shift algorithm, dictionary learning

Procedia PDF Downloads 235
6904 Evaluation of Three Digital Graphical Methods of Baseflow Separation Techniques in the Tekeze Water Basin in Ethiopia

Authors: Alebachew Halefom, Navsal Kumar, Arunava Poddar

Abstract:

The purpose of this work is to specify parameter values, determine the baseflow index (BFI), and rank the methods that should be used for baseflow separation. Three different digital graphical approaches are chosen and compared in this study. Daily time series discharge data were collected from the site for a period of 30 years (1986 to 2015) and used to evaluate the algorithms. In order to separate the baseflow and the surface runoff, the daily recorded streamflow (m³/s) data were used to calibrate the procedures and obtain parameter values for the basin. The performance of each method was assessed using the standard error (SE), the coefficient of determination (R²), the flow duration curve (FDC), and baseflow indices. The findings indicate that, in general, each strategy can be used worldwide to separate baseflow; however, the sliding interval method (SIM) performs significantly better than the other two techniques in this basin. The average baseflow index was calculated to be 0.72 using the local minimum method, 0.76 using the fixed interval method, and 0.78 using the sliding interval method.
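
The sketch below shows one common formulation of the sliding interval idea (assigning each day the minimum discharge in a window of 2N*+1 days) and the resulting baseflow index; the half-width and the toy discharge series are illustrative, not the basin's calibrated values.

```python
# A minimal sketch of the sliding-interval idea in its common HYSEP-style
# form; the window half-width and discharge series are illustrative only.
import numpy as np

def sliding_interval_baseflow(q, half_width=2):
    """Assign to each day the minimum discharge found in a window of
    2*half_width + 1 days centred on that day."""
    q = np.asarray(q, dtype=float)
    n = len(q)
    base = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        base[i] = q[lo:hi].min()
    return base

def baseflow_index(q, base):
    return base.sum() / np.asarray(q, dtype=float).sum()

# toy daily streamflow series (m3/s)
q = [5.0, 6.2, 9.5, 20.0, 14.0, 8.0, 6.5, 6.0, 5.8, 5.5]
base = sliding_interval_baseflow(q, half_width=2)
print("BFI =", round(baseflow_index(q, base), 2))
```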

Keywords: baseflow index, digital graphical methods, streamflow, Emba Madre Watershed

Procedia PDF Downloads 48
6903 Unravelling the Knot: Towards a Definition of ‘Digital Labor’

Authors: Marta D'Onofrio

Abstract:

The debate on the digitalization of the economy has raised questions about how both labor and the regulation of work processes are changing due to the introduction of digital technologies into the productive system. Within the literature, the term 'digital labor' is commonly used to identify the impact of digitalization on labor. Despite the wide use of this term, an unambiguous definition of it is still not available, and this can create confusion in the use of terminology and in attempts at classification. The purpose of this paper is therefore to provide a definition and propose a classification of 'digital labor', drawing on the theoretical approach of organizational studies.

Keywords: digital labor, digitalization, data-driven algorithms, big data, organizational studies

Procedia PDF Downloads 117
6902 Development of Star Image Simulator for Star Tracker Algorithm Validation

Authors: Zoubida Mahi

Abstract:

A successful satellite mission in space requires a reliable attitude and orbit control system to command, control, and position the satellite in appropriate orbits. Several sensors are used for attitude control, such as magnetic sensors, earth sensors, horizon sensors, gyroscopes, and solar sensors. The star tracker is the most accurate of these sensors and is able to offer high-accuracy attitude control without the need for prior attitude information. There are mainly three approaches in star sensor research: digital simulation, hardware-in-the-loop simulation, and field tests of star observation. In the digital simulation approach, all of the processing is done in software, including star image simulation. Hence, it is necessary to develop star image simulation software that can simulate real space environments and various star sensor configurations. In this paper, we present a new stellar image simulation tool that is used to test and validate star sensor algorithms; the developed tool allows the simulation of stellar images with several types of noise, such as background noise, Gaussian noise, Poisson noise, and multiplicative noise, and several scenarios that occur in space, such as the presence of the Moon, optical system defects, illumination effects, and false objects. We also present a new star extraction algorithm based on a new centroid calculation method. We compared our algorithm with other star extraction algorithms from the literature, and the results obtained show the star extraction capability of the proposed algorithm.
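
As a simplified illustration of the simulation and extraction steps, the sketch below renders a single Gaussian star with background, Poisson, and Gaussian noise and recovers its position with a plain intensity-weighted centroid; the paper's new centroid method and full scenario set are not reproduced, and all image parameters are made up.

```python
# Simplified sketch: one synthetic star plus noise, recovered with a plain
# intensity-weighted centroid (not the paper's centroid method).
import numpy as np

rng = np.random.default_rng(0)
size, x0, y0, sigma = 64, 30.4, 22.7, 1.5
yy, xx = np.mgrid[0:size, 0:size]

star = 500.0 * np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma ** 2))
image = rng.poisson(star + 10.0).astype(float)        # shot noise + background
image += rng.normal(0.0, 2.0, image.shape)            # readout (Gaussian) noise

# detect pixels well above the background and compute the weighted centroid
mask = image > image.mean() + 5 * image.std()
w = image * mask
cx = (w * xx).sum() / w.sum()
cy = (w * yy).sum() / w.sum()
print("estimated centroid:", round(cx, 2), round(cy, 2), "true:", x0, y0)
```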

Keywords: star tracker, star simulation, star detection, centroid, noise, scenario

Procedia PDF Downloads 54
6901 Mitigating Denial of Service Attacks in Information Centric Networking

Authors: Bander Alzahrani

Abstract:

Information-centric networking (ICN), using architectures such as the Publish-Subscribe Internet Routing Paradigm (PSIRP), is one of the promising candidates for a future Internet and has recently been under the spotlight of the research community investigating the possibility of redesigning the current Internet architecture to solve issues such as routing scalability, security, and quality of service. Bloom filter-based forwarding is a source-routing approach used in the PSIRP architecture. This mechanism is vulnerable to brute-force attacks, which may lead to denial-of-service (DoS) attacks. In this work, we present a new forwarding approach that keeps the advantages of Bloom filter-based forwarding while mitigating attacks on the forwarding mechanism. In practice, we introduce a special type of forwarding node, called Edge-FW, to be placed at the edge of the network. The role of these nodes is to add an extra security layer by validating and inspecting packets at the edge of the network against brute-force attacks and checking whether a packet contains a legitimate forwarding identifier (FId) or not. We leverage a Certificateless Aggregate Signature (CLAS) scheme with a small size of 64 bits, which is used to sign the FId; this signature thus becomes bound to a specific FId. Therefore, malicious nodes that inject packets with random FIds will be easily detected and dropped at the Edge-FW node when the signature verification fails. Our preliminary security analysis suggests that, with the proposed approach, the forwarding plane is able to resist attacks such as DoS with very high probability.
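
The sketch below illustrates plain in-packet Bloom-filter forwarding of the kind used in PSIRP-style architectures; the filter length, hash count, and link names are illustrative, and the CLAS signature verification added at the Edge-FW nodes is not reproduced.

```python
# A sketch of plain Bloom-filter (in-packet filter) forwarding; only the basic
# membership test is shown, not the signature layer proposed in the paper.
import hashlib

M = 256          # filter length in bits (illustrative)
K = 5            # hash functions per link identifier (illustrative)

def link_bits(link_id: str):
    """Map a link identifier to K bit positions of the Bloom filter."""
    return {int(hashlib.sha256(f"{link_id}:{i}".encode()).hexdigest(), 16) % M
            for i in range(K)}

def build_fid(path_links):
    """OR together the bit sets of every link on the delivery path."""
    fid = 0
    for link in path_links:
        for b in link_bits(link):
            fid |= 1 << b
    return fid

def forward_on(fid, link_id):
    """A node forwards on a link iff all of that link's bits are set."""
    return all(fid >> b & 1 for b in link_bits(link_id))

fid = build_fid(["A-B", "B-C", "C-D"])
print(forward_on(fid, "B-C"))     # True: the link is on the path
print(forward_on(fid, "B-X"))     # usually False; a True here is a false positive
```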

Keywords: bloom filter, certificateless aggregate signature, denial-of-service, information centric network

Procedia PDF Downloads 168
6900 Prediction of Bodyweight of Cattle by Artificial Neural Networks Using Digital Images

Authors: Yalçın Bozkurt

Abstract:

Prediction models were developed for accurate prediction of bodyweight (BW) from digital images of beef cattle body dimensions using Artificial Neural Networks (ANN). For this purpose, animal data were collected at a private slaughterhouse; the digital images and the weight of each live animal were taken just before slaughter, and body dimensions such as digital wither height (DJWH), digital body length (DJBL), digital body depth (DJBD), digital hip width (DJHW), digital hip height (DJHH), and digital pin bone length (DJPL) were determined from the images, yielding 1069 observations for each trait. Prediction models were then developed by ANN. The digital body measurements were analysed by ANN for bodyweight prediction, and the R² values for DJBL, DJWH, DJHW, DJBD, DJHH, and DJPL were approximately 94.32, 91.31, 80.70, 83.61, 89.45, and 70.56%, respectively. It can be concluded that, in management situations where BW cannot be measured, it can be predicted accurately by measuring DJBL and DJWH alone or together with DJBD and even DJHH, and that different models may be needed to predict BW under different feeding and environmental conditions and for different breeds.
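
As a schematic of the modelling setup only, the sketch below trains a small feed-forward network (scikit-learn's MLPRegressor) to regress bodyweight on six measurements named after the paper's traits; the data are synthetic stand-ins for the 1069 real observations, so the reported R² has no relation to the paper's results.

```python
# A minimal sketch, not the paper's trained network: a small ANN regressing
# bodyweight on six synthetic body-measurement features.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 1069
# columns stand in for DJWH, DJBL, DJBD, DJHW, DJHH, DJPL (cm, synthetic)
X = rng.normal([130, 150, 70, 45, 128, 40], [8, 10, 5, 4, 8, 4], size=(n, 6))
# synthetic bodyweight: a made-up linear relation plus noise, in kg
bw = 3.2 * X[:, 1] + 2.5 * X[:, 0] + 1.8 * X[:, 2] + rng.normal(0, 15, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, bw, test_size=0.2, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                                   random_state=0))
model.fit(X_tr, y_tr)
print("R2 on held-out synthetic data:", round(r2_score(y_te, model.predict(X_te)), 3))
```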

Keywords: artificial neural networks, bodyweight, cattle, digital body measurements

Procedia PDF Downloads 336
6899 Using Printouts as Social Media Evidence and Its Authentication in the Courtroom

Authors: Chih-Ping Chang

Abstract:

Different from traditional objective evidence, social media evidence has its own characteristics: it is easily tampered with, it is recoverable, and it cannot be read without other devices (such as a computer). The original identity of a simple screenshot taken from a social network site must therefore be questioned. When the police search and seize digital information, a common practice is to print out the digital data obtained and ask the parties present to sign the printout, without taking the original digital data back. In addition to the issue of original identity, this way of obtaining evidence may have two further consequences. First, it invites the allegation that the evidence was tampered with because the police wanted to frame the suspect and falsified evidence. Second, it is not easy to discover hidden information. The core evidence associated with a crime may not appear in the contents of files; through discovery of the original file, data related to the file, such as the original producer, creation time, modification date, and even GPS location, can be revealed from hidden information. Therefore, how to present this kind of evidence in the courtroom is arguably the most important task when ruling on social media evidence. This article first introduces forensic software, such as EnCase, TCT, and FTK, and analyses its function in proving identity with other digital data. Turning back to the court, the second part of this article discusses the legal standard for authentication of social media evidence and the application of such forensic software in the courtroom. In conclusion, this article offers a rethinking: what kind of authenticity is this rule of evidence chasing? Does the legal system automatically adopt the transcription of scientific knowledge? Or does it seek to better render justice, not only on the basis of scientific fact, but through multifaceted debate?

Keywords: federal rule of evidence, internet forensic, printouts as evidence, social media evidence, United States v. Vayner

Procedia PDF Downloads 264
6898 Patient-Specific Modeling Algorithm for Medical Data Based on AUC

Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper

Abstract:

Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use the AUC as the metric for selecting an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
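
The sketch below illustrates the core attribute-selection idea, scoring each candidate attribute by the AUC obtained when its values alone rank the outcome; it uses a public scikit-learn dataset and is not the full patient-specific decision path algorithm.

```python
# A small sketch of AUC-based attribute selection (not the full PSDP method):
# each attribute is scored by how well its raw values rank the class label.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score

data = load_breast_cancer()
X, y = data.data, data.target

def auc_of_attribute(values, labels):
    """AUC of a single attribute used directly as a ranking score
    (folded around 0.5 so 'reversed' attributes are not penalised)."""
    auc = roc_auc_score(labels, values)
    return max(auc, 1.0 - auc)

scores = [auc_of_attribute(X[:, j], y) for j in range(X.shape[1])]
best = int(np.argmax(scores))
print("best attribute:", data.feature_names[best], "AUC =", round(scores[best], 3))
```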

Keywords: instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions

Procedia PDF Downloads 446
6897 Blind Watermarking Using Discrete Wavelet Transform Algorithm with Patchwork

Authors: Toni Maristela C. Estabillo, Michaela V. Matienzo, Mikaela L. Sabangan, Rosette M. Tienzo, Justine L. Bahinting

Abstract:

This study concerns blind watermarking of images with different categories and properties using two algorithms, namely the Discrete Wavelet Transform (DWT) and the Patchwork Algorithm. A program was created to perform watermark embedding, extraction, and evaluation. The evaluation is based on three watermarking criteria: image quality degradation, perceptual transparency, and security. Image quality is measured by comparing the original properties with those of the processed image. Perceptual transparency is measured by visual inspection in a survey. Security is measured by implementing geometrical and non-geometrical attacks in a pass/fail test. The values used to measure these criteria are mostly based on the Mean Squared Error (MSE) and the Peak Signal-to-Noise Ratio (PSNR). The results rely on statistical methods used to collect and interpret the data, such as averaging, the z-test, and the survey. The study concluded that the combined DWT and Patchwork algorithms were less efficient and less capable of watermarking than the DWT algorithm alone.
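
The sketch below shows a DWT-domain embedding step only (PyWavelets assumed available), with a synthetic cover image and binary mark; the patchwork stage, extraction, and the study's evaluation protocol are not reproduced, and the chosen subband and embedding strength are arbitrary.

```python
# Illustrative DWT-domain embedding only; the patchwork stage and extraction
# are not shown, and the cover image and mark are synthetic.
import numpy as np
import pywt

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(128, 128)).astype(float)    # stand-in image
watermark = rng.integers(0, 2, size=(64, 64)).astype(float)    # binary mark

# one-level 2-D Haar decomposition of the cover image
LL, (LH, HL, HH) = pywt.dwt2(cover, 'haar')

# additively embed the mark into the HH (detail) subband with strength alpha
alpha = 4.0
HH_marked = HH + alpha * (2 * watermark - 1)       # map {0,1} -> {-1,+1}

watermarked = pywt.idwt2((LL, (LH, HL, HH_marked)), 'haar')

mse = np.mean((cover - watermarked) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse)
print("PSNR of watermarked image:", round(psnr, 2), "dB")
```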

Keywords: blind watermarking, discrete wavelet transform algorithm, patchwork algorithm, digital watermark

Procedia PDF Downloads 241
6896 Sorting Fish by Hu Moments

Authors: J. M. Hernández-Ontiveros, E. E. García-Guerrero, E. Inzunza-González, O. R. López-Bonilla

Abstract:

This paper presents the implementation of an algorithm that identifies and counts different fish species: Catfish, Sea bream, Sawfish, Tilapia, and Totoaba. The main contribution of the method is the fusion of the position, rotation, and scale invariance of the Hu moments with proper counting of the fish. The identification and counting are performed on images under different noise conditions. The experimental results obtained indicate the potential of the proposed algorithm to be applied in different scenarios of aquaculture production.
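
As an illustration of the underlying building blocks, the sketch below computes the seven Hu moment invariants for objects detected in a synthetic binary image via OpenCV; the species templates and counting logic of the paper are not reproduced, and the elliptical shapes stand in for fish silhouettes.

```python
# A minimal sketch: Hu moments per detected object, on synthetic shapes that
# stand in for fish silhouettes (not the paper's dataset or classifier).
import cv2
import numpy as np

# synthetic binary image with two elliptical "fish"
img = np.zeros((200, 300), dtype=np.uint8)
cv2.ellipse(img, (80, 100), (50, 20), 15, 0, 360, 255, -1)
cv2.ellipse(img, (210, 90), (35, 15), -10, 0, 360, 255, -1)

contours, _ = cv2.findContours(img, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
print("objects found:", len(contours))

for c in contours:
    hu = cv2.HuMoments(cv2.moments(c)).flatten()
    # log-scale the seven invariants, as is common, to make them comparable
    hu_log = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
    print(np.round(hu_log, 2))
```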

Keywords: counting fish, digital image processing, invariant moments, pattern recognition

Procedia PDF Downloads 381
6895 Theorizing Digital Transformation, Digitization and Digitalization in Africa Emerging Research in Digital Business: A Critical Review of the Current Scholarship

Authors: Ayanda Magida

Abstract:

This paper provides a critical review of the current state-of-the-art literature on emerging digital business theories, focusing specifically on the emergent theories of digital transformation, digitization, and digitalization and their importance in the global south. Digital business is an emergent field that cuts across different existing disciplines. The aim of the paper is threefold. First, it provides conceptual and theoretical definitions of digital transformation, digitization, and digitalization; there is a growing need to distinguish between these terms on a theoretical and conceptual basis, as they tend to be confused and are often used interchangeably. The second aim is to focus on the emerging theories of digital transformation and digital business. Finally, the paper provides a critical review of the importance of scholarship in the field from the global south. A systematic review of the literature was conducted across different research databases to identify the major theories in the field of digital business and to argue critically for the global south stance. Much of the research on the development and adoption of digital technologies, specifically digital transformation, has been done in the West and in developed countries; there is thus a dearth of research conducted in developing countries and the global south.

Keywords: digital transformation, digitization, digital business, digitalization

Procedia PDF Downloads 232
6894 Comparison of Existing Predictor and Development of Computational Method for S-Palmitoylation Site Identification in Arabidopsis Thaliana

Authors: Ayesha Sanjana Kawser Parsha

Abstract:

S-acylation is an irreversible bond in which cysteine residues are linked to the fatty acids palmitate (74%) or stearate (22%), either at the COOH or NH2 terminus, via a thioester linkage. There are several experimental methods that can be used to identify S-palmitoylation sites; however, since they require a lot of time, computational methods are becoming increasingly necessary. There are not many predictors, however, that can locate S-palmitoylation sites in Arabidopsis thaliana with sufficient accuracy, and this research is motivated by the importance of building a better prediction tool. To identify the type of machine learning algorithm that predicts this site more accurately for the experimental dataset, several prediction tools were examined, including GPS PALM 6.0, pCysMod, GPS LIPID 1.0, CSS PALM 4.0, and NBA PALM. These analyses were conducted by constructing receiver operating characteristic plots and computing the area under the curve score. An AI-driven deep learning-based prediction tool was then developed using this analysis and three sequence-based input representations: the amino acid composition, the binary encoding profile, and autocorrelation features. The model was developed using five layers, two activation functions, and the associated parameters and hyperparameters. It was built using various combinations of features and, after training and validation, performed best when all the features were present, using the experimental dataset for 8- and 10-fold cross-validation. When testing the model with new, unseen data, such as the GPS PALM 6.0 plant dataset and the pCysMod mouse dataset, the model performed better, and the area under the curve score was near 1. By comparing the area under the curve score of 10-fold cross-validation of the new model with the established tools' scores on their respective training sets, it can be demonstrated that this model outperforms the prior tools in predicting S-palmitoylation sites in the experimental dataset. The objective of this study is to develop a prediction tool for Arabidopsis thaliana that is more accurate than current tools, as measured by the area under the curve score. Both plant food production and immunological treatment targets can be managed by using this method to forecast S-palmitoylation sites.
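
The toy sketch below illustrates one of the named input representations, the binary (one-hot) encoding of a residue window around a cysteine, evaluated with 10-fold cross-validated AUC; the sequences and labels are random placeholders rather than the Arabidopsis dataset, and the paper's deep model is not reproduced.

```python
# Toy sketch: binary (one-hot) encoding of 21-residue windows plus 10-fold
# cross-validated AUC; random placeholder data, not the paper's dataset/model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

AA = "ACDEFGHIKLMNPQRSTVWY"
rng = np.random.default_rng(0)

def one_hot(window: str) -> np.ndarray:
    """Binary encoding: 20 bits per residue position."""
    vec = np.zeros(len(window) * len(AA))
    for i, aa in enumerate(window):
        vec[i * len(AA) + AA.index(aa)] = 1.0
    return vec

# random 21-residue windows centred on 'C' with random site labels
windows = ["".join(rng.choice(list(AA), 10)) + "C" + "".join(rng.choice(list(AA), 10))
           for _ in range(300)]
labels = rng.integers(0, 2, size=300)

X = np.stack([one_hot(w) for w in windows])
aucs = cross_val_score(LogisticRegression(max_iter=1000), X, labels,
                       cv=10, scoring="roc_auc")
print("mean 10-fold AUC:", round(aucs.mean(), 3))   # ~0.5 on random labels
```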

Keywords: S-palmitoylation, ROC plot, area under the curve, cross-validation score

Procedia PDF Downloads 42
6893 Improved Processing Speed for Text Watermarking Algorithm in Color Images

Authors: Hamza A. Al-Sewadi, Akram N. A. Aldakari

Abstract:

Copyright protection and ownership proof of digital multimedia are achieved nowadays by digital watermarking techniques. A text watermarking algorithm for protecting the property rights and ownership judgment of color images is proposed in this paper. Embedding is achieved by inserting text elements randomly into the color image as noise. The YIQ image processing model is found to be faster than other image processing models and is therefore adopted for the embedding process. An optional step of encrypting the text watermark before embedding is also suggested (in case it is required by some applications), where the text can be encrypted using any enciphering technique, adding more difficulty for attackers. Experiments showed an embedding speed improvement of more than double the speed of other systems considered (such as the least significant bit method and separate color code methods), and a fairly acceptable level of peak signal-to-noise ratio (PSNR) with low mean square error values for watermarking purposes.
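
The sketch below shows only the colour-model step, the standard RGB to YIQ conversion and its inverse, with a tiny perturbation standing in for the embedded text elements; the random placement of text and the optional encryption are not shown.

```python
# A sketch of the YIQ step only: standard RGB <-> YIQ conversion; the small
# perturbation of the I channel merely stands in for the embedded text.
import numpy as np

RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])
YIQ2RGB = np.linalg.inv(RGB2YIQ)

def rgb_to_yiq(img):
    return img @ RGB2YIQ.T          # img shape (H, W, 3), values in [0, 1]

def yiq_to_rgb(yiq):
    return np.clip(yiq @ YIQ2RGB.T, 0.0, 1.0)

rng = np.random.default_rng(0)
img = rng.random((4, 4, 3))
yiq = rgb_to_yiq(img)
yiq[..., 1] += 0.01                 # tiny change standing in for embedded text
restored = yiq_to_rgb(yiq)
print("max round-trip change:", float(np.abs(restored - img).max()))
```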

Keywords: steganography, watermarking, time complexity measurements, private keys

Procedia PDF Downloads 115
6892 The Role of Business Process Management in Driving Digital Transformation: Insurance Company Case Study

Authors: Dalia Suša Vugec, Ana-Marija Stjepić, Darija Ivandić Vidović

Abstract:

Digital transformation is one of the latest trends on the global market. In order to maintain competitive advantage and sustainability, an increasing number of organizations are conducting digital transformation processes, changing their business processes and creating new business models with the help of digital technologies. In that sense, one should also observe the role of business process management (BPM) and its maturity in driving digital transformation. Therefore, the goal of this paper is to investigate the role of BPM in the digital transformation process within one organization. Since experience from practice shows that organizations from the financial sector can be regarded as leaders in digital transformation, an insurance company was selected to participate in the study, owing to the high level of its BPM maturity and the fact that it has previously been through a digital transformation process. To fulfil the goals of the paper, several interviews, as well as questionnaires, were conducted within the selected company. The results are presented in the form of a case study. They indicate that the digital transformation process within the observed company has been successful, with special focus on the development of digital strategy, BPM, and change management. The role of BPM in the digital transformation of the observed company is further discussed in the paper.

Keywords: business process management, case study, Croatia, digital transformation, insurance company

Procedia PDF Downloads 156
6891 Handshake Algorithm for Minimum Spanning Tree Construction

Authors: Nassiri Khalid, El Hibaoui Abdelaaziz, Hajar Moha

Abstract:

In this paper, we introduce and analyse a probabilistic distributed algorithm for the construction of a minimum spanning tree on a network. This algorithm is based on the handshake concept. Initially, each network node is considered a sub-spanning tree, and at each round of the execution of our algorithm, sub-spanning trees are merged. The execution continues until all sub-spanning trees are merged into one. We analyse this algorithm using a stochastic process.
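
As a simplified, centralised illustration of the merging idea, the sketch below runs Boruvka-style rounds in which every component adopts its cheapest outgoing edge until a single spanning tree remains; the distributed handshake pairing and its probabilistic analysis are not modelled, and the example graph is made up.

```python
# Centralised simulation of merging rounds (Boruvka-style), not the paper's
# distributed handshake algorithm; assumes distinct edge weights.
class DSU:
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        self.parent[ra] = rb
        return True

def merge_rounds_mst(n, edges):
    """edges: list of (weight, u, v). Returns the MST edge set."""
    dsu, mst = DSU(n), []
    while len(mst) < n - 1:
        cheapest = {}                              # component root -> best edge
        for w, u, v in edges:
            ru, rv = dsu.find(u), dsu.find(v)
            if ru == rv:
                continue
            for r in (ru, rv):
                if r not in cheapest or w < cheapest[r][0]:
                    cheapest[r] = (w, u, v)
        for w, u, v in cheapest.values():          # merge sub-spanning trees
            if dsu.union(u, v):
                mst.append((u, v, w))
    return mst

edges = [(4, 0, 1), (1, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
print(merge_rounds_mst(4, edges))
```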

Keywords: spanning tree, distributed algorithm, handshake algorithm, matching, probabilistic analysis

Procedia PDF Downloads 628
6890 Robust Data Image Watermarking for Data Security

Authors: Harsh Vikram Singh, Ankur Rai, Anand Mohan

Abstract:

In this paper, we propose a secure and robust data hiding algorithm based on the DCT, the Arnold transform, and a chaotic sequence. The watermark image is scrambled by the Arnold cat map to increase its security, and then the chaotic map is used to spread the watermark signal in the middle band of the DCT coefficients of the cover image. The chaotic map can be used as a pseudo-random generator for digital data hiding to increase security and robustness. The performance of the proposed algorithm in terms of robustness and imperceptibility was evaluated using the bit error rate (BER), normalized correlation (NC), and peak signal-to-noise ratio (PSNR) for different watermark and cover images, such as the Lena, Girl, and Tank images, and different gain factors. We use a binary logo image and a text image as watermarks. The experimental results demonstrate that the proposed algorithm achieves higher security and robustness against JPEG compression, as well as other attacks such as the addition of noise, low-pass filtering, and cropping, compared to other existing algorithms using DCT coefficients. Moreover, to recover the watermarks in the proposed algorithm, there is no need for the original cover image.
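
The sketch below illustrates only the Arnold cat-map scrambling step on a small binary mark; the chaotic spreading into the mid-band DCT coefficients of the cover image and the evaluation against attacks are not shown.

```python
# A sketch of the Arnold cat-map scrambling step alone; the DCT embedding and
# chaotic spreading of the paper are not shown.
import numpy as np

def arnold_scramble(img, iterations=1):
    """Apply the Arnold cat map (x, y) -> (x + y, x + 2y) mod N to a square image."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "cat map needs a square image"
    out = img.copy()
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = nxt
    return out

rng = np.random.default_rng(0)
mark = rng.integers(0, 2, size=(8, 8))
scrambled = arnold_scramble(mark, iterations=3)
print(scrambled)

# the map is periodic; for an 8x8 image this map has period 6, so 3 more
# iterations restore the original mark
print(np.array_equal(arnold_scramble(scrambled, iterations=3), mark))
```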

Keywords: data hiding, watermarking, DCT, chaotic sequence, Arnold transform

Procedia PDF Downloads 482
6889 Acoustic Echo Cancellation Using Different Adaptive Algorithms

Authors: Hamid Sharif, Nazish Saleem Abbas, Muhammad Haris Jamil

Abstract:

An adaptive filter is a filter that self-adjusts its transfer function according to an optimization algorithm driven by an error signal. Because of the complexity of the optimization algorithms, most adaptive filters are digital filters. Adaptive filtering constitutes one of the core technologies in digital signal processing and finds numerous application areas in science as well as in industry, including adaptive noise cancellation and echo cancellation. Acoustic echo is a common occurrence in today's telecommunication systems; the signal interference it causes is distracting to both users and reduces the quality of the communication. In this paper, we review different adaptive filtering techniques for reducing this unwanted echo and examine the behaviour of algorithms such as Least Mean Square (LMS), Normalized Least Mean Square (NLMS), Variable Step-Size Least Mean Square (VSLMS), Variable Step-Size Normalized Least Mean Square (VSNLMS), the New Varying Step Size LMS Algorithm (NVSSLMS), and Recursive Least Squares (RLS), with the aim of increasing communication quality.
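
As an illustration of the adaptive-filtering principle reviewed here, the sketch below runs a normalized LMS (NLMS) filter that learns a synthetic echo path from the far-end signal and subtracts the estimated echo from the microphone signal; the echo path, step size, and filter length are arbitrary example values.

```python
# A compact NLMS sketch: an adaptive FIR filter learns a synthetic echo path
# and subtracts its echo estimate from the microphone signal.
import numpy as np

rng = np.random.default_rng(0)
N = 5000
far_end = rng.normal(size=N)                       # far-end (loudspeaker) signal
true_path = np.array([0.6, 0.3, -0.2, 0.1])        # unknown room echo path
echo = np.convolve(far_end, true_path)[:N]
mic = echo + 0.01 * rng.normal(size=N)             # microphone = echo + noise

L, mu, eps = 8, 0.5, 1e-6                          # filter taps, step size
w = np.zeros(L)
err = np.zeros(N)
for n in range(L, N):
    x = far_end[n - L + 1:n + 1][::-1]             # most recent L samples
    y_hat = w @ x                                  # estimated echo
    err[n] = mic[n] - y_hat                        # residual sent back
    w += mu * err[n] * x / (x @ x + eps)           # NLMS weight update

print("first taps of learned path:", np.round(w[:4], 2))
print("residual echo power (last 1000 samples):",
      round(float(np.mean(err[-1000:] ** 2)), 4))
```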

Keywords: adaptive acoustic, echo cancellation, LMS algorithm, adaptive filter, normalized least mean square (NLMS), variable step-size least mean square (VSLMS)

Procedia PDF Downloads 45
6888 Evaluating Key Attributes of Effective Digital Games in Tertiary Education

Authors: Roopali Kulkarni, Yuliya Khrypko

Abstract:

A major problem in educational digital game design is that game developers are often focused on maintaining the fun and playability of an educational game, whereas educators are more concerned with the learning aspect of the game rather than its entertaining characteristics. There is a clear need to understand what key aspects of digital learning games make them an effective learning medium in tertiary education. Through a systematic literature review and content analysis, this paper identifies, evaluates, and summarizes twenty-three key attributes of digital games used in tertiary education and presents a summary digital game-based learning (DGBL) model for designing and evaluating an educational digital game of any genre that promotes effective learning in tertiary education. The proposed solution overcomes limitations of previously designed models for digital game evaluation, such as a small number of game attributes considered or applicability to a specific genre of digital games. The proposed DGBL model can be used to assist game designers and educators with creating effective and engaging educational digital games for the tertiary education curriculum.

Keywords: DGBL model, digital games, educational games, game-based learning, tertiary education

Procedia PDF Downloads 236
6887 Horizontal Circular Curve Computations Using a Developed Calculator

Authors: Adil Hassabo

Abstract:

In this paper, a horizontal circular curve computation calculator developed for Microsoft Windows is presented. The developed calculator can be used to determine the information required for setting out horizontal curves. Three methods are implemented in the developed program, namely the incremental chord method, the total chord method, and the coordinates method. Computation of horizontal curves with the developed calculator is faster, easier, more accurate, and less subject to errors compared with the traditional method of calculation. Finally, the results obtained by the traditional method and by the developed calculator are presented to check the behaviour of the developed calculator.
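
As an illustration of the coordinates method, the sketch below computes deflection angles and tangent/offset coordinates for pegs at a regular arc interval along a circular curve; the radius, intersection angle, and peg interval are example values, not taken from the paper.

```python
# Illustrative setting-out table for a circular curve by coordinates:
# pegs at regular arc intervals, deflection angle from the tangent, and
# (x, y) offsets measured from the point of curvature. Example inputs only.
import math

def curve_setting_out(radius, intersection_angle_deg, peg_interval):
    """Return (arc length, deflection angle in degrees, x, y) for each peg."""
    total_arc = radius * math.radians(intersection_angle_deg)
    pegs = []
    l = peg_interval
    while l <= total_arc + 1e-9:
        theta = l / radius                       # angle subtended at the centre
        deflection = math.degrees(theta / 2.0)   # deflection from the tangent
        x = radius * math.sin(theta)             # distance along the tangent
        y = radius * (1.0 - math.cos(theta))     # offset from the tangent
        pegs.append((round(l, 2), round(deflection, 4), round(x, 3), round(y, 3)))
        l += peg_interval
    return pegs

for row in curve_setting_out(radius=200.0, intersection_angle_deg=30.0, peg_interval=20.0):
    print(row)
```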

Keywords: calculator, circular, computations, curve

Procedia PDF Downloads 125
6886 Role of Digital Economy in the Emerging Countries Like Nigeria

Authors: Aminu Fagge Muhammad

Abstract:

The digital economy is fast becoming the most innovative and widest-reaching economy in the world, especially in developing countries. This paper aims to examine the role of the digital economy in emerging countries like Nigeria. The methodology used in the study is the business model perspective: lying between the process and structural perspectives, it brings in the idea of the new business models being enabled, e.g., e-business or e-commerce. The paper concludes by outlining the policy objectives and measures, and the processes and structures, necessary to enhance digital economy growth and its contribution to socio-economic development. The findings reveal that digital infrastructure is in part incomplete, costly, and poorly performing in emerging economies like Nigeria, and that the wider digital ecosystem suffers from a shortfall in human capabilities, weak financing, and poor governance. It is also found that growth in the digital economy is exacerbating digital exclusion, inequality, adverse incorporation, and other digital harms. It is recommended that government, in partnership with the private sector, should build strong local infrastructure to enable broadband availability and accessibility and create an enabling environment for strong competition in the telecom and technology ecosystem.

Keywords: digital economy, emerging countries, business model, Nigeria

Procedia PDF Downloads 89
6885 Refuge(e)s in Digital Diaspora: Reimagining and Reimaging ‘Ethnically Cleansed’ Villages as ‘Cyber Villages’

Authors: Hariz Halilovich

Abstract:

Based on conventional and digital ethnography, this paper discusses the ways Bosnian refugees utilise digital technologies and new media to recreate, synchronise, and sustain their identities and memories in the aftermath of 'ethnic cleansing' and genocide, and in the context of their new emplacements and home-making practices in diaspora. In addition to discussing representations of displacement and emplacement in the 'digital age', the paper also aims to contribute to the understanding and application of digital ethnography as an emerging method of inquiry in anthropology and related social science disciplines. While some researchers see digital ethnography as exclusively online-based research, the author of this paper argues that it is critical to understand the online world in the context of the real world, made of real people, places, and social relations.

Keywords: Bosnia, cyber villages, digital diaspora, refugees

Procedia PDF Downloads 211
6884 Quantifying the Second-Level Digital Divide on Sub-National Level with a Composite Index

Authors: Vladimir Korovkin, Albert Park, Evgeny Kaganer

Abstract:

The paper studies the second-level digital divide (the divide defined by how digital technology is used in everyday life) between regions of the Russian Federation. It offers a systematic review of the literature on measuring the digital divide and, based on this, suggests a composite Digital Life Index that captures the complex, multi-dimensional character of the phenomenon. The index model studies digital supply and demand separately across seven independent dimensions, providing 14 subindices. The index is based on Internet-borne data, a distinction from traditional research approaches that rely on official statistics or surveys. Regression analysis is used to determine the relative importance of factors such as income, human capital, and policy in determining the digital divide. The results of the analysis suggest that the digital divide is driven more by differences in demand (defined by consumer competencies) than in supply; the role of income is insignificant, and the quality of human capital is the key determinant of the divide. The paper advances the existing methodological literature on the issue and can also inform practical decision-making regarding strategies of national and regional digital development.

Keywords: digital transformation, second-level digital divide, composite index, digital policy, regional development, Russia

Procedia PDF Downloads 152
6883 Identity-Based Encryption: A Comparison of Leading Classical and Post-Quantum Implementations in an Enterprise Setting

Authors: Emily Stamm, Neil Smyth, Elizabeth O'Sullivan

Abstract:

In Identity-Based Encryption (IBE), an identity, such as a username, email address, or domain name, acts as the public key. IBE consolidates the PKI by eliminating the repetitive process of requesting public keys for each message encryption. Two of the most popular schemes are Sakai-Kasahara (SAKKE), which is based on elliptic curve pairings, and the Ducas, Lyubashevsky, and Prest lattice scheme (DLP-Lattice), which is based on quantum-secure lattice cryptography. In order to embed the schemes in a standard enterprise setting, both schemes are implemented as shared system libraries and integrated into a REST service that functions at the enterprise level. The performance of both schemes as libraries and services is compared, and the practicalities of implementation and application are discussed. Our performance results indicate that although SAKKE has the smaller key and ciphertext sizes, DLP-Lattice is significantly faster overall, and we recommend it for most enterprise use cases.

Keywords: identity-based encryption, post-quantum cryptography, lattice-based cryptography, IBE

Procedia PDF Downloads 83