Search results for: elliptic curve digital signature algorithm
6724 Seismic Fragility Curves Methodologies for Bridges: A Review
Authors: Amirmozafar Benshams, Khatere Kashmari, Farzad Hatami, Mesbah Saybani
Abstract:
As a part of the transportation network, bridges are among the most vulnerable structures. In order to investigate the vulnerability and seismic performance of bridges, identifying the damage states associated with a bridge is important. Fragility curves provide important data about the damage states and performance of bridges against earthquakes. The development of vulnerability information in the form of fragility curves is a widely practiced approach when the information has to account for the multitude of uncertain sources involved. This paper presents fragility curve methodologies for bridges and investigates the practice and applications relating to the seismic fragility assessment of bridges.
Keywords: fragility curve, bridge, uncertainty, NLTHA, IDA
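As a worked illustration of the fragility-curve concept reviewed here, the sketch below evaluates the lognormal form commonly used for bridge damage states; the median and dispersion values are illustrative assumptions, not results from the reviewed studies.

```python
import numpy as np
from scipy.stats import norm

def fragility(im, theta, beta):
    """P(damage state exceeded | intensity measure im) for a lognormal fragility curve."""
    return norm.cdf(np.log(im / theta) / beta)

# Illustrative parameters for a hypothetical damage state (not from the reviewed papers):
# theta = median PGA (g) at which the damage state is reached, beta = lognormal dispersion.
pga = np.linspace(0.05, 2.0, 40)                 # intensity measure grid (peak ground acceleration, g)
curve = fragility(pga, theta=0.45, beta=0.6)
print(np.round(curve[:5], 4))
```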
Procedia PDF Downloads 282
6723 An Adaptive Back-Propagation Network and Kalman Filter Based Multi-Sensor Fusion Method for Train Location System
Authors: Yu-ding Du, Qi-lian Bao, Nassim Bessaad, Lin Liu
Abstract:
The Global Navigation Satellite System (GNSS) is regarded as an effective approach for replacing the large number of track-side balises used in modern train localization systems. This paper describes a method based on the data fusion of a GNSS receiver and an odometer sensor that can significantly improve positioning accuracy. A digital track map is needed as another sensor to project the two-dimensional GNSS position onto a one-dimensional along-track distance, because the train's position is constrained to the track. A model trained by a BP neural network is used to estimate the trend positioning error, which is related to the specific location and the proximate processing of the digital track map. Considering that in some conditions satellite signal failure will increase the GNSS positioning error, a detection step for the GNSS signal is applied. An adaptive weighted fusion algorithm is presented to reduce the standard deviation of the train speed measurement. Finally, an Extended Kalman Filter (EKF) is used to fuse the projected 1-D GNSS positioning data and the 1-D train speed data to obtain the position estimate. Experimental results suggest that the proposed method performs well and reduces the positioning error notably.
Keywords: multi-sensor data fusion, train positioning, GNSS, odometer, digital track map, map matching, BP neural network, adaptive weighted fusion, Kalman filter
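A minimal sketch of the along-track fusion step described above, assuming a scalar Kalman filter that dead-reckons with the odometer speed and corrects with the map-projected 1-D GNSS position; the noise variances and toy data are assumptions for illustration, and the BP-network error correction and adaptive weighting are omitted.

```python
import numpy as np

def kf_fuse_track(gnss_s, odo_v, dt=1.0, q=0.5, r=3.0):
    """Scalar Kalman filter along the track: predict with odometer speed,
    correct with the map-projected 1-D GNSS position.
    q, r are assumed process / measurement noise variances."""
    s_est, p = gnss_s[0], 10.0            # initial along-track state and covariance
    fused = []
    for z, v in zip(gnss_s, odo_v):
        s_pred = s_est + v * dt           # predict by dead-reckoning the odometer speed
        p_pred = p + q
        if np.isnan(z):                   # GNSS outage detected: keep the prediction
            s_est, p = s_pred, p_pred
        else:
            k = p_pred / (p_pred + r)     # Kalman gain
            s_est = s_pred + k * (z - s_pred)
            p = (1.0 - k) * p_pred
        fused.append(s_est)
    return np.array(fused)

# toy data: train moving at ~10 m/s, noisy GNSS, one simulated signal outage
rng = np.random.default_rng(7)
gnss = 10.0 * np.arange(20) + rng.normal(0, 2.0, 20)
gnss[8:11] = np.nan
odo = 10.0 + rng.normal(0, 0.2, 20)
print(np.round(kf_fuse_track(gnss, odo)[:6], 1))
```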
Procedia PDF Downloads 252
6722 Digital Forensics Showdown: Encase and FTK Head-to-Head
Authors: Rida Nasir, Waseem Iqbal
Abstract:
Due to the constant revolution in technology and the increase in anti-forensic techniques used by attackers to remove their traces, professionals often struggle to choose the best tool to be used in digital forensic investigations. This paper compares two of the most well-known and widely used licensed commercial tools, i.e., EnCase and FTK. The comparison was drawn on various parameters and features to provide an authentic evaluation of the licensed versions of these well-known commercial tools against various real-world scenarios. In order to discover the popularity of these tools within the digital forensic community, a public survey was conducted to determine the preferred choice. The dataset used is the Computer Forensics Reference Dataset (CFReDS). A total of 70 features were selected from various categories. Upon comparison, both FTK and EnCase produce remarkable results. However, each tool has some limitations, and neither tool is declared the best. The comparison drawn is completely unbiased and based on factual data.
Keywords: digital forensics, commercial tools, investigation, forensic evaluation
Procedia PDF Downloads 19
6721 Simulation and Controller Tuning in a Photo-Bioreactor Applying the Taguchi Method
Authors: Hosein Ghahremani, MohammadReza Khoshchehre, Pejman Hakemi
Abstract:
This study involves numerical simulations of a vertical plate-type photo-bioreactor to investigate the performance of the microalga Spirulina, together with control and optimization of the digital controller parameters by the Taguchi method, carried out with MATLAB and Qualitek-4 software. Because, in addition to parameters such as temperature, dissolved carbon dioxide and biomass, new physical parameters such as light intensity and physiological conditions like photosynthetic efficiency and light inhibition are involved in the biological processes, control faces many challenges. Photo-bioreactors not only facilitate the commercial production of microalgae as efficient systems for aquaculture feed and food supplements, but are also used as a possible platform for the production of active molecules such as antibiotics or innovative anti-tumor agents, for carbon dioxide removal, and for the removal of heavy metals from wastewater. A digital controller is designed to control the light of the bioreactor, and the microalgae growth rate and the carbon dioxide concentration inside the bioreactor are investigated. The optimal values of the controller parameters obtained from the S/N and ANOVA analysis in Qualitek-4 were compared with the reaction curve, Cohen-Coon and Ziegler-Nichols methods. Based on the sum of squared error obtained for each of the control methods mentioned, the Taguchi method was selected as the best method for controlling the light intensity of the photo-bioreactor. Compared to the other control methods listed, this method shows higher stability and a shorter response time.
Keywords: photo-bioreactor, control and optimization, light intensity, Taguchi method
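To make the Taguchi comparison step concrete, the sketch below computes the "smaller-the-better" signal-to-noise ratio often used to rank parameter settings by their squared error; the error values for the three tuning methods are hypothetical, not the study's measurements.

```python
import numpy as np

def sn_smaller_is_better(errors):
    """Taguchi 'smaller-the-better' signal-to-noise ratio for a set of error observations."""
    errors = np.asarray(errors, dtype=float)
    return -10.0 * np.log10(np.mean(errors ** 2))

# hypothetical sum-of-squared-error observations for three tuning methods (illustrative only)
results = {
    "Taguchi":         [0.12, 0.10, 0.11],
    "Cohen-Coon":      [0.25, 0.22, 0.27],
    "Ziegler-Nichols": [0.30, 0.28, 0.33],
}
for name, errs in results.items():
    print(f"{name:16s} S/N = {sn_smaller_is_better(errs):.2f} dB")  # larger S/N is better
```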
Procedia PDF Downloads 394
6720 Self-Marketing on Line Person-to-Person Social Media
Authors: Chih-Ping Chen
Abstract:
Today, technology does not necessitate change; rather, social media has afforded a new arena and digital tools for users/individuals to be symbolized and marketed in meaningful exchanges of digital identities. We argue that these symbolic interactions may afford individuals the ability to create and present less restricted Line person-to-person (P2P) chats than would be possible in face-to-face communications. Individuals can select flexible influence strategies to market themselves, which enables them to create and present their digital identities and impressions in alternative ways within a dynamic sociocultural context. Therefore, this paper aims to explore the novel phenomenon of how individuals market themselves to manage their digital identities and impressions to connect with other users through the symbolic interactions created by new digital tools (e.g., stickers). A netnographic approach was developed by applying a triangulated methodology consisting of user self-diary reports, in-depth interviews, and observations. In total, 20 participants (10 females and 10 males) were of Taiwanese origin, and their ages ranged from 20–47 years old. The findings of this research showed that individuals market themselves on Line P2P social media, where traditional cultural gender norms have shifted. Both male and female participants market their modern digital identities by adopting a combination of flexible influence tactics/strategies when using digital stickers. Some findings showed that their influence tactics/strategies often flouted Taiwanese cultural gender norms or skirted traditional rules to fit individual or P2P needs. Finally, these findings potentially contribute to the literature regarding consumer culture theory and symbolic interaction theory in the digital marketing and social media fields.
Keywords: consumer culture theory, digital sticker, self-marketing, impression, symbolic interaction
Procedia PDF Downloads 81
6719 Multi-Objective Variable Neighborhood Search Algorithm to Solving Scheduling Problem with Transportation Times
Authors: Majid Khalili
Abstract:
This paper deals with a bi-objective hybrid no-wait flowshop scheduling problem minimizing the makespan and total weighted tardiness, in which we consider transportation times between stages. Obtaining an optimal solution for this type of complex, large-sized problem in reasonable computational time by using traditional approaches and optimization tools is extremely difficult. This paper presents a new multi-objective variable neighborhood search algorithm (MOVNS). A set of experimental instances is carried out to evaluate the algorithm by advanced multi-objective performance measures. The algorithm is carefully evaluated for its performance against an available algorithm by means of multi-objective performance measures and statistical tools. The related results show that a variant of our proposed MOVNS provides sound performance compared with other algorithms.
Procedia PDF Downloads 418
6718 The Digital Living Archive and the Construction of a Participatory Cultural Memory in the DARE-UIA Project: Digital Environment for Collaborative Alliances to Regenerate Urban Ecosystems in Middle-Sized Cities
Authors: Giulia Cardoni, Francesca Fabbrii
Abstract:
Living archives perform a function of social memory sharing, which contributes to building social bonds, communities, and identities. This potential lies in the ability of living archives to combine an archival function, which allows the conservation and transmission of memory, with an artistic, performative and creative function linked to the present. As part of the DARE-UIA (Digital environment for collaborative alliances to regenerate urban ecosystems in middle-sized cities) project, the creation of a living digital archive made it possible to create a narrative that would consolidate the cultural memory of the Darsena district of the city of Ravenna. The aim of the project is to stimulate the urban regeneration of a suburban area of a city, enhancing its cultural memory and identity heritage through digital heritage tools. The methodology involves various digital storytelling actions necessary for the overall narrative, using georeferencing systems (GIS), storymaps and 3D reconstructions for a transversal narration of historical content, such as personal and institutional historical photos, and to enhance the industrial archeology heritage of the neighborhood. The aim is the creation of an interactive narrative that is replicable in contexts similar to the Darsena district in Ravenna. The living archive, in which all the digital contents are inserted, manifests itself to the outside in the form of a museum spread throughout the neighborhood, making the contents usable on smartphones via QR codes and totems installed on-site and creating thematic itineraries spread around the neighborhood. The construction of an interactive and engaging digital narrative has made it possible to enhance the material and immaterial heritage of the neighborhood by recreating the community that has historically always distinguished it.
Keywords: digital living archive, digital storytelling, GIS, 3D, open-air museum, urban regeneration, cultural memory
Procedia PDF Downloads 106
6717 Simulation of 3-D Direction-of-Arrival Estimation Using MUSIC Algorithm
Authors: Duckyong Kim, Jong Kang Park, Jong Tae Kim
Abstract:
DOA (Direction of Arrival) estimation is an important method in array signal processing and has a wide range of applications such as direction finding, beamforming, and so on. In this paper, we briefly introduce the MUSIC (Multiple Signal Classification) algorithm, one of the DOA estimation methods for analyzing several targets. Then we apply the MUSIC algorithm to a two-dimensional antenna array to analyze DOA estimation in 3D space through MATLAB simulation. We also analyze the design factors that can affect the accuracy of DOA estimation through simulation and proceed with further consideration of how to apply the system.
Keywords: DOA estimation, MUSIC algorithm, spatial spectrum, array signal processing
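A minimal sketch of the MUSIC spatial-spectrum computation, assuming a 1-D uniform linear array for brevity rather than the two-dimensional array used in the paper; the array size, source angles and noise level are illustrative assumptions.

```python
import numpy as np

def music_spectrum(X, n_sources, d=0.5, angles=np.linspace(-90, 90, 361)):
    """1-D MUSIC pseudospectrum for an M-element uniform linear array.
    X: M x N snapshot matrix, d: element spacing in wavelengths."""
    M = X.shape[0]
    R = X @ X.conj().T / X.shape[1]                 # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(R)            # eigenvalues in ascending order
    En = eigvecs[:, : M - n_sources]                # noise subspace
    p = []
    for theta in np.deg2rad(angles):
        a = np.exp(2j * np.pi * d * np.arange(M) * np.sin(theta))   # steering vector
        p.append(1.0 / np.abs(a.conj() @ En @ En.conj().T @ a))
    return angles, np.array(p)

# toy example: two sources at -20 and 35 degrees, 8-element ULA, 200 snapshots
M, N = 8, 200
true = np.deg2rad([-20, 35])
A = np.exp(2j * np.pi * 0.5 * np.outer(np.arange(M), np.sin(true)))
S = (np.random.randn(2, N) + 1j * np.random.randn(2, N)) / np.sqrt(2)
X = A @ S + 0.1 * (np.random.randn(M, N) + 1j * np.random.randn(M, N))
ang, P = music_spectrum(X, n_sources=2)
# report the two strongest local maxima of the pseudospectrum (expected near -20 and 35 deg)
idx = [i for i in range(1, len(P) - 1) if P[i] > P[i - 1] and P[i] > P[i + 1]]
idx = sorted(idx, key=lambda i: -P[i])[:2]
print(sorted(ang[i] for i in idx))
```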
Procedia PDF Downloads 379
6716 Digital Skepticism In A Legal Philosophical Approach
Authors: Dr. Bendes Ákos
Abstract:
Digital skepticism, a critical stance towards digital technology and its pervasive influence on society, presents significant challenges when analyzed from a legal philosophical perspective. This abstract aims to explore the intersection of digital skepticism and legal philosophy, emphasizing the implications for justice, rights, and the rule of law in the digital age. Digital skepticism arises from concerns about privacy, security, and the ethical implications of digital technology. It questions the extent to which digital advancements enhance or undermine fundamental human values. Legal philosophy, which interrogates the foundations and purposes of law, provides a framework for examining these concerns critically. One key area where digital skepticism and legal philosophy intersect is in the realm of privacy. Digital technologies, particularly data collection and surveillance mechanisms, pose substantial threats to individual privacy. Legal philosophers must grapple with questions about the limits of state power and the protection of personal autonomy. They must consider how traditional legal principles, such as the right to privacy, can be adapted or reinterpreted in light of new technological realities. Security is another critical concern. Digital skepticism highlights vulnerabilities in cybersecurity and the potential for malicious activities, such as hacking and cybercrime, to disrupt legal systems and societal order. Legal philosophy must address how laws can evolve to protect against these new forms of threats while balancing security with civil liberties. Ethics plays a central role in this discourse. Digital technologies raise ethical dilemmas, such as the development and use of artificial intelligence and machine learning algorithms that may perpetuate biases or make decisions without human oversight. Legal philosophers must evaluate the moral responsibilities of those who design and implement these technologies and consider the implications for justice and fairness. Furthermore, digital skepticism prompts a reevaluation of the concept of the rule of law. In an increasingly digital world, maintaining transparency, accountability, and fairness becomes more complex. Legal philosophers must explore how legal frameworks can ensure that digital technologies serve the public good and do not entrench power imbalances or erode democratic principles. Finally, the intersection of digital skepticism and legal philosophy has practical implications for policy-making. Legal scholars and practitioners must work collaboratively to develop regulations and guidelines that address the challenges posed by digital technology. This includes crafting laws that protect individual rights, ensure security, and promote ethical standards in technology development and deployment. In conclusion, digital skepticism provides a crucial lens for examining the impact of digital technology on law and society. A legal philosophical approach offers valuable insights into how legal systems can adapt to protect fundamental values in the digital age. By addressing privacy, security, ethics, and the rule of law, legal philosophers can help shape a future where digital advancements enhance, rather than undermine, justice and human dignity.
Keywords: legal philosophy, privacy, security, ethics, digital skepticism
Procedia PDF Downloads 44
6715 Economic Growth and Transport Carbon Dioxide Emissions in New Zealand: A Co-Integration Analysis of the Environmental Kuznets Curve
Authors: Mingyue Sheng, Basil Sharp
Abstract:
Greenhouse gas (GHG) emissions from national transport account for the largest share of emissions from energy use in New Zealand. Whether the environmental Kuznets curve (EKC) relationship exists between environmental degradation indicators from the transport sector and economic growth in New Zealand remains unclear. This paper aims at exploring the causality relationship between CO₂ emissions from the transport sector, fossil fuel consumption, and Gross Domestic Product (GDP) per capita in New Zealand, using annual data for the period 1977 to 2013. First, conventional unit root tests (Augmented Dickey-Fuller and Phillips-Perron tests) and a unit root test with a breakpoint (Zivot-Andrews test) are employed to examine the stationarity of the variables. Second, the autoregressive distributed lag (ARDL) bounds test for co-integration, followed by Granger causality tests, is used to investigate causality among the variables. Empirical results of the study reveal that, in the short run, there is unidirectional causality running from economic growth to transport CO₂ emissions, as well as bidirectional causality between transport CO₂ emissions and road energy consumption.
Keywords: economic growth, transport carbon dioxide emissions, environmental Kuznets curve, causality
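A brief sketch of the unit-root and Granger-causality steps described above, using statsmodels on synthetic stand-in series (the New Zealand 1977-2013 data are not reproduced here); the ARDL bounds test itself is omitted.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

# synthetic stand-in series of the same length as the 1977-2013 sample (illustrative only)
rng = np.random.default_rng(0)
gdp = np.cumsum(rng.normal(0.02, 0.01, 37))            # log GDP per capita (toy)
co2 = 0.8 * np.roll(gdp, 1) + rng.normal(0, 0.01, 37)  # transport CO2 lagging GDP (toy)
df = pd.DataFrame({"co2": co2[1:], "gdp": gdp[1:]})

# Step 1: unit-root check on each series (ADF test)
for col in df:
    stat, pval = adfuller(df[col])[:2]
    print(f"ADF {col}: statistic = {stat:.3f}, p-value = {pval:.3f}")

# Step 2: does GDP Granger-cause transport CO2? (column order: effect first, cause second)
res = grangercausalitytests(df[["co2", "gdp"]], maxlag=2)
print({lag: round(r[0]["ssr_ftest"][1], 4) for lag, r in res.items()})  # p-values per lag
```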
Procedia PDF Downloads 300
6714 A Laundry Algorithm for Colored Textiles
Authors: H. E. Budak, B. Arslan-Ilkiz, N. Cakmakci, I. Gocek, U. K. Sahin, H. Acikgoz-Tufan, M. H. Arslan
Abstract:
The aim of this study is to design a novel laundry algorithm for colored textiles which have a significant decoloring problem. During the experimental work, bleached knitted single jersey fabric made of 100% cotton and dyed with reactive dyestuff was utilized, since according to a conducted survey textiles made of cotton are the most demanded textile products in the market, and reactive dyestuffs are the ones most commonly used in the textile industry for dyeing cotton-made products. Therefore, the fabric used in this study was selected and purchased in accordance with the survey results. The fabric samples cut from this fabric were dyed with different dyeing parameters using Remazol Brilliant Red 3BS dyestuff in a Gyrowash machine under laboratory conditions. From the alternative reactive-dyed cotton fabric samples, the ones that have a high tendency to color loss were determined and examined. Accordingly, the parameters of the dyeing process used for these fabric samples were evaluated, and the dyeing process chosen to cause a high tendency to color loss for the cotton fabrics was determined in order to reveal clearly the level of improvement in color loss achieved in this study. Afterwards, all of the untreated fabric samples cut from the purchased fabric were dyed with the selected dyeing process. When the dyeing process was completed, an experimental design was created for the laundering process by using the Minitab® program, considering temperature, time and mechanical action as parameters. All of the washing experiments were performed in a domestic washing machine. 16 washing experiments were performed with 8 different experimental conditions and 2 repeats for each condition. After each of the washing experiments, water samples from the main wash of the laundering process were measured with a UV spectrophotometer. The values obtained were compared with the calibration curve of the materials used for the dyeing process. The results of the washing experiments were statistically analyzed with the Minitab® program. According to the results, the most suitable washing algorithm in terms of the parameters temperature, time and mechanical action for domestic washing machines for minimizing fabric color loss was chosen. The laundry algorithm proposed in this study has the ability to minimize the color loss of colored textiles in washing machines by eliminating the negative effects of the laundering parameters on the color of textiles without compromising the fundamental cleaning action being performed properly. Therefore, since fabric color loss is minimized with this washing algorithm, dyestuff residuals will definitely be lower in the grey water released from the laundering process. In addition, with this laundry algorithm it is possible to wash and clean other types of textile products with a proper cleaning effect and minimized color loss.
Keywords: color loss, laundry algorithm, textiles, domestic washing process
Procedia PDF Downloads 357
6713 A Good Start for Digital Transformation of the Companies: A Literature and Experience-Based Predefined Roadmap
Authors: Batuhan Kocaoglu
Abstract:
Nowadays digital transformation is a hot topic in both service and production businesses. Companies that want to stay alive in the coming years should change how they do business. Industry leaders have started to upgrade their ERP (Enterprise Resource Planning) backbone technologies with digital advances such as analytics, mobility, sensor-embedded smart devices, AI (Artificial Intelligence) and more. Selecting the appropriate technology for the related business problem is also a hot topic. Besides this, to operate in the modern environment and fulfill rapidly changing customer expectations, a digital transformation of the business is required, one that changes the way the business runs and affects how it does business. Even though the digital transformation term is trendy, the literature is limited and covers just the philosophy instead of a solid implementation plan. Current studies urge firms to start their digital transformation, but few tell us how to do it. Huge investments and blurry definitions and concepts scare companies. The aim of this paper is to solidify the steps of the digital transformation and offer a roadmap for companies and academicians. The proposed roadmap is developed based upon insights from a literature review, semi-structured interviews, and expert views to explore and identify crucial steps. We introduce our roadmap in the form of 8 main steps: Awareness; Planning; Operations; Implementation; Go-live; Optimization; Autonomation; Business Transformation; including a total of 11 sub-steps with examples. This study also emphasizes four dimensions of the digital transformation, mainly: readiness assessment; building organizational infrastructure; building technical infrastructure; and maturity assessment. Finally, the roadmap relates the steps to three main terms used in digital transformation literacy: digitization, digitalization, and digital transformation. The resulting model shows that 'business process' and 'organizational issues' should be resolved before technology decisions and 'digitization'. Companies can start their journey with these solid steps, using the proposed roadmap to increase the success of their project implementation. Our roadmap is also adaptable for relevant Industry 4.0 and enterprise application projects. This roadmap will be useful for companies to persuade their top management to invest. Our results can be used as a baseline for further research related to readiness assessment and maturity assessment studies.
Keywords: digital transformation, digital business, ERP, roadmap
Procedia PDF Downloads 170
6712 Data Hiding by Vector Quantization in Color Image
Authors: Yung Gi Wu
Abstract:
With the growth of computers and networks, digital data can be spread anywhere in the world quickly. In addition, digital data can be copied or tampered with easily, so the security issue becomes an important topic in the protection of digital data. A digital watermark is a method to protect the ownership of digital data. Embedding the watermark will certainly influence the quality. In this paper, Vector Quantization (VQ) is used to embed the watermark into the image to fulfill the goal of data hiding. This kind of watermarking is invisible, which means that users will not be conscious of the existence of the embedded watermark even though the embedded image has a tiny difference compared to the original image. Meanwhile, VQ carries a heavy computation burden, so we adopt a fast VQ encoding scheme based on partial distortion searching (PDS) and a mean approximation scheme to speed up the data hiding process. The watermarks we hide in the image can be gray, bi-level and color images. Text can also be regarded as a watermark to embed. In order to test the robustness of the system, we adopt Photoshop to apply sharpening, cropping and altering to check if the extracted watermark is still recognizable. Experimental results demonstrate that the proposed system can resist the above three kinds of tampering in general cases.
Keywords: data hiding, vector quantization, watermark, color image
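A minimal sketch of the partial distortion search (PDS) speed-up mentioned above, assuming plain squared-error VQ encoding; the codebook, block size and the parity-based bit-hiding hint in the final comment are illustrative assumptions, not the paper's exact embedding rule.

```python
import numpy as np

def nearest_codeword_pds(block, codebook):
    """Partial distortion search: abandon a codeword as soon as its running
    squared error exceeds the best distortion found so far."""
    best_idx, best_dist = 0, np.inf
    for i, cw in enumerate(codebook):
        dist = 0.0
        for b, c in zip(block, cw):
            dist += (b - c) ** 2
            if dist >= best_dist:          # early termination: cannot beat current best
                break
        else:                              # full distortion computed and it is the new best
            best_idx, best_dist = i, dist
    return best_idx

# toy 4x4 image block flattened to a 16-D vector, random codebook of 64 codewords
rng = np.random.default_rng(1)
codebook = rng.integers(0, 256, (64, 16)).astype(float)
block = rng.integers(0, 256, 16).astype(float)
idx = nearest_codeword_pds(block, codebook)
# hiding one watermark bit could then, for example, constrain the parity of the chosen index
print(idx, idx % 2)
```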
Procedia PDF Downloads 364
6711 Reactive Power Cost Evaluation with FACTS Devices in Restructured Power System
Authors: A. S. Walkey, N. P. Patidar
Abstract:
It is not always economical to provide reactive power using synchronous alternators. The cost of reactive power can be minimized by the optimal placement of FACTS devices in power systems. In this paper, a Particle Swarm Optimization-Sequential Quadratic Programming (PSO-SQP) algorithm is applied to minimize the cost of reactive power generation along with real power generation and to alleviate bus voltage violations. The effectiveness of the proposed approach is tested on the IEEE 14-bus system. In this paper, in addition to synchronous generators, FACTS devices are also proposed to procure the reactive power demand in the power system.
Keywords: reactive power, reactive power cost, voltage security margins, capability curve, FACTS devices
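A minimal sketch of the particle-swarm half of the PSO-SQP approach, assuming a toy quadratic reactive-power cost with a voltage-violation penalty; the objective, bounds and coefficients are illustrative assumptions, and the SQP refinement stage is omitted.

```python
import numpy as np

def pso(cost, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (the SQP refinement stage of PSO-SQP is not shown)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([cost(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# stand-in objective: quadratic "reactive power cost" plus a penalty when a toy bus
# voltage (0.93 + 0.02 * total VAR injection, p.u.) drops below 0.95 p.u.
def cost(q):
    voltage = 0.93 + 0.02 * q.sum()
    return np.sum(0.1 * q ** 2) + 50.0 * max(0.0, 0.95 - voltage)

lo, hi = np.full(3, -1.0), np.full(3, 1.0)   # three hypothetical VAR sources (p.u.)
print(pso(cost, (lo, hi)))
```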
Procedia PDF Downloads 506
6710 AI-Based Information System for Hygiene and Safety Management of Shared Kitchens
Authors: Jongtae Rhee, Sangkwon Han, Seungbin Ji, Junhyeong Park, Byeonghun Kim, Taekyung Kim, Byeonghyeon Jeon, Jiwoo Yang
Abstract:
The shared kitchen is a concept that transfers the value of the sharing economy to the kitchen. It is a type of kitchen equipped with cooking facilities that allows multiple companies or chefs to share time and space and use it jointly. These shared kitchens provide economic benefits and convenience, such as reduced investment costs and rent, but also increase safety management risks, such as cross-contamination of food ingredients. Therefore, to manage the safety of food ingredients and finished products in a shared kitchen where several entities jointly use the kitchen and handle various types of food ingredients, it is critical to manage the following: the freshness of food ingredients, user hygiene and safety, and cross-contamination of cooking equipment and facilities. This study proposes a machine learning-based system for hygiene safety and cross-contamination management, which are highly difficult to manage. User clothing management and user access management, which are most relevant to the hygiene and safety of shared kitchens, are solved through a machine learning-based methodology, and cutting board usage management, which is most relevant to cross-contamination management, is implemented as an integrated safety management system based on artificial intelligence. First, to prevent cross-contamination of food ingredients, we use images collected through a real-time camera to determine whether the food ingredients match a given cutting board based on a real-time object detection model, YOLO v7. To manage the hygiene of user clothing, we use a camera-based facial recognition model to recognize the user, and a real-time object detection model to determine whether a sanitary hat and mask are worn. In addition, to manage access for users qualified to enter the shared kitchen, we utilize a machine learning-based signature recognition module. By comparing the pairwise distance between the contract signature and the signature at the time of entrance to the shared kitchen, access permission is determined through a pre-trained signature verification model. These machine learning-based safety management tasks are integrated into a single information system, and each result is managed in an integrated database. Through this, users are warned of safety dangers through the tablet PC installed in the shared kitchen, and managers can track the cause of sanitary and safety accidents. As a result of the system integration analysis, real-time safety management services can be continuously provided by artificial intelligence, and machine learning-based methodologies are used for the integrated safety management of shared kitchens, which allows dynamic contracts among various users. By solving this problem, we were able to secure the feasibility and safety of the shared kitchen business.
Keywords: artificial intelligence, food safety, information system, safety management, shared kitchen
Procedia PDF Downloads 69
6709 Medial Axis Analysis of Valles Marineris
Authors: Dan James
Abstract:
The medial axis of the main canyon of Valles Marineris is determined geometrically with maximally inscribed discs aligned with the boundaries, or rims, of the main canyon. Inscribed discs are placed at evenly spaced longitude intervals and, using the radius function, the locus of the centres of all discs is determined, together with the disc centre co-ordinates. These centre co-ordinates result in arrays of x, y co-ordinates which are curve fitted to a sinusoidal function, and residuals appropriate for nonlinear regression are evaluated using the R-squared value (R2) and the Root Mean Squared Error (RMSE). This evaluation demonstrates that a sinusoidal curve closely fits the co-ordinate data.
Keywords: medial axis, MAT, valles marineris, sinusoidal
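A short sketch of the sinusoidal fit and the R2 / RMSE evaluation described above, using scipy's curve_fit on synthetic centre co-ordinates; the longitude range, amplitude and noise are assumptions standing in for the actual disc-centre data.

```python
import numpy as np
from scipy.optimize import curve_fit

def sinusoid(x, a, b, c, d):
    return a * np.sin(b * x + c) + d

# synthetic disc-centre co-ordinates: x = longitude offset (deg), y = latitude (deg),
# standing in for the medial-axis centre data extracted from the canyon rims
rng = np.random.default_rng(6)
x = np.linspace(0.0, 45.0, 46)
y = 1.8 * np.sin(0.12 * x + 1.0) - 9.5 + rng.normal(0, 0.1, x.size)

popt, _ = curve_fit(sinusoid, x, y, p0=[2.0, 0.1, 0.5, -9.0])
resid = y - sinusoid(x, *popt)
rmse = np.sqrt(np.mean(resid ** 2))
r2 = 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
print(f"a, b, c, d = {np.round(popt, 3)}, RMSE = {rmse:.3f}, R^2 = {r2:.3f}")
```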
Procedia PDF Downloads 100
6708 Part of Speech Tagging Using Statistical Approach for Nepali Text
Authors: Archit Yajnik
Abstract:
Part of speech tagging has always been a challenging task in the era of natural language processing. This article presents POS tagging for Nepali text using a Hidden Markov Model and the Viterbi algorithm. From the annotated Nepali corpus, training and testing data sets are randomly separated. Both methods are employed on the data sets. The Viterbi algorithm is found to be computationally faster and more accurate as compared to HMM. An accuracy of 95.43% is achieved using the Viterbi algorithm. An error analysis of where the mismatches took place is elaborately discussed.
Keywords: hidden markov model, natural language processing, POS tagging, viterbi algorithm
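A minimal sketch of Viterbi decoding for an HMM tagger, assuming a tiny hand-made tag set, vocabulary and probability tables; these are illustrative stand-ins, not the transition and emission estimates from the Nepali corpus.

```python
import numpy as np

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely tag sequence for an observation sequence under an HMM (log-space)."""
    V = np.full((len(obs), len(states)), -np.inf)
    back = np.zeros((len(obs), len(states)), dtype=int)
    V[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for t in range(1, len(obs)):
        for s in range(len(states)):
            scores = V[t - 1] + np.log(trans_p[:, s]) + np.log(emit_p[s, obs[t]])
            back[t, s], V[t, s] = np.argmax(scores), np.max(scores)
    path = [int(np.argmax(V[-1]))]
    for t in range(len(obs) - 1, 0, -1):      # backtrack
        path.append(int(back[t, path[-1]]))
    return [states[s] for s in reversed(path)]

# toy model (tags, words and probabilities are hypothetical, not corpus estimates)
states = ["NOUN", "VERB"]
vocab = {"ram": 0, "khana": 1, "khanchha": 2}
start_p = np.array([0.7, 0.3])
trans_p = np.array([[0.5, 0.5],               # P(next tag | NOUN)
                    [0.8, 0.2]])              # P(next tag | VERB)
emit_p = np.array([[0.5, 0.45, 0.05],         # P(word | NOUN)
                   [0.05, 0.05, 0.9]])        # P(word | VERB)
print(viterbi([vocab["ram"], vocab["khana"], vocab["khanchha"]],
              states, start_p, trans_p, emit_p))
```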
Procedia PDF Downloads 329
6707 Digital Transformation in Production Planning and Control: Evaluation of the Organizational Readiness
Authors: Tobias Wissing, Peter Burggräf, Johannes Wagner
Abstract:
Cost pressure, competitiveness and the increasing turbulence of globalized, saturated markets have been the drivers for a variety of research activities in the field of production planning and control (PPC) during the past decades. For some time now, an increasing awareness of innovative technologies in terms of Industry 4.0 can be noticed. Although there are many promising approaches, merely installing those smart solutions will not maximize PPC performance. To accelerate a successful digital transformation, the cooperation between employees and technology also has to be adapted. The existing processes and organizational structures might not be sufficient to maximize the utilization of technological innovations. This paper presents the key results of an extensive study which was conducted by the Laboratory for Machine Tools and Production Engineering (WZL) of RWTH Aachen University to evaluate the current situation and examine the organizational readiness for this digital transformation.
Keywords: cyber-physical production system, digital transformation, industry 4.0, production planning and control
Procedia PDF Downloads 353
6706 Hybrid Gravity Gradient Inversion-Ant Colony Optimization Algorithm for Motion Planning of Mobile Robots
Authors: Meng Wu
Abstract:
Motion planning is a common task required to be fulfilled by robots. A strategy combining Ant Colony Optimization (ACO) and a gravity gradient inversion algorithm is proposed for the motion planning of mobile robots. In this paper, in order to realize an optimal motion planning strategy, the cost function in ACO is designed based on the gravity gradient inversion algorithm. The obstacles around the mobile robot cause gravity gradient anomalies; a gradiometer is installed on the mobile robot to detect these anomalies. After obtaining the anomalies, the gravity gradient inversion algorithm is employed to calculate the relative distance and orientation between the mobile robot and the obstacles. The relative distance and orientation deduced from the gravity gradient inversion algorithm are employed as the cost function in the ACO algorithm to realize motion planning. The proposed strategy is validated by simulation and experimental results.
Keywords: motion planning, gravity gradient inversion algorithm, ant colony optimization
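A minimal sketch of the ACO part of the strategy on a small grid, assuming the obstacle locations are already known (in the paper they would come from the gravity gradient inversion step); the grid size, obstacle layout, heuristic and ACO parameters are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N, START, GOAL = 10, (0, 0), (9, 9)
OBST = {(4, y) for y in range(2, 8)}          # obstacle cells (assumed already localized)
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def heuristic(cell):
    # larger when closer to the goal; an obstacle-distance term could also be folded in here
    return 1.0 / (1.0 + abs(cell[0] - GOAL[0]) + abs(cell[1] - GOAL[1]))

def walk(tau, alpha=1.0, beta=3.0, max_steps=200):
    """One ant builds a path by pheromone- and heuristic-weighted random moves."""
    path, seen = [START], {START}
    while path[-1] != GOAL and len(path) < max_steps:
        x, y = path[-1]
        cands = [(x + dx, y + dy) for dx, dy in MOVES
                 if 0 <= x + dx < N and 0 <= y + dy < N
                 and (x + dx, y + dy) not in OBST and (x + dx, y + dy) not in seen]
        if not cands:
            return None                        # dead end, ant is discarded
        w = np.array([tau[c] ** alpha * heuristic(c) ** beta for c in cands])
        nxt = cands[rng.choice(len(cands), p=w / w.sum())]
        path.append(nxt); seen.add(nxt)
    return path if path[-1] == GOAL else None

tau, best = np.ones((N, N)), None
for _ in range(30):                            # iterations
    tau *= 0.9                                 # pheromone evaporation
    for _ in range(20):                        # ants per iteration
        p = walk(tau)
        if p:
            if best is None or len(p) < len(best):
                best = p
            for c in p:
                tau[c] += 1.0 / len(p)         # deposit proportional to path quality
if best:
    print(len(best) - 1, "steps, path starts:", best[:4])
```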
Procedia PDF Downloads 137
6705 Incorporating Anomaly Detection in a Digital Twin Scenario Using Symbolic Regression
Authors: Manuel Alves, Angelica Reis, Armindo Lobo, Valdemar Leiras
Abstract:
In Industry 4.0, it is common to have a lot of sensor data. In this deluge of data, hints of possible problems are difficult to spot. The digital twin concept aims to help answer this problem, but it is mainly used as a monitoring tool to handle the visualisation of data. Failure detection is of paramount importance in any industry, and it consumes a lot of resources. Any improvement in this regard is of tangible value to the organisation. The aim of this paper is to add the ability to forecast test failures, curtailing detection times. To achieve this, several anomaly detection algorithms were compared with a symbolic regression approach. To this end, Isolation Forest, One-Class SVM and an auto-encoder have been explored. For symbolic regression, the PySR library was used. The first results show that this approach is valid and can be added to the tools available in this context as a low-resource anomaly detection method since, after training, the only requirement is the calculation of a polynomial, a useful feature in the digital twin context.
Keywords: anomaly detection, digital twin, industry 4.0, symbolic regression
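As a concrete example of one of the baseline detectors compared above, the sketch below runs scikit-learn's Isolation Forest on synthetic sensor readings; the feature values and contamination rate are assumptions, and the PySR symbolic-regression alternative is not shown here.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(3)
# synthetic "sensor" features: mostly normal test cycles plus a few drifted ones
normal = rng.normal(loc=[50.0, 1.2], scale=[1.0, 0.05], size=(500, 2))
faulty = rng.normal(loc=[56.0, 0.9], scale=[1.0, 0.05], size=(10, 2))
X = np.vstack([normal, faulty])

clf = IsolationForest(contamination=0.02, random_state=0)
labels = clf.fit_predict(X)                 # -1 = anomaly, 1 = normal
print("flagged as anomalous:", np.where(labels == -1)[0])
```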
Procedia PDF Downloads 120
6704 The Impact of Bitcoin and Cryptocurrency on the Development of Community
Authors: Felib Ayman Shawky Salem
Abstract:
Nowadays, cryptocurrency has become a global phenomenon known to most people. People use this alternative digital money for transactions in many ways (e.g., online shopping, wealth management, and fundraising). However, this digital asset is also widely used in criminal activities, since it relies on decentralized control as opposed to centralized electronic money and central banking systems, which makes a user of this currency invisible. The high-value exchange of these digital currencies has also been a target of criminal activities. Cryptocurrency crimes have become a challenge for law enforcement to analyze and to prove with evidence. In this paper, our focus is on the Bitcoin cryptocurrency and the possible artifacts that can be obtained from different types of digital wallets, namely software and browser-based applications. The process memory and physical hard disk are examined with the aim of identifying and recovering potential digital evidence. The data acquisition stage is divided into three states: the initial creation of the wallet, transactions consisting of transferring and receiving coins, and the final state after the wallet has been deleted. Findings from this study suggest that process memory for both software and browser-based wallets is a valuable source of evidence, and many of the artifacts found in process memory are also available from the application and wallet files on the client computer storage.
Keywords: cryptocurrency, bitcoin, payment methods, blockchain, appropriation, online retailers, TOE framework, disappropriation, non-appropriation, financial protection, money laundering, digital wallet, digital forensics
Procedia PDF Downloads 42
6703 Writing a Parametric Design Algorithm Based on Recreation and Structural Analysis of Patkane Model: The Case Study of Oshtorjan Mosque
Authors: Behnoush Moghiminia, Jesus Anaya Diaz
Abstract:
The current study attempts to present the relationship between structural development and Patkaneh, one of the Iranian geometric patterns, and parametric algorithms by introducing two practical methods. While having a structural function, Patkaneh is also used as an ornamental element. It can be helpful in the scientific and practical review of Patkaneh. The current study aims to use Patkaneh as a parametric form generator based on an algorithm. The current paper attempts to express how a more complete algorithm of this covering can be obtained based on the parametric study and analysis of a sample of Patkaneh, and also investigates the relationship between the development of the geometrical pattern of Patkaneh as a structural-decorative element of Iranian architecture and digital design. In this regard, to achieve the research purposes, the researchers investigated the oldest type of Patkaneh in the architectural history of Iran, the northern entrance Patkaneh of the Oshtorjan Jame' Mosque. An accurate investigation of the historical background was carried out to answer the research questions. Then, by investigating the structural behavior of Patkaneh, its decorative or structural-decorative role was examined to eliminate the ambiguity. Then, the geometrical structure of Patkaneh was analyzed by introducing two practical methods. The first method is based on the constituent units of Patkaneh (square and diamond) and investigating the interactive relationships between them in 2D and 3D. This method is appropriate for cases where there are rational and regular geometrical relationships. The second method is based on the separation of the floors and the investigation of their interrelation. It is practical when the constituent units are not geometrically regular and are highly diverse. Finally, the parametric form algorithm of these methods was codified.
Keywords: geometric properties, parametric design, Patkaneh, structural analysis
Procedia PDF Downloads 151
6702 Penguins Search Optimization Algorithm for Chaotic Synchronization System
Authors: Sofiane Bououden, Ilyes Boulkaibet
Abstract:
In terms of the security of the information signal, the meta-heuristic Penguins Search Optimization Algorithm (PeSOA) is applied to synchronize chaotic encryption communications in the case of sensitive dependence on initial conditions in the chaotic generator oscillator. The objective of this paper is the use of the PeSOA algorithm to explore the search space with random and iterative processes for the synchronization of symmetric keys in both transmission and reception. Simulation results show the effectiveness of the PeSOA algorithm in generating the symmetric keys of the encryption process and in synchronization.
Keywords: meta-heuristic, PeSOA, chaotic systems, encryption, synchronization optimization
Procedia PDF Downloads 195
6701 A Genetic Algorithm Based Permutation and Non-Permutation Scheduling Heuristics for Finite Capacity Material Requirement Planning Problem
Authors: Watchara Songserm, Teeradej Wuttipornpun
Abstract:
This paper presents genetic algorithm based permutation and non-permutation scheduling heuristics (GAPNP) to solve a multi-stage finite capacity material requirement planning (FCMRP) problem in an automotive assembly flow shop with unrelated parallel machines. In the algorithm, the sequences of orders are iteratively improved by the GA characteristics, whereas the required operations are scheduled based on the presented permutation and non-permutation heuristics. Finally, linear programming is applied to minimize the total cost. The presented GAPNP algorithm is evaluated using real datasets from automotive companies. The required parameters for GAPNP are carefully tuned to obtain a common parameter setting for all case studies. The results show that GAPNP significantly outperforms the benchmark algorithm by about 30% on average.
Keywords: capacitated MRP, genetic algorithm, linear programming, automotive industries, flow shop, application in industry
Procedia PDF Downloads 490
6700 CT Medical Images Denoising Based on New Wavelet Thresholding Compared with Curvelet and Contourlet
Authors: Amir Moslemi, Amir Movafeghi, Shahab Moradi
Abstract:
One of the most important challenges in medical imaging is noise. Image denoising refers to the improvement of a digital medical image that has been contaminated by Additive White Gaussian Noise (AWGN). A digital medical image or video can be affected by different types of noise: impulse noise, Poisson noise and AWGN. Computed tomography (CT) images suffer from low quality due to noise. The quality of CT images depends directly on the absorbed dose to patients, in such a way that an increase in absorbed radiation, and consequently in the absorbed dose to patients (ADP), enhances CT image quality. Therefore, noise reduction techniques that enhance image quality without exposing patients to excess radiation are one of the challenging problems in CT image processing. In this work, noise reduction in CT images was performed using two directional two-dimensional (2D) transforms, i.e., curvelet and contourlet, and the discrete wavelet transform (DWT) thresholding methods BayesShrink and AdaptShrink, which were compared to each other. We also propose a new threshold in the wavelet domain for not only noise reduction but also edge retention; consequently, the proposed method retains the significant modified coefficients, resulting in good visual quality. Data evaluations were accomplished by using two criteria, namely the peak signal-to-noise ratio (PSNR) and structural similarity (SSIM).
Keywords: computed tomography (CT), noise reduction, curvelet, contourlet, peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absorbed dose to patient (ADP)
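A compact sketch of BayesShrink-style wavelet thresholding and the PSNR evaluation mentioned above, using PyWavelets on a synthetic phantom rather than real CT data; the wavelet, decomposition level and noise level are illustrative assumptions, and the proposed new threshold, curvelet and contourlet variants are not reproduced.

```python
import numpy as np
import pywt

def bayes_shrink(img, wavelet="db8", level=3):
    """BayesShrink-style soft thresholding of the 2-D DWT detail subbands."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # noise std estimated from the finest diagonal subband (robust MAD estimator)
    sigma_n = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    out = [coeffs[0]]
    for details in coeffs[1:]:
        new = []
        for band in details:
            sigma_x = np.sqrt(max(band.var() - sigma_n ** 2, 0.0))
            t = sigma_n ** 2 / sigma_x if sigma_x > 0 else np.abs(band).max()
            new.append(pywt.threshold(band, t, mode="soft"))
        out.append(tuple(new))
    return pywt.waverec2(out, wavelet)

def psnr(ref, test, peak=255.0):
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

# synthetic phantom standing in for a CT slice, corrupted with AWGN
rng = np.random.default_rng(4)
clean = np.zeros((128, 128)); clean[32:96, 32:96] = 200.0
noisy = clean + rng.normal(0, 20.0, clean.shape)
den = bayes_shrink(noisy)[:128, :128]        # waverec2 may pad by one row/column
print(f"noisy PSNR {psnr(clean, noisy):.1f} dB -> denoised {psnr(clean, den):.1f} dB")
```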
Procedia PDF Downloads 441
6699 Hyperspectral Image Classification Using Tree Search Algorithm
Authors: Shreya Pare, Parvin Akhter
Abstract:
Remote sensing image classification becomes a very challenging task owing to the high dimensionality of hyperspectral images. Pixel-wise classification methods fail to take into account the spatial structure information of an image. Therefore, to improve the performance of classification, spatial information can be integrated into the classification process. In this paper, a multilevel thresholding algorithm based on a modified fuzzy entropy (MFE) function is used to perform the segmentation of hyperspectral images. The fuzzy parameters of the MFE function have been optimized by using a new meta-heuristic based on the tree search algorithm. The segmented image is classified by a large distribution machine (LDM) classifier. Experimental results are shown on a hyperspectral image dataset. The experimental outputs indicate that the proposed technique (MFE-TSA-LDM) achieves much higher classification accuracy for hyperspectral images when compared to state-of-the-art classification techniques. The proposed algorithm provides accurate segmentation and classification maps, thus becoming more suitable for the classification of images with large spatial structures.
Keywords: classification, hyperspectral images, large distribution margin, modified fuzzy entropy function, multilevel thresholding, tree search algorithm
Procedia PDF Downloads 177
6698 Digital Platforms: Creating Value through Network Effects under Pandemic Conditions
Authors: S. Łęgowik-Świącik
Abstract:
This article is a contribution to research into the determinants of value creation via digital platforms under variable operating conditions. The dynamics of the market environment caused by the COVID-19 pandemic have made enterprises built on digital platforms financially successful. While many classic companies are struggling with the uncertainty of conducting a business and difficulties in the process of value creation, digital platforms create value by modifying the existing business model to meet the changing needs of customers. Therefore, the objective of this publication is to understand and explain the relationship between value creation and the conversion of the business model built on digital platforms under pandemic conditions. The considerations relating to the conceptual framework and the research objective allowed for adopting the hypothesis that the processes of value creation are evolving, and that the measurement of these processes allows for the protection of the value created and enables its growth in changing circumstances. Research methods such as critical literature analysis and case study were applied to accomplish the objective pursued and verify the hypothesis formulated. The empirical research was carried out based on data from enterprises listed on the Nasdaq Stock Exchange: Amazon, Alibaba, and Facebook. The research period was the years 2018-2021. The surveyed enterprises were chosen based on targeted selection. The problem discussed is important and current, since the lack of in-depth theoretical research results in few attempts to identify the determinants of value creation via digital platforms. The above arguments led to an attempt at theoretical analysis and empirical research to fill the perceived gap by deepening the understanding of the process of value creation through network effects via digital platforms under pandemic conditions.
Keywords: business model, digital platforms, enterprise management, pandemic conditions, value creation process
Procedia PDF Downloads 128
6697 Consumer Load Profile Determination with Entropy-Based K-Means Algorithm
Authors: Ioannis P. Panapakidis, Marios N. Moschakis
Abstract:
With the continuous increase of smart meter installations across the globe, the need for processing the load data is evident. Clustering-based load profiling is built upon the utilization of unsupervised machine learning tools for the purpose of formulating the typical load curves or load profiles. The most commonly used algorithm in the load profiling literature is K-means. While the algorithm has been successfully tested in a variety of applications, its drawback is its strong dependence on the initialization phase. This paper proposes a novel modified form of K-means that addresses the aforementioned problem. Simulation results indicate the superiority of the proposed algorithm compared to K-means.
Keywords: clustering, load profiling, load modeling, machine learning, energy efficiency and quality
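A small sketch of clustering-based load profiling with scikit-learn's standard K-means, applied to synthetic daily load curves; the curves are illustrative assumptions, and the paper's entropy-based initialization is not reproduced here (k-means++ is used instead).

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
# synthetic daily load curves (24 hourly values) for 300 consumers, three rough patterns
hours = np.arange(24)
base = np.stack([
    1.0 + 0.8 * np.exp(-(hours - 8) ** 2 / 8.0),    # morning peak
    1.0 + 0.8 * np.exp(-(hours - 19) ** 2 / 8.0),   # evening peak
    1.5 + 0.1 * np.sin(hours / 24 * 2 * np.pi),     # flat industrial-style profile
])
X = base[rng.integers(0, 3, 300)] + rng.normal(0, 0.05, (300, 24))
X = X / X.max(axis=1, keepdims=True)                # normalise each curve to its peak

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
profiles = km.cluster_centers_                      # the typical load profiles
print(np.bincount(km.labels_), profiles.shape)
```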
Procedia PDF Downloads 164
6696 Digital Learning Repositories for Vocational Teaching and Knowledge Sharing
Authors: Prachyanun Nilsook, Panita Wannapiroon
Abstract:
The purpose of this research is to study a Digital Learning Repository System (DLRS) for vocational teachers and teaching in Thailand. The innobpcd.net is a DLRS utilized by the Office of the Vocational Education Commission and operationalized by the Bureau of Personnel Competency Development for vocational education teachers. The aim of the system is to support and enhance the process of vocational teaching and to improve staff development by providing teachers with a variety of network connections and information. The system provides centralized hosting and access to content, the ability to share digital objects or files, and the ability to set permissions and controls for access to content that can be used by vocational education teachers for their teaching and for their own development. The elements of the DLRS include a digital learning system, a media library, a knowledge-based system and a mobile application. The system aims to link vocational teachers to the most effective emerging technologies available for learning, so they are better resourced to support their vocational students. The initial results from this evaluation indicate that a range of services provided by the system is being used by vocational teachers, and this paper indicates which facilities have the greatest usage and impact on vocational teaching in Thailand.
Keywords: digital learning repositories, vocational education, knowledge sharing, learning objects
Procedia PDF Downloads 466
6695 The Impact of Artificial Intelligence on Digital Construction
Authors: Omil Nady Mahrous Maximous
Abstract:
The construction industry is currently experiencing a shift towards digitisation. This transformation is driven by adopting technologies like Building Information Modelling (BIM), drones, and augmented reality (AR). These advancements are revolutionizing the process of designing, constructing, and operating projects. BIM, for instance, is a new way of communicating and exploiting technology such as software and machinery. It enables the creation of a replica or virtual model of buildings or infrastructure projects. It facilitates simulating construction procedures, identifying issues beforehand, and optimizing designs accordingly. Drones are another tool in this revolution, as they can be utilized for site surveys, inspections, and even deliveries. Moreover, AR technology provides real-time information to workers involved in the project. Implementing these technologies in the construction industry has brought about improvements in efficiency, safety measures, and sustainable practices. BIM helps minimize rework and waste materials, while drones contribute to safety by reducing workers' exposure to areas. Additionally, AR plays a role in worker safety by delivering instructions and guidance during operations. Although the digital transformation within the construction industry is still in its early stages, it holds the potential to reshape project delivery methods entirely. By embracing these technologies, construction companies can boost their profitability while simultaneously reducing their environmental impact and ensuring safer practices.
Keywords: architectural education, construction industry, digital learning environments, immersive learning, BIM, digital construction, construction technologies, digital transformation, artificial intelligence, collaboration, digital architecture, digital design theory, material selection, space construction
Procedia PDF Downloads 58