Search results for: data security assurance
25509 Big Data Applications for the Transport Sector
Authors: Antonella Falanga, Armando Cartenì
Abstract:
Today, an unprecedented amount of data coming from several sources, including mobile devices, sensors, tracking systems, and online platforms, characterizes our lives. The term "big data" refers not only to the quantity of data but also to the variety and speed of data generation. These data hold valuable insights that, when extracted and analyzed, facilitate informed decision-making. The 4Vs of big data - velocity, volume, variety, and value - highlight essential aspects, showcasing the rapid generation, vast quantities, diverse sources, and potential value addition of these kinds of data. This surge of information has revolutionized many sectors: business, for improving decision-making processes; healthcare, for clinical record analysis and medical research; education, for enhancing teaching methodologies; agriculture, for optimizing crop management; finance, for risk assessment and fraud detection; media and entertainment, for personalized content recommendations; emergency management, for real-time response during crises; and mobility, for urban planning and the design/management of public and private transport services. Big data's pervasive impact enhances societal aspects, elevating the quality of life, service efficiency, and problem-solving capacities. However, during this transformative era, new challenges arise, including data quality, privacy, data security, cybersecurity, interoperability, the need for advanced infrastructures, and staff training. Within the transportation sector (the one investigated in this research), applications span the planning, design, and management of systems and mobility services. Among the most common big data applications in the transport sector are, for example, real-time traffic monitoring, bus/freight vehicle route optimization, vehicle maintenance, road safety, and autonomous and connected vehicle applications. Benefits include reductions in travel times, road accidents, and pollutant emissions. Within these issues, proper transport demand estimation is crucial for sustainable transportation planning. Evaluating the impact of sustainable mobility policies starts with a quantitative analysis of travel demand, and achieving transportation decarbonization goals hinges on precise estimates of demand for individual transport modes. Emerging technologies, offering substantial big data at lower costs than traditional methods, play a pivotal role in this context. Starting from these considerations, this study explores the usefulness of big data for transport demand estimation. The research leverages (big) data collected during the COVID-19 pandemic to estimate the evolution of mobility demand in Italy. Estimation results reveal, for the post-COVID-19 era, more than 96 million national daily trips, about 2.6 trips per capita, with a mobile population of more than 37.6 million Italian travelers per day. Overall, this research allows us to conclude that big data enhance rational decision-making for mobility demand estimation, which is imperative for adeptly planning and allocating investments in transportation infrastructures and services.
Keywords: big data, cloud computing, decision-making, mobility demand, transportation
Procedia PDF Downloads 62
25508 The Impact of System and Data Quality on Organizational Success in the Kingdom of Bahrain
Authors: Amal M. Alrayes
Abstract:
Data and system quality play a central role in organizational success, and the quality of any existing information system has a major influence on the effectiveness of overall system performance. Given the importance of system and data quality to an organization, it is relevant to highlight their importance for organizational performance in the Kingdom of Bahrain. This research aims to discover whether system quality and data quality are related, and to study the impact of system and data quality on organizational success. A theoretical model based on previous research is used to show the relationship between data quality, system quality, and organizational impact. We hypothesize, first, that system quality is positively associated with organizational impact; second, that system quality is positively associated with data quality; and finally, that data quality is positively associated with organizational impact. A questionnaire was conducted among public and private organizations in the Kingdom of Bahrain. The results show that there is a strong association between data and system quality, which affects organizational success.
Keywords: data quality, performance, system quality, Kingdom of Bahrain
Procedia PDF Downloads 493
25507 New Practical and Non-Malleable ElGamal Encryption for E-Voting Protocol
Authors: Karima Djebaili, Lamine Melkemi
Abstract:
ElGamal encryption is a fundamental public-key encryption scheme in cryptography, based on the difficulty of the discrete logarithm problem and the Diffie-Hellman problem. Assuming the Diffie-Hellman problem is computationally infeasible, ElGamal is secure under a chosen-plaintext attack, where security indicates it is difficult for the attacker, given the ciphertext, to recover the plaintext. However, although it is secure against chosen-plaintext attack, ElGamal is malleable, i.e., not secure against an adaptive chosen-ciphertext attack, under which the attacker can recover the plaintext. We present an extension of ElGamal encryption that achieves non-malleability against adaptive chosen-ciphertext attack using concatenation and a cryptographic hash function; our proof uses the notion of plaintext awareness. The proposed algorithm can be used in cryptographic voting protocols given its security level. Our protocol protects the confidentiality of voters because each voter encrypts their choice before casting their vote; it offers public verifiability using a signing algorithm; the final result is correctly computed using the homomorphic property; and it works even in the presence of an adversary thanks to the property of non-malleability. Moreover, the protocol prevents parties from colluding to fix the vote results.
Keywords: ElGamal encryption, non-malleability, plaintext awareness, e-voting
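The concatenation-and-hash hardening described in this abstract can be illustrated with a minimal Python sketch. This is a toy reading of the idea, not the authors' exact construction: the demo parameters (p = 2579, g = 2) are textbook-sized and insecure, and the SHA-256 tag simply stands in for the paper's hash-based binding of ciphertext to plaintext.

```python
# Toy sketch of ElGamal with a hash tag that binds ciphertext to plaintext.
# Illustrative only: tiny, insecure demo parameters; not the paper's scheme.
import hashlib
import secrets

p, g = 2579, 2  # textbook-sized demo group


def keygen():
    x = secrets.randbelow(p - 2) + 1                  # private key
    return x, pow(g, x, p)                            # (private, public)


def encrypt(y, m):
    k = secrets.randbelow(p - 2) + 1                  # ephemeral key
    c1, c2 = pow(g, k, p), (m * pow(y, k, p)) % p
    tag = hashlib.sha256(f"{c1}|{c2}|{m}".encode()).hexdigest()
    return c1, c2, tag


def decrypt(x, c1, c2, tag):
    s = pow(c1, x, p)
    m = (c2 * pow(s, p - 2, p)) % p                   # s^(p-2) = s^-1 mod p
    if hashlib.sha256(f"{c1}|{c2}|{m}".encode()).hexdigest() != tag:
        raise ValueError("ciphertext was mauled")     # malleation detected
    return m


x, y = keygen()
assert decrypt(x, *encrypt(y, 1234)) == 1234
```

Multiplying c2 by a constant, the classic malleation of plain ElGamal, would still decrypt to some value but would fail the tag check.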
Procedia PDF Downloads 451
25506 A Real-World Roadmap and Exploration of Quantum Computers' Capacity to Trivialise Internet Security
Authors: James Andrew Fitzjohn
Abstract:
This paper intends to discuss and explore the practical aspects of cracking encrypted messages with quantum computers. The theory of this process has been shown and well described both in academic papers and headline-grabbing news articles, but with all theory and hyperbole, we must be careful to assess the practicalities of these claims. Therefore, we will use real-world devices and proof-of-concept code to prove or disprove the notion that quantum computers will render the encryption technologies used by many websites unfit for purpose. It is time to discuss and implement the practical aspects of the process, as many advances in quantum computing hardware/software have recently been made. This paper will set expectations regarding the useful lifespan of RSA and cipher lengths and propose alternative encryption technologies. We will set out comprehensive roadmaps describing when and how encryption schemes can be used, including when they can no longer be trusted. Cost will also be factored into our investigation; for example, it would make little financial sense to spend millions of dollars on a quantum computer to factor a private key in seconds when a commodity GPU could perform the same task in hours. It is hoped that the real-world results depicted in this paper will help influence the owners of websites, who can take appropriate actions to improve the security of their provisions.
Keywords: quantum computing, encryption, RSA, roadmap, real world
Procedia PDF Downloads 131
25505 To Ensure Maximum Voter Privacy in E-Voting Using Blockchain, Convolutional Neural Network, and Quantum Key Distribution
Authors: Bhaumik Tyagi, Mandeep Kaur, Kanika Singla
Abstract:
The advancement of blockchain has enabled scholars to remodel e-voting systems for future generations. Server-side attacks such as SQL injection and DoS attacks are among the most common attacks nowadays, where malicious code is injected into the system through user input fields by illicit users, leading to data leakage in the worst scenarios. Besides, quantum attacks that manipulate transactional data are also a concern. To deal with all the above-mentioned attacks, this research integrates blockchain, a convolutional neural network (CNN), and quantum key distribution. The utilization of blockchain technology in e-voting applications is not a novel concept, but privacy and security issues remain in both public and private blockchains. To address this, a hybrid blockchain is used in this research. The research proposes cryptographic signatures and blockchain algorithms to validate the origin and integrity of the votes. The convolutional neural network (CNN), a normalized version of the multilayer perceptron, is also applied in the system to analyze visual descriptions upon registration, in order to enhance the privacy of voters and the e-voting system. Quantum key distribution is implemented to secure the blockchain-based e-voting system from quantum attacks using quantum algorithms. An e-voting blockchain DApp is implemented, providing a proposed solution for voter privacy in e-voting using blockchain, CNN, and quantum key distribution.
Keywords: hybrid blockchain, secure e-voting system, convolutional neural networks, quantum key distribution, one-time pad
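A minimal single-node sketch of the hash-chained ledger idea follows, assuming votes are XOR-encrypted with one-time pads (standing in for QKD-derived keys); the paper's hybrid blockchain, signature scheme, and CNN components are not reproduced here.

```python
# Toy hash-chained vote ledger: tampering with any block breaks the chain.
import hashlib
import secrets


def encrypt_vote(vote: bytes, pad: bytes) -> bytes:
    assert len(pad) == len(vote)              # one-time pad requirement
    return bytes(v ^ p for v, p in zip(vote, pad))


def add_block(chain, ciphertext: bytes):
    prev = chain[-1]["hash"] if chain else "0" * 64
    h = hashlib.sha256(prev.encode() + ciphertext).hexdigest()
    chain.append({"prev": prev, "vote": ciphertext, "hash": h})


def verify(chain) -> bool:
    prev = "0" * 64
    for block in chain:
        h = hashlib.sha256(prev.encode() + block["vote"]).hexdigest()
        if block["prev"] != prev or block["hash"] != h:
            return False
        prev = h
    return True


chain = []
for ballot in (b"A", b"B", b"A"):
    pad = secrets.token_bytes(len(ballot))    # would come from QKD in the paper
    add_block(chain, encrypt_vote(ballot, pad))
assert verify(chain)
chain[1]["vote"] = b"X"                       # tamper with one recorded vote
assert not verify(chain)                      # the chain no longer verifies
```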
Procedia PDF Downloads 94
25504 An Approach of Computer Modalities for Exploration of Hieroglyphics Substantial in an Investigation
Authors: Aditi Chauhan, Neethu S. Mohan
Abstract:
In the modern era, advances in and the digitalization of technology have transformed crime scene investigation, and rapidly improving investigative techniques have changed the means of identifying suspects. Identification of the person is one of the most significant aspects, and personal authentication is the key to security and reliability in society. Since the early 1990s, people have relied on comparing handwriting through its class and individual characteristics, but in today's 21st century we need more reliable means to identify individuals through handwriting. Approaches employing computer modalities have lately proved auspicious enough in the exploration of handwriting evidence material to an investigation. Various software systems, such as FISH, WRITEON, PIKASO, and CEDAR-FOX, identify and verify an associated quantitative measure of the similarity between two samples. Research to date has been confined to identifying the authorship of the concerned samples, but the prospects associated with computational modalities might help to identify disguised, forged, or otherwise altered or modified writing. Considering the applications of such models, similar work is sure to attract a plethora of research in the immediate future. It also has a promising role in national security: documents exchanged among terrorists can be brought under the radar of surveillance, revealing their source.
Keywords: documents, identity, computational system, suspect
Procedia PDF Downloads 176
25503 Cross-border Data Transfers to and from South Africa
Authors: Amy Gooden, Meshandren Naidoo
Abstract:
Genetic research and transfers of big data are not confined to a particular jurisdiction, but there is a lack of clarity regarding the legal requirements for importing and exporting such data. Using direct-to-consumer genetic testing (DTC-GT) as an example, this research assesses the status of data sharing into and out of South Africa (SA). While SA laws cover the sending of genetic data out of SA, prohibiting such transfer unless a legal ground exists, the position where genetic data comes into the country depends on the laws of the country from where it is sent – making the legal position less clear.
Keywords: cross-border, data, genetic testing, law, regulation, research, sharing, South Africa
Procedia PDF Downloads 125
25502 Standard Resource Parameter Based Trust Model in Cloud Computing
Authors: Shyamlal Kumawat
Abstract:
Cloud computing is shifting how IT capital is utilized. It dynamically delivers convenient, on-demand access to shared pools of software resources, platforms, and hardware as a service through the internet - a model made possible by sophisticated automation, provisioning, and virtualization technologies. Users want the ability to access these services, including infrastructure resources, how and when they choose. To accommodate this shift in the consumption model, the technology has to deal with the security, compatibility, and trust issues associated with delivering that convenience to application business owners, developers, and users. Against these issues, trust has attracted extensive attention in cloud computing as a means to enhance security. This paper proposes a trusted computing technique, the Standard Resource Parameter Based Trust Model in Cloud Computing, for selecting appropriate cloud service providers. The direct trust of cloud entities is computed on the basis of past interaction evidence and sustained by present performance. Various SLA parameters between consumer and provider are considered in the trust computation and compliance process. Simulations are performed using the CloudSim framework, and experimental results show that the proposed model is effective and extensible.
Keywords: cloud, IaaS, SaaS, PaaS
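As a rough illustration of direct-trust computation from past interaction evidence, the sketch below assumes exponentially decayed SLA-compliance scores in [0, 1]; the paper's actual formula and SLA parameter set are not specified here.

```python
# Minimal direct-trust sketch: recent SLA compliance weighs more than old.
import math
import time


def direct_trust(evidence, now=None, half_life=30 * 86400):
    """evidence: list of (timestamp, sla_compliance_score) pairs."""
    now = now or time.time()
    num = den = 0.0
    for ts, score in evidence:
        w = math.exp(-math.log(2) * (now - ts) / half_life)  # age decay
        num += w * score
        den += w
    return num / den if den else 0.0          # no history -> no trust


now = time.time()
history = [(now - 90 * 86400, 0.6), (now - 10 * 86400, 0.95)]
print(f"trust = {direct_trust(history, now):.3f}")  # recent evidence dominates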
Procedia PDF Downloads 330
25501 Data Integration with Geographic Information System Tools for Rural Environmental Monitoring
Authors: Tamas Jancso, Andrea Podor, Eva Nagyne Hajnal, Peter Udvardy, Gabor Nagy, Attila Varga, Meng Qingyan
Abstract:
The paper deals with the conditions and circumstances of integrating remotely sensed data for rural environmental monitoring purposes. The main task is to make decisions during the integration process when the data sources have different resolutions, locations, spectral channels, and dimensions. In order to have exact knowledge about the integration and data fusion possibilities, it is necessary to know the properties (metadata) that characterize the data. The paper explains the joining of these data sources using their attribute data through a sample project. The resulting product will be used for rural environmental analysis.
Keywords: remote sensing, GIS, metadata, integration, environmental analysis
Procedia PDF Downloads 120
25500 Design of an Ensemble Learning Behavior Anomaly Detection Framework
Authors: Abdoulaye Diop, Nahid Emad, Thierry Winter, Mohamed Hilia
Abstract:
Data asset protection is a crucial issue in the cybersecurity field. Companies use logical access control tools to vault their information assets and protect them against external threats, but they lack solutions to counter insider threats. Nowadays, insider threats are the most significant concern of security analysts: mainly individuals with legitimate access to companies' information systems who use their rights with malicious intent. In several fields, behavior anomaly detection is the method used by cyber specialists to counter the threat of malicious user activity effectively. In this paper, we present a step toward the construction of a user and entity behavior analysis framework by proposing a behavior anomaly detection model. This model combines machine learning classification techniques and graph-based methods, relying on linear algebra and parallel computing techniques. We show the utility of an ensemble learning approach in this context and present test results of several detection methods on a representative access control dataset. Some of the explored classifiers give results of up to 99% accuracy.
Keywords: cybersecurity, data protection, access control, insider threat, user behavior analysis, ensemble learning, high performance computing
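The ensemble idea can be sketched with scikit-learn's soft-voting classifier over synthetic stand-in features; the paper's actual feature engineering, graph-based methods, and access-control dataset are not reproduced.

```python
# Compact ensemble-learning sketch on synthetic "user behavior" features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for access-log features (e.g., per-user access counts).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
        ("knn", KNeighborsClassifier()),
    ],
    voting="soft",                            # average class probabilities
)
ensemble.fit(X_tr, y_tr)
print(f"accuracy: {ensemble.score(X_te, y_te):.3f}")
```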
Procedia PDF Downloads 128
25499 Enhancing the Safety Climate and Reducing Violence against Staff in Closed Hospital Wards
Authors: Valerie Isaak
Abstract:
This study examines the effectiveness of an intervention program aimed at enhancing a unit-level safety climate as a way to minimize the risk of employees being injured by patient violence. The intervention program, conducted in maximum security units in one of the psychiatric hospitals in Israel, included a three-day workshop. Safety climate was examined before and after the implementation of the intervention. We also collected data regarding incidents involving patient violence. Six months after the intervention, a significant improvement in employees' perceptions regarding management's commitment to safety was found, as well as a marginally significant improvement in communication concerning safety issues. Our research shows that an intervention program aimed at enhancing the safety climate is associated with a decrease in the number of aggressive incidents. We conclude that such an intervention program is likely to restore the sense of safety and reduce the scope of violence.
Keywords: violence, intervention, safety climate, performance, public sector
Procedia PDF Downloads 353
25498 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic
Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi
Abstract:
In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it has been postulated that more than one billion bases would be produced per year by 2020. The growth rate of produced data is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data: storing the data, searching for information, and finding hidden information. It is necessary to develop analysis platforms for genomics big data. Newly developed cloud computing enables us to deal with big data more efficiently. Hadoop is one such distributed computing framework and forms the core of Big Data as a Service (BDaaS). Although many services have adopted this technology, e.g., Amazon, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g., sequencing data. Our algorithm consists of two parts: first, BDaaS is applied for handling the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We will discuss our algorithm and its feasibility.
Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing
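A single-machine stand-in for the MapReduce-plus-fuzzy-logic step might look like the following: the map step emits k-mers weighted by an assumed fuzzy membership of each read's quality score, and the reduce step sums the weights. The paper's Hadoop/BDaaS pipeline is only mimicked here.

```python
# Toy MapReduce-with-fuzzy-weighting: weighted k-mer counting over reads.
from collections import defaultdict


def quality_membership(q, low=20.0, high=40.0):
    """Fuzzy degree to which a read counts as 'high quality' (assumed ramp)."""
    return max(0.0, min(1.0, (q - low) / (high - low)))


def map_step(read, quality, k=4):
    w = quality_membership(quality)
    for i in range(len(read) - k + 1):
        yield read[i:i + k], w                 # emit (k-mer, fuzzy weight)


def reduce_step(pairs):
    counts = defaultdict(float)
    for kmer, w in pairs:
        counts[kmer] += w                      # sum weighted occurrences
    return counts


reads = [("ACGTACGT", 38.0), ("ACGTTTTT", 22.0)]  # (sequence, quality)
pairs = (p for read, q in reads for p in map_step(read, q))
print(dict(reduce_step(pairs)))
```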
Procedia PDF Downloads 299
25497 Forthcoming Big Data on Smart Buildings and Cities: An Experimental Study on Correlations among Urban Data
Authors: Yu-Mi Song, Sung-Ah Kim, Dongyoun Shin
Abstract:
Cities are complex systems of diverse and intertangled activities. These activities and their complex interrelationships create diverse urban phenomena, which have considerable influence on the lives of citizens. This research aimed to develop a method to reveal the causes and effects among diverse urban elements in order to enable a better understanding of urban activities and, therefrom, to make better urban planning strategies. Specifically, this study was conducted to solve a data-recommendation problem found on a Korean public data homepage. First, a correlation analysis was conducted to find the correlations among random urban datasets. Then, based on the results of that correlation analysis, a weighted data network of the urban datasets was provided to users. It is expected that the weights of urban data thereby obtained will provide insights into cities and show how diverse urban activities influence each other and induce feedback.
Keywords: big data, machine learning, ontology model, urban data model
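A minimal sketch of deriving such a weighted network from pairwise correlations follows, with hypothetical column names standing in for the Korean public datasets used in the study.

```python
# Correlation analysis -> weighted network of urban datasets (toy data).
import pandas as pd

df = pd.DataFrame({
    "bus_ridership": [120, 135, 150, 160, 180, 210],
    "retail_sales":  [300, 320, 360, 370, 410, 450],
    "air_pollution": [80, 78, 75, 70, 66, 60],
})

corr = df.corr()                               # pairwise Pearson correlations
edges = [
    (a, b, round(float(corr.loc[a, b]), 2))
    for i, a in enumerate(corr.columns)
    for b in corr.columns[i + 1:]
    if abs(corr.loc[a, b]) >= 0.5              # keep only strong links
]
for a, b, w in edges:
    print(f"{a} -- {b}: weight {w}")           # edges of the weighted network
```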
Procedia PDF Downloads 418
25496 Data-driven Decision-Making in Digital Entrepreneurship
Authors: Abeba Nigussie Turi, Xiangming Samuel Li
Abstract:
Data-driven business models are more typical for established businesses than for early-stage startups that strive to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. We develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers such as poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in its form, encompassing startup data analytics enablers and metrics that align with startups' business models, ranging from customer-centric product development to servitization, which is the future of modern digital entrepreneurship.
Keywords: startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship
Procedia PDF Downloads 329
25495 Indonesian Food Safety Policy for Local Commodity against ASEAN Economic Community: An Uneven Battle in the Global War
Authors: Wahyu Riawanti
Abstract:
Food safety is one of the prominent issues of the globalization era. The more attention is paid to international food and agriculture trade, the more consumers will demand higher standards of food safety. For this reason, the issue is not only a matter of added value but also a main requirement in import-export activity, including for agricultural products. Unfortunately, Indonesia and other developing countries have found it difficult to fulfill some of the technical requirements and have ended up with lower export activity. In this case, the technical requirements of food safety become an obstacle rather than a challenge, and for local farmers, food safety is more or less a threat. The study aims to reveal how the Indonesian government has dealt with certification regulation to address the competitiveness of Indonesian products. Local government has implemented regulation of food certification. The study used the case of the certification process for the Salak Pondoh fruit (Salacca zalacca) in Sleman District, Yogyakarta. Triangulation was used to analyze the effectiveness of the certification program. The quantitative data series taken from 7 farmer groups during the certification processes were used as the main research data, and supporting qualitative data were obtained from in-depth interviews with members of the farmer groups. The pre-research result shows that the impact varied across groups. Conclusively, the certification regulation has partly failed to make a significant change in local farmers' competitiveness: even though profit increased, the program's large budget did not significantly increase the economic incentives for local farmers.
Keywords: economic incentive, food security, government regulation, international trade, local commodity, Salacca zalacca
Procedia PDF Downloads 276
25494 Real Time Detection of Application Layer DDoS Attack Using Log Based Collaborative Intrusion Detection System
Authors: Farheen Tabassum, Shoab Ahmed Khan
Abstract:
The brutality of attacks on networks and critical infrastructures has been on the climb over recent years and appears set to continue. The distributed denial of service (DDoS) attack is the most prevalent and easiest attack on the availability of a service, due to the cheap availability of large botnets and the general lack of protection against these attacks. An application layer DDoS attack is a DDoS attack targeted at a web server, application server, or database server. These attacks are much more sophisticated and challenging, as they get around most conventional network security devices: attack traffic often impersonates normal traffic and cannot be recognized by network layer anomalies. Conventional single-host security systems are becoming gradually less effective in the face of such complicated and synchronized multi-front attacks. To protect against such attacks and intrusions, cooperation among all network devices is essential. To this end, a collaborative intrusion detection system (CIDS) is proposed, in which multiple network devices share valuable information to identify attacks, since a single device might not be capable of sensing malevolent actions on its own. This helps us make decisions after analyzing information collected from different sources. This novel attack detection technique helps to detect seemingly benign packets that target the availability of critical infrastructure, and the proposed solution methodology shall enable incident response teams to detect and react to DDoS attacks at the earliest stage, ensuring that the uptime of the service remains unaffected. Experimental evaluation shows that the proposed collaborative detection approach is much more effective and efficient than previous approaches.
Keywords: Distributed Denial-of-Service (DDoS), Collaborative Intrusion Detection System (CIDS), Slowloris, OSSIM (Open Source Security Information Management tool), OSSEC HIDS
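The core correlation idea - evidence that no single sensor would act on becomes decisive once combined - can be sketched as follows, with assumed suspicion scores and threshold.

```python
# Toy collaborative correlator: combine per-sensor suspicion scores so that
# distributed evidence about one source can cross a blocking threshold.
from collections import defaultdict

THRESHOLD = 0.8


def correlate(reports):
    """reports: {sensor: {source_ip: suspicion score in [0, 1]}}."""
    combined = defaultdict(float)
    for sensor, scores in reports.items():
        for ip, s in scores.items():
            # Combine independent evidence: 1 - prod(1 - s_i).
            combined[ip] = 1.0 - (1.0 - combined[ip]) * (1.0 - s)
    return {ip: round(s, 2) for ip, s in combined.items() if s >= THRESHOLD}


reports = {
    "web_server": {"10.0.0.9": 0.5, "10.0.0.7": 0.1},
    "app_server": {"10.0.0.9": 0.4},
    "db_server":  {"10.0.0.9": 0.5},
}
print("block:", correlate(reports))  # 10.0.0.9 is flagged only collectively
```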
Procedia PDF Downloads 354
25493 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation
Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk
Abstract:
The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition, and security are among the possible fields of utilization. In all these fields, the amount of collected data is increasing quickly, and with that increase, computation speed becomes the critical factor. Data reduction is one solution to this problem, and removing redundancy in rough sets can be achieved with the reduct. Many algorithms for generating reducts have been developed, but most of them are only software implementations and therefore have many limitations: a microprocessor uses a fixed word length and consumes a lot of time fetching and processing instructions and data, so software-based implementations are relatively slow. Hardware systems do not have these limitations and can process data faster than software. A reduct is a subset of the condition attributes that preserves the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes; none of its elements can be removed without affecting the classification power of all condition attributes, and every reduct contains all the attributes from the core. In this paper, a hardware implementation of a two-stage greedy algorithm to find one reduct is presented. The decision table is used as input, and the output of the algorithm is a superreduct, i.e., a reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. The algorithm described above has two disadvantages: i) it generates a superreduct instead of a reduct, and ii) the additional first stage may be unnecessary if the core is empty. For systems focused on fast computation of the reduct, however, the first disadvantage is not the key problem: the core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit for controlling the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC, and the execution times of the reduct calculation in hardware and software were compared. Results show an increase in data processing speed.
Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set
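A software sketch of the two stages as described - the core taken from singleton entries of the discernibility matrix, then greedy enrichment by attribute frequency (one common reading of "most common attributes") - is given below, assuming a consistent decision table.

```python
# Two-stage greedy superreduct sketch, mirroring the paper's stages in software.
from collections import Counter
from itertools import combinations


def discernibility(table, attrs, decision):
    """One entry per object pair with different decisions: the set of
    condition attributes on which the pair differs."""
    return [
        {a for a in attrs if u[a] != v[a]}
        for u, v in combinations(table, 2)
        if u[decision] != v[decision]
    ]


def superreduct(table, attrs, decision):
    entries = discernibility(table, attrs, decision)
    reduct = {next(iter(e)) for e in entries if len(e) == 1}  # stage 1: core
    while any(not (e & reduct) for e in entries):             # stage 2: greedy
        freq = Counter(a for e in entries if not (e & reduct) for a in e)
        reduct.add(freq.most_common(1)[0][0])    # add most frequent attribute
    return reduct


table = [
    {"a": 1, "b": 0, "c": 1, "d": "yes"},
    {"a": 1, "b": 1, "c": 0, "d": "no"},
    {"a": 0, "b": 0, "c": 1, "d": "no"},
]
print(superreduct(table, attrs={"a", "b", "c"}, decision="d"))
```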
Procedia PDF Downloads 219
25492 The Synopsis of the AI-Powered Therapy Web Platform ‘Free AI Therapist’
Authors: Arwa Alnowaiser, Hala Shoukri
Abstract:
The ‘FreeAITherapist’ is an artificial intelligence application that uses the power of AI to offer advice and mental health counseling to its users through its chatbot services. The AI therapist is designed to understand users' issues, concerns, and problems and respond appropriately; it provides empathy and guidance and uses evidence-based therapeutic techniques. With its user-friendly platform, it ensures accessibility for individuals in need, regardless of their geographical location. The website was created in direct response to the growing demand for mental health support, aiming to provide a cost-effective and confidential solution. By promising confidentiality, it addresses user privacy and data security. The ‘FreeAITherapist’ strives to bridge the gap in mental health services, offering a reliable resource for individuals seeking guidance and counseling to improve their overall well-being.
Keywords: artificial intelligence, mental health, AI therapist, website, counseling
Procedia PDF Downloads 44
25491 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion
Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro
Abstract:
In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual effort, and difficulty in meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes all the above-mentioned issues and helps organizations improve efficiency and deliver faster without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while reducing the need for repetitive work and manual effort. Scalable CI/CD for development can be implemented using cloud services such as ECS (Elastic Container Service), AWS Fargate, ECR (to store Docker images with all dependencies), serverless computing (serverless virtual machines), cloud logging (for monitoring errors and logs), security groups (for inside/outside access to the application), Docker containerization (Docker-based images and container techniques), Jenkins (a CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit); such a setup efficiently handles the demands of diverse development environments, accommodates dynamic workloads, and increases efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application based on the branches, testing the application using a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure, as it scales based on need. Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application's scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Creating scalable automation testing using cloud services (ECR, ECS Fargate, Docker, EFS, serverless computing) helps organizations run more than 500 test cases in parallel, aiding in the detection of race conditions and performance issues and reducing execution time. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands and allowing teams to scale resources up or down as needed. It optimizes costs by paying only for the resources actually used and increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.
Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment
Procedia PDF Downloads 43
25490 Digital Cinema Watermarking State of Art and Comparison
Authors: H. Kelkoul, Y. Zaz
Abstract:
Nowadays, the vigorous popularity of video processing techniques has resulted in an explosive growth of illegal use of multimedia data, so watermarking security has received much more attention. The purpose of this paper is to explore some watermarking techniques in order to observe their specificities and select the finest methods to apply in the digital cinema domain against movie piracy, by creating an invisible watermark that includes the date, time, and place where the hacking was done. We have studied three principal watermarking techniques in the frequency domain: spread spectrum, the wavelet transform domain, and the digital cinema watermarking transform domain. In this paper, a detailed technique is presented in which embedding is performed using the direct-sequence spread-spectrum technique in the DWT domain. Experimental results show that the algorithm provides high robustness and good imperceptibility.
Keywords: digital cinema, watermarking, wavelet DWT, spread spectrum, JPEG2000 MPEG4
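A condensed sketch of direct-sequence spread-spectrum embedding in the DWT domain follows, assuming NumPy and PyWavelets; the paper's full scheme also encodes date/time/place and targets robustness under JPEG2000/MPEG-4, which this toy example does not attempt.

```python
# Spread one watermark bit over the DWT detail band with a keyed PN sequence.
import numpy as np
import pywt

rng = np.random.default_rng(seed=42)           # the seed acts as the key
frame = rng.integers(0, 256, size=(64, 64)).astype(float)  # stand-in frame

cA, (cH, cV, cD) = pywt.dwt2(frame, "haar")    # one-level 2-D DWT
pn = rng.choice([-1.0, 1.0], size=cD.shape)    # pseudo-noise (PN) sequence
bit, alpha = 1, 2.0                            # watermark bit and strength

cD_marked = cD + alpha * bit * pn              # direct-sequence embedding
marked = pywt.idwt2((cA, (cH, cV, cD_marked)), "haar")

# Detection: correlate the detail band of the marked frame with the same
# PN sequence (the detector regenerates it from the shared seed/key).
_, (_, _, cD2) = pywt.dwt2(marked, "haar")
corr = float(np.mean(cD2 * pn))
print("bit detected as", 1 if corr > 0 else 0, f"(correlation {corr:.2f})")
```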
Procedia PDF Downloads 251
25489 Growing Acts of Terrorism in Local Conflicts: A Dire Need for International Attention
Authors: Yusuf Abubakar Mamud
Abstract:
The dangerous dimensions of terrorism that local conflicts are assuming in Africa have not attracted serious academic and political attention. This paper discusses conflict in Africa within five identified conflict zones on the continent. The threats from these local conflicts are diverse and complex, and the acts of terrorism in these local conflicts are driven by certain attitudes and behaviours linked to African leadership. The paper examined the current conflict resolution model of the African Union (AU) and noted that it is robust, with the requisite institutions to address the trends in local conflicts. However, it was observed that the AU peace and security framework lacks the structural and technical capabilities to proactively address the drivers of local conflicts in Africa. It was found that the persistence of local conflicts may deny the region the opportunity to achieve the targets envisioned in the Sustainable Development Goals (SDGs). Consequently, the paper calls on the international community to support Africa through the provision of capacity. It urges African leaders themselves to develop the political will to ensure that all issues concerning peace and security on the continent are guided by the provisions of the AU Constitutive Act. The need to strengthen the APRM in light of current trends in local conflicts is also highlighted.
Keywords: conflicts, local conflicts, terrorism, sustainable development
Procedia PDF Downloads 276
25488 A Collection of Voices on Higher Educational Access, Quality and Equity in Africa: A Systematic Review
Authors: Araba A. Z. Osei-Tutu, Ebenezer Odame, Joseph Bawa, Samuel Amponsah
Abstract:
Education is recognized as a fundamental human right and a catalyst for development. Despite progress in the provision of higher education on the African continent, there persist challenges with the tripartite areas of access, equity and quality. Therefore, this systematic review aimed at providing a comprehensive overview of conversations and voices of scholars on these three concepts in HE in Africa. The systematic review employed a thematic analysis approach, synthesizing findings from 38 selected sources. After a critical analysis of the sources included in the systematic review, deficits in access, quality, and equity were outlined, focusing on infrastructure, regional disparities, and privatization challenges. The review also revealed the weak enforcement of quality assurance measures. Strategies for improvement, proffered by the study, include expanding public sector HE, deregulating the educational sector, promoting open and distance learning, implementing preferential admission policies, and enhancing financial aid. This research contributes valuable insights for policymakers, educators, and stakeholders, fostering a collaborative approach to address challenges and promote holistic development in African higher education.
Keywords: access, equity, quality, higher education, Africa, systematic review, strategies
Procedia PDF Downloads 71
25487 A Pattern Recognition Neural Network Model for Detection and Classification of SQL Injection Attacks
Authors: Naghmeh Moradpoor Sheykhkanloo
Abstract:
A Structured Query Language Injection (SQLI) attack is a code injection technique in which malicious SQL statements are inserted into a given SQL database simply by using a web browser. Losing data, disclosing confidential information, or even changing the value of data are among the severe damages that an SQLI attack can cause to a given database. The SQLI attack has also been rated the number-one attack among the top ten web application threats of the Open Web Application Security Project (OWASP), an open community dedicated to enabling organisations to conceive, develop, acquire, operate, and maintain applications that can be trusted. In this paper, we propose an effective pattern recognition neural network model for detection and classification of SQLI attacks. The proposed model is built from three main elements: a Uniform Resource Locator (URL) generator, to generate thousands of malicious and benign URLs; a URL classifier, to 1) classify each generated URL as either benign or malicious and 2) classify the malicious URLs into different SQLI attack categories; and an NN model, to 1) detect whether a given URL is malicious or benign and 2) identify the type of SQLI attack for each malicious URL. The model is first trained and then evaluated using thousands of benign and malicious URLs. The results of the experiments are presented to demonstrate the effectiveness of the proposed approach.
Keywords: neural networks, pattern recognition, SQL injection attacks, SQL injection attack classification, SQL injection attack detection
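The URL-classification element can be sketched with character n-gram features and scikit-learn's multilayer perceptron standing in for the paper's pattern recognition NN; the URLs below are hypothetical, and the paper's URL generator is not reproduced.

```python
# Character n-gram features + MLP to separate benign URLs from SQLI URLs.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

urls = [
    "http://shop.example/item?id=42",                      # benign
    "http://shop.example/item?id=42 OR 1=1",               # tautology SQLI
    "http://shop.example/item?id=42; DROP TABLE users--",  # piggybacked SQLI
    "http://shop.example/search?q=shoes",                  # benign
]
labels = ["benign", "sqli", "sqli", "benign"]

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
)
model.fit(urls, labels)
print(model.predict(["http://shop.example/item?id=1 OR '1'='1'"]))
```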
Procedia PDF Downloads 469
25486 The Impact of Illegal Firearms Possession, Limited Security Staff and Porosity of Border on Human Security in Ipokia Local Government Area, Ogun State
Authors: Ogunmefun Folorunsho Muyideen, Aluko Tolulope Evelyn
Abstract:
One of the trending menaces faced in the world today centers on the porosity of borders and the proliferation of illegal weapons among state members without state authorization. The proliferation of weapons along porous borders remains a germane and unsolved question among developed and developing nations due to the crises the menace generates (loss of lives and property, traumatization, civil unrest, and retrogressive economic development). A mixed-method design was adopted, with the survey method used for the selection of communities (Oke-Odan, Ajilete, Illaise, Lanlate) in Ipokia Local Government as the sampling frame. Multi-stage sampling was employed to break the site down into wards, streets, and house numbers before randomized face-to-face administration of the questionnaires, while purposive sampling was used for collecting verbal information through in-depth interviews. The population of the site is 150,398, and the sample size of 399 was derived using Yamane's sample size formula. Of the retrieved structured questionnaires, 346 were found useful, while 30 participants were additionally interviewed using the in-depth interview technique. The result for the first hypothesis shows a composite relationship between the variables tested (independent and dependent): porosity of the border, illegal possession of guns, and limited security staff jointly predispose residents of the selected study site to insecurity. The result for the second hypothesis shows that illegal gun possession (the independent variable) predicts business outcomes among residents of the study site, because sporadic gunfire depresses business activities in the study area. The third result indicates that the porosity of borders predicts social bonding networks, because a high level of insecurity destroys trust in communication among residents of the study area. The final question gives comprehensive meaning to one of the recommendations derived using systematic content analysis: out of the 30 participants interviewed, 18 submitted that involving individuals in monitoring communities will solve the problem, 7 opined that government agents should be trained for effective combat, 3 submitted that the fight is for both government and citizens, and 2 claimed that there must be an agreement between Nigeria and neighbouring countries on border security. International donors must strictly control the sale of weapons to unauthorized persons, and criminal cases must be treated with deterrence measures and target-hardening procedures such as decoying and blending, stakeouts, and sting tactics.
Keywords: human security, illegal weapons, porous borders, development
Procedia PDF Downloads 179
25485 Sustainability through Resilience: How Emergency Responders Cope with Stressors
Authors: Sophie Kroeling, Agnetha Schuchardt
Abstract:
Striving for sustainability brings many challenges in different fields of interest, e.g., security or health concerns. In Germany, civil protection is predominantly carried out by emergency responders who perform essential tasks of civil protection. Based on the theoretical concepts of different psychological stress theories, this contribution focuses on the question of how the resilience of emergency responders can be improved. The goal is to identify resources and successful coping strategies that help to prevent and reduce negative outcomes during or after stressful events. The paper presents results from a qualitative analysis of semi-structured interviews with 20 emergency responders. These results provide insights into the complexity of coping processes (e.g., controlling the situation, downplaying perceived personal threats through humor) and show the diversity of stressors (such as the complexity of the disastrous situation, intrusive press and media, or lack of social support within the organization). Self-efficacy expectation was a very important resource for coping with stressful situations. The results served as a starting point for a quantitative survey (conducted in March 2017), the development of education and training tools for emergency responders, and the improvement of critical incident stress management processes. First results from the quantitative study with more than 700 participants show, for example, that emergency responders use social coping both within their private social networks and within their aid organization, and that both are correlated with resilience. Moreover, missing information, bureaucratic problems, and social conflicts within the organization are events that the majority of participants considered very onerous. Further results from regression analysis will be presented. The paper combines findings from the qualitative study with the quantitative results, illustrating figures and correlations with respective statements from the interviews. Finally, suggestions for improving emergency responders' resilience are given, and it is discussed how this can contribute to civil security and, furthermore, to sustainable development.
Keywords: civil security, emergency responders, stress, resilience, resources
Procedia PDF Downloads 144
25484 A Web-Based Real Property Updating System for Efficient and Sustainable Urban Development: A Case Study in Ethiopia
Authors: Eyosiyas Aga
Abstract:
The development of information and communication technology has transformed paper-based mapping and land registration processes into computerized and networked systems. The computerization and networking of real property information systems play a vital role in good governance and the sustainable development of emerging countries through cost-effective, easy, and accessible service delivery for the customer. An efficient, transparent, and sustainable real property system is becoming basic infrastructure for urban development, improving data management and service delivery in organizations. In Ethiopia, real property administration is paper-based; as a result, it is confronted with problems of data management, illegal transactions, corruption, and poor service delivery. To solve these problems and facilitate the real property market, the implementation of a web-based real property updating system is crucial. Web-based real property updating is one of the automation (computerization) methods for facilitating data sharing and reducing the time and cost of service delivery in real property administration. In addition, it is useful for integrating data across different information systems and organizations. The system is designed by combining and customizing open source software supported by the Open Geospatial Consortium, whose standards, such as the Web Feature Service and Web Map Service, are the most widely used standards to support and improve web-based real property updating. These features allow the integration of data from different sources and can be used to maintain the consistency of data throughout transactions. PostgreSQL and GeoServer are used to manage real property data and connect it to the flex viewer and user interface. The system is designed for both the internal updating system (the municipality), which mainly updates spatial and textual information, and the external system (the customer), which focuses on providing services to and interacting with the customer. This research assessed the potential of open source web applications and adopted this technology for a real property updating system in Ethiopia in a simple, cost-effective, and secure way. The existing workflow for real property updating was analyzed to identify bottlenecks, and a new workflow was designed for the system. Requirements were identified through a questionnaire and a literature review, and the system was prototyped for the study area. The research mainly aimed to integrate human resources with technology in the design of the system to reduce data inconsistency and security problems. In addition, the research reflects on the current situation of real property administration and the contribution of an effective data management system to efficient, transparent, and sustainable urban development in Ethiopia.
Keywords: cadaster, real property, sustainable, transparency, web feature service, web map service
Procedia PDF Downloads 267
25483 Data Mining Approach for Commercial Data Classification and Migration in Hybrid Storage Systems
Authors: Mais Haj Qasem, Maen M. Al Assaf, Ali Rodan
Abstract:
Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in data-reading speed. As we ascend the hierarchy, data reading becomes faster. Thus, migrating the application's important data - the data that will be accessed in the near future - to the uppermost level reduces the application's I/O waiting time and hence its execution elapsed time. In this research, we implement a trace-driven, two-level parallel hybrid storage system prototype consisting of HDDs and SSDs. The prototype uses data mining techniques to classify the application's data in order to determine its near-future data accesses, in parallel with its on-demand requests. The important data are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach, integrated with data mining techniques, reduces the application execution elapsed time by at least 22% when using a variety of traces.
Keywords: hybrid storage system, data mining, recurrent neural network, support vector machine
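A simplified sketch of the classify-and-migrate step follows, assuming a decayed access-frequency score as the "importance" signal; per the paper's keywords, the actual classifiers are a recurrent neural network and a support vector machine, which are not reproduced here.

```python
# Toy hot-data classification for a two-level hybrid store: rank blocks by a
# decayed access-frequency score and migrate the top ones to the fast tier.
from collections import defaultdict

SSD_CAPACITY = 2                                # blocks the fast tier can hold


def hot_set(trace, alpha=0.7):
    """Exponentially decayed access-frequency score per block."""
    score = defaultdict(float)
    for block in trace:
        for b in score:
            score[b] *= alpha                   # older accesses fade
        score[block] += 1.0
    return set(sorted(score, key=score.get, reverse=True)[:SSD_CAPACITY])


trace = ["b1", "b2", "b1", "b3", "b1", "b3"]    # simplified block access trace
print("migrate to SSD:", hot_set(trace))        # predicted near-future blocks
```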
Procedia PDF Downloads 308
25482 Impact of Climate on Sugarcane Yield Over Belagavi District, Karnataka Using Statistical Model
Authors: Girish Chavadappanavar
Abstract:
The impact of climate on agriculture could result in problems with food security and may threaten the livelihood activities upon which much of the population depends. In the present study, a statistical yield forecast model has been developed for sugarcane production over Belagavi district, Karnataka, using weather variables of the crop-growing season and past observed yield data for the period 1971 to 2010. The study shows that this type of statistical yield forecast model can efficiently forecast yield 5 weeks, and even 10 weeks, in advance of the harvest for sugarcane, within an acceptable limit of error. The performance of the model in predicting yields at the district level for the sugarcane crop is quite satisfactory for both validation (2007 and 2008) and forecasting (2009 and 2010). In addition, the climate variability of the area has been studied, with the data series tested using the Mann-Kendall rank statistical test. The maximum and minimum temperatures were found to be significant, with opposite trends (a decreasing trend in maximum and an increasing trend in minimum temperature), while the other three variables were found insignificant, with differing trends (rainfall and evening-time relative humidity increasing, morning-time relative humidity decreasing).
Keywords: climate impact, regression analysis, yield and forecast model, sugar models
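The Mann-Kendall trend test used for the climate-variability analysis can be implemented in a few lines (without tie correction), as sketched below on hypothetical minimum-temperature data.

```python
# Mann-Kendall trend test (no tie correction): S statistic and normal Z score.
import math


def mann_kendall(x):
    n = len(x)
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])          # sign of each ordered pair
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance of S (no ties)
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    trend = "increasing" if z > 0 else "decreasing"
    significant = abs(z) > 1.96                # two-sided test at alpha = 0.05
    return s, round(z, 2), trend, significant


t_min = [18.2, 18.4, 18.3, 18.7, 18.9, 19.0, 19.2, 19.1, 19.4, 19.6]
print(mann_kendall(t_min))  # expected: significant increasing trend
```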
Procedia PDF Downloads 71
25481 Crime Victim Support Services in Bangladesh: An Analysis
Authors: Mohammad Shahjahan, Md. Monoarul Haque
Abstract:
In this research, information and data were collected from both direct and indirect sources, following numerical, qualitative, and participatory analysis methods. There were two principal sources of information and data: first, data provided by service recipients (300 women and child victims) at the Victim Support Centre and by service-providing police officers, executives, and staff (60); second, data collected from specialists, criminologists, and sociologists involved in victim support services through consultative interviews, KIIs, case studies, FGDs, etc. The initial data collection was completed with the help of questionnaires, with strategic variations, and guidelines. It should be noted that the main objective of this research was to determine whether the services provided to victims - facilities, treatment/medication, and rehabilitation by different government/non-government organizations - were genuinely effective. At the same time, the socio-economic background and demographic characteristics of the victims were also revealed through this research. The results show that although the number of victims has increased gradually due to socio-economic, political, and cultural realities in Bangladesh, the number of victim support centers has not increased as expected. Awareness among victims of the effectiveness of the 8 centers working in this regard is also not up to the mark: two-thirds of the victims who came for services were not aware of victim support services at all before receiving them. Most of those who were finally able to come under the services of the Victim Support Centre, through various means, received shelter (15.5%), medical services (13.32%), counseling services (13.10%), and legal aid (12.66%); the opportunity to stay in secure custody and psycho-physical services were also notable. Usually, women and children from relatively poor and marginalized families come to the victim support center for services. Among women, young unmarried women are the biggest victims of crime, and women and children employed as domestic workers are more affected. A number of serious negative impacts fall on the lives of the victims, among them deprivation of employment opportunities (26.62%), psychosomatic disorders (20.27%), and sexually transmitted diseases (13.92%). It appears urgent to enact distinct legislation, increase the number of victim support centers, expand the area and purview of services, and take initiatives to raise public awareness and create a mass movement.
Keywords: crime, victim, support, Bangladesh
Procedia PDF Downloads 89
25480 Discussion on Big Data and One of Its Early Training Applications
Authors: Fulya Gokalp Yavuz, Mark Daniel Ward
Abstract:
This study focuses on a contemporary and inevitable topic in data science and an exemplary application of it for early career building: big data and the Living Learning Community (LLC). Academia and industry share a common sense of the importance of big data; however, both are at risk of missing out on training in this interdisciplinary area. Some traditional teaching doctrines are far from effective for data science, and practitioners need intuition and real-life examples of how to apply new methods to data on the scale of terabytes. We outline the scope of data science training and exemplify its early-stage application with the LLC, a National Science Foundation (NSF) funded project under the supervision of Prof. Ward since 2014. Essentially, we aim to give professors, researchers, and practitioners some intuition for combining data science tools in comprehensive real-life examples, guided by mentees' feedback. By discussing mentoring methods and the computational challenges of big data, we intend to underline its potential.
Keywords: Big Data, computation, mentoring, training
Procedia PDF Downloads 362