Search results for: visual media and computer network etc
788 Understanding Cyber Kill Chains: Optimal Allocation of Monitoring Resources Using Cooperative Game Theory
Authors: Roy H. A. Lindelauf
Abstract:
Cyberattacks are complex processes consisting of multiple interwoven tasks conducted by a set of agents. Interdictions and defenses against such attacks often rely on cyber kill chain (CKC) models. A CKC is a framework that tries to capture the actions taken by a cyber attacker. There exists a growing body of literature on CKCs. Most of this work either a) describes the CKC with respect to one or more specific cyberattacks or b) discusses the tools and technologies used by the attacker at each stage of the CKC. Defenders, facing scarce resources, have to decide where to allocate their resources given the CKC and partial knowledge of the tools and techniques attackers use. In this presentation, CKCs are analyzed through the lens of covert projects, i.e., interrelated tasks that have to be conducted by agents (human and/or computer) with the aim of going undetected. Various aspects of covert project models have been studied extensively in the operations research and game theory domains, for instance resource-limited interdiction actions that maximally delay the completion time of a weapons project. This presentation investigates both cooperative and non-cooperative game theoretic covert project models and elucidates their relation to CKC modelling. To view a CKC as a covert project, each step in the CKC is broken down into tasks, and each player is capable of executing a subset of the tasks. Additionally, task inter-dependencies are represented by a schedule. Using multi-glove cooperative games, it is shown how a defender can optimize the allocation of scarce resources (what, where and how to monitor) against an attacker scheduling a CKC. This study presents and compares several cooperative game theoretic solution concepts as metrics for assigning resources to the monitoring of agents.
Keywords: cyber defense, cyber kill chain, game theory, information warfare techniques
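To make the cooperative-game machinery above concrete, here is a minimal Python sketch (not the authors' model; the three-player glove game and all names are illustrative assumptions) that computes Shapley values, one of the solution concepts that could serve as a monitoring-priority metric:

```python
# Illustrative sketch (not the authors' model): Shapley values for a small
# glove game, one cooperative solution concept usable as a monitoring metric.
# v(S) = number of complete left/right glove pairs a coalition can form.
from itertools import permutations

players = ["L1", "L2", "R1"]              # two left-glove holders, one right-glove holder
left, right = {"L1", "L2"}, {"R1"}

def v(coalition):
    s = set(coalition)
    return min(len(s & left), len(s & right))

def shapley(players, v):
    """Average marginal contribution of each player over all join orders."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = []
        for p in order:
            phi[p] += v(coalition + [p]) - v(coalition)
            coalition.append(p)
    return {p: value / len(orders) for p, value in phi.items()}

print(shapley(players, v))                # {'L1': 0.166..., 'L2': 0.166..., 'R1': 0.666...}
```

In a glove game, the scarce right-glove holder receives the largest share, mirroring the intuition that the agent whose capability is hardest to replace deserves the most monitoring attention.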
Procedia PDF Downloads 140
787 Ebola Virus Glycoprotein Inhibitors from Natural Compounds: Computer-Aided Drug Design
Authors: Driss Cherqaoui, Nouhaila Ait Lahcen, Ismail Hdoufane, Mehdi Oubahmane, Wissal Liman, Christelle Delaite, Mohammed M. Alanazi
Abstract:
The Ebola virus is a highly contagious and deadly pathogen that causes Ebola virus disease. The Ebola virus glycoprotein (EBOV-GP) is a key factor in viral entry into host cells, making it a critical target for therapeutic intervention. Using a combination of computational approaches, this study focuses on the identification of natural compounds that could serve as potent inhibitors of EBOV-GP. The 3D structure of EBOV-GP was selected, with missing residues modeled, and this structure was minimized and equilibrated. Two large natural compound databases, COCONUT and NPASS, were chosen and filtered based on toxicity risks and Lipinski’s Rule of Five to ensure drug-likeness. Following this, a pharmacophore model, built from 22 reported active inhibitors, was employed to refine the selection of compounds with a focus on structural relevance to known Ebola inhibitors. The filtered compounds were subjected to virtual screening via molecular docking, which identified ten promising candidates (five from each database) with strong binding affinities to EBOV-GP. These compounds were then validated through molecular dynamics simulations to evaluate their binding stability and interactions with the target. The top three compounds from each database were further analyzed using ADMET profiling, confirming their favorable pharmacokinetic properties, stability, and safety. These results suggest that the selected compounds have the potential to inhibit EBOV-GP, offering new avenues for antiviral drug development against the Ebola virus.Keywords: EBOV-GP, Ebola virus glycoprotein, high-throughput drug screening, molecular docking, molecular dynamics, natural compounds, pharmacophore modeling, virtual screening
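As an illustration of the drug-likeness filtering step mentioned above, a minimal sketch (assuming the RDKit library is available; the SMILES strings are generic placeholders, not compounds from COCONUT or NPASS) could apply Lipinski's Rule of Five as follows:

```python
# Illustrative sketch (not the study's pipeline): Lipinski's Rule of Five filter,
# assuming RDKit is installed; the SMILES strings below are generic placeholders.
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

def passes_lipinski(smiles):
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return False
    return (Descriptors.MolWt(mol) <= 500          # molecular weight
            and Descriptors.MolLogP(mol) <= 5      # lipophilicity
            and Lipinski.NumHDonors(mol) <= 5      # H-bond donors
            and Lipinski.NumHAcceptors(mol) <= 10) # H-bond acceptors

candidates = ["CC(=O)Oc1ccccc1C(=O)O", "CCO"]      # placeholder molecules
print([s for s in candidates if passes_lipinski(s)])
```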
Procedia PDF Downloads 21
786 Meta Model for Optimum Design Objective Function of Steel Frames Subjected to Seismic Loads
Authors: Salah R. Al Zaidee, Ali S. Mahdi
Abstract:
Except for simple problems of statically determinate structures, optimum design problems in structural engineering have implicit objective functions where structural analysis and design are essential within each searching loop. With these implicit functions, the structural engineer is usually forced to write his/her own computer code for analysis, design, and searching for the optimum design among many feasible candidates, and cannot take advantage of available software for structural analysis, design, and searching for the optimum solution. The meta-model is a regression model used to transform an implicit objective function into an explicit one, which in turn decouples the structural analysis and design processes from the optimum searching process. With the meta-model, well-known software for structural analysis and design can be used in sequence with optimum searching software. In this paper, the meta-model has been used to develop an explicit objective function for plane steel frames subjected to dead, live, and seismic forces. Frame topology is assumed to be predefined based on architectural and functional requirements. Column and beam sections and different connection details are the main design variables in this study. Columns and beams are grouped to reduce the number of design variables and to make the problem similar to that adopted in engineering practice. Data for the implicit objective function have been generated based on analysis and assessment of many design proposals with CSI SAP software. These data have then been used in SPSS software to develop a pure quadratic nonlinear regression model for the explicit objective function. Good correlations, with a coefficient R² in the range from 0.88 to 0.99, have been noted between the original implicit functions and the corresponding explicit functions generated with the meta-model.
Keywords: meta-model, objective function, steel frames, seismic analysis, design
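The pure quadratic meta-model can also be sketched outside SPSS; the following Python example (synthetic data and illustrative variable names, not the authors' frame dataset) fits a regression with linear and squared terms only and reports its R²:

```python
# Illustrative sketch (assumed workflow, not the authors' SPSS model): a pure
# quadratic meta-model y ~ b0 + sum(bi*xi) + sum(ci*xi^2) fitted to synthetic data
# standing in for the implicit objective values obtained from structural analysis.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 3))           # e.g. grouped column/beam section variables
true_lin, true_quad = np.array([2.0, -1.0, 0.5]), np.array([1.5, 0.3, -0.2])
y = 5 + X @ true_lin + (X ** 2) @ true_quad + rng.normal(0, 0.05, 200)

X_quad = np.hstack([X, X ** 2])                    # linear + squared terms, no cross terms
meta_model = LinearRegression().fit(X_quad, y)
print("R^2 =", meta_model.score(X_quad, y))        # plays the role of the reported 0.88-0.99 range
```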
Procedia PDF Downloads 243
785 Business Program Curriculum with Industry-Recognized Certifications: An Empirical Study of Exam Results and Program Curriculum
Authors: Thomas J. Bell III
Abstract:
Pursuing a business degree is fraught with perplexing questions regarding the rising tuition cost and the immediate value of earning a degree. Any decision to pursue an undergraduate business degree is perceived to have value if it facilitates post-graduate job placement. Business programs have decreased value in the absence of innovation in business programs that close the skills gap between recent graduates and employment opportunities. Industry-based certifications are seemingly becoming a requirement differentiator among job applicants. Texas Wesleyan University offers a Computer Information System (CIS) program with an innovative curriculum that integrates industry-recognized certification training into its traditional curriculum with core subjects and electives. This paper explores a culture of innovation in the CIS business program curriculum that creates sustainable stakeholder value for students, employers, the community, and the university. A quantitative research methodology surveying over one-hundred students in the CIS program will be used to examine factors influencing the success or failure of students taking certification exams. Researchers will analyze control variables to identify specific correlations between practice exams, teaching pedagogy, study time, age, work experience, etc. This study compared various exam preparation techniques to corresponding exam results across several industry certification exams. The findings will aid in understanding control variables with correlations that positively and negatively impact exam results. Such discovery may provide useful insight into pedagogical impact indicators that positively contribute to certification exam success and curriculum enhancement.Keywords: taking certification exams, exam training, testing skills, exam study aids, certification exam curriculum
Procedia PDF Downloads 88
784 Applying Big Data Analysis to Efficiently Exploit the Vast Unconventional Tight Oil Reserves
Authors: Shengnan Chen, Shuhua Wang
Abstract:
Successful production of hydrocarbon from unconventional tight oil reserves has changed the energy landscape in North America. The oil contained within these reservoirs typically will not flow to the wellbore at economic rates without assistance from advanced horizontal wells and multi-stage hydraulic fracturing. Efficient and economic development of these reserves is a priority of society, government, and industry, especially under the current low oil prices. Meanwhile, society needs technological and process innovations to enhance oil recovery while concurrently reducing environmental impacts. Recently, big data analysis and artificial intelligence have become very popular, developing data-driven insights for better designs and decisions in various engineering disciplines. However, the application of data mining in petroleum engineering is still in its infancy. This research aims to apply intelligent data analysis and data-driven models to exploit unconventional oil reserves both efficiently and economically. More specifically, a comprehensive database including the reservoir geological data, reservoir geophysical data, well completion data and production data for thousands of wells is first established to discover the valuable insights and knowledge related to tight oil reserves development. Several data analysis methods are introduced to analyze such a huge dataset. For example, K-means clustering is used to partition all observations into clusters; principal component analysis is applied to emphasize the variation and bring out strong patterns in the dataset, making the big data easy to explore and visualize; exploratory factor analysis (EFA) is used to identify the complex interrelationships between well completion data and well production data. Different data mining techniques, such as artificial neural networks, fuzzy logic, and machine learning techniques, are then summarized, and appropriate ones are selected to analyze the database based on prediction accuracy, model robustness, and reproducibility. Advanced knowledge and patterns are finally recognized and integrated into a modified self-adaptive differential evolution optimization workflow to enhance the oil recovery and maximize the net present value (NPV) of the unconventional oil resources. This research will advance the knowledge in the development of unconventional oil reserves and bridge the gap between big data and performance optimization in these formations. The newly developed data-driven optimization workflow is a powerful approach to guide field operations, which leads to better designs, higher oil recovery and greater economic return of future wells in the unconventional oil reserves.
Keywords: big data, artificial intelligence, enhance oil recovery, unconventional oil reserves
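As a small illustration of the clustering and dimensionality-reduction steps described above, the following sketch (synthetic well attributes, not the field database) combines standardization, K-means and PCA:

```python
# Illustrative sketch (synthetic well attributes, not the field database):
# standardize the data, cluster wells with K-means, and project them with PCA.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
wells = rng.normal(size=(1000, 6))   # columns could stand for porosity, lateral length, proppant, stages, ...

X = StandardScaler().fit_transform(wells)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
scores = PCA(n_components=2).fit_transform(X)      # strongest variation patterns for visualization
print(labels[:10], scores[:3])
```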
Procedia PDF Downloads 283
783 Electronic Commerce in Georgia: Problems and Development Perspectives
Authors: Nika GorgoShadze, Anri Shainidze, Bachuki Katamadze
Abstract:
In parallel with the development of the digital economy in the world, electronic commerce is also developing widely. The internet and ICT (information and communication technology) have created new business models and have promoted market consolidation, sustainability of the business environment, creation of the digital economy, facilitation of business and trade, business dynamism, higher competitiveness, etc. Electronic commerce relies on internet technology, with goods and services sold via the internet. Nowadays, electronic commerce is a field of business used very effectively by leading world brands. After research on the internet market in Georgia, it was found that the quality of the internet is high in Tbilisi and low in the regions. The internet market of Tbilisi can be characterized as a high-speed, competitive and cost-effective internet market. The development of electronic commerce in Georgia is connected with organizational and methodological as well as legal problems. First of all, a legal framework should be developed which will regulate the responsibilities of organizations. The Ministry of Economy and Sustainable Development will play a crucial role in creating this legal framework. The Ministry of Justice will also be involved in the process, as well as the agency for data exchange. Measures should be taken in order to make electronic commerce in Georgia easier. Business companies may be offered a model for obtaining low-cost and comprehensive service. A service centre should be created which will support all kinds of online shopping. This would be a rather interesting innovation which would facilitate online shopping in Georgia. The development of electronic business in Georgia requires a modernized telecommunications infrastructure (especially in the regions) as well as the solution of institutional and socio-economic problems. Issues concerning internet availability and computer skills are also important.
Keywords: electronic commerce, internet market, electronic business, information technology, information society, electronic systems
Procedia PDF Downloads 384
782 Design and Creation of a BCI Videogame for Training and Measure of Sustained Attention in Children with ADHD
Authors: John E. Muñoz, Jose F. Lopez, David S. Lopez
Abstract:
Attention Deficit Hyperactivity Disorder (ADHD) is a disorder that affects 1 out of 5 Colombian children, making it a real public health problem in the country. Conventional treatments such as medication and neuropsychological therapy have proved insufficient to decrease the high incidence levels of ADHD in the principal Colombian cities. This work presents the design and development of a videogame that uses a brain-computer interface not only as an input device but also as a tool to monitor neurophysiological signals. The videogame, named “The Harvest Challenge”, is set in the cultural context of a Colombian coffee grower, where the player can use his/her avatar in three mini games created in order to reinforce four fundamental abilities: i) the ability to wait, ii) the ability to plan, iii) the ability to follow instructions and iv) the ability to achieve objectives. The details of the collaborative design process of this multimedia tool, driven by the specific clinical needs, and the proposed interaction mechanisms based on the mental states of attention and relaxation are presented. The final videogame is presented as a tool for sustained attention training in children with ADHD, using as an action mechanism the neuromodulation of Beta and Theta waves through an electrode located in the central part of the frontal lobe of the brain. The processing of the electroencephalographic signal is performed automatically inside the videogame, allowing a report of the evolution of the theta/beta ratio to be generated - a biological marker which has been demonstrated to be a sufficient measure to discriminate between children with and without the deficit.
Keywords: BCI, neuromodulation, ADHD, videogame, neurofeedback, theta/beta ratio
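For readers unfamiliar with the biomarker, a minimal Python sketch (synthetic single-channel signal; the 256 Hz sampling rate and electrode placement are assumptions, not details taken from the study) shows how a theta/beta ratio can be estimated from Welch band powers:

```python
# Illustrative sketch (synthetic signal, not the game's code): estimating the
# theta/beta ratio of a single EEG channel from Welch band powers.
import numpy as np
from scipy.signal import welch

fs = 256                                            # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
eeg = np.random.default_rng(1).normal(size=t.size)  # placeholder for a real frontal channel

f, psd = welch(eeg, fs=fs, nperseg=fs * 2)

def band_power(f, psd, lo, hi):
    mask = (f >= lo) & (f < hi)
    return np.trapz(psd[mask], f[mask])

theta = band_power(f, psd, 4, 8)
beta = band_power(f, psd, 13, 30)
print("theta/beta ratio:", theta / beta)            # the biomarker tracked by the game
```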
Procedia PDF Downloads 371
781 An Ensemble System of Classifiers for Computer-Aided Volcano Monitoring
Authors: Flavio Cannavo
Abstract:
Continuous evaluation of the status of potentially hazardous volcanoes plays a key role for civil protection purposes. The importance of monitoring volcanic activity, especially for energetic paroxysms that usually come with tephra emissions, is crucial not only because of the exposure of the local population but also for airline traffic. Presently, real-time surveillance of most volcanoes worldwide is essentially delegated to one or more human experts in volcanology, who interpret data coming from different kinds of monitoring networks. Unfortunately, the high nonlinearity of the complex and coupled volcanic dynamics leads to a large variety of different volcanic behaviors. Moreover, continuously measured parameters (e.g. seismic, deformation, infrasonic and geochemical signals) are often not able to fully explain the ongoing phenomenon, thus making fast volcano state assessment a very puzzling task for the personnel on duty in the control rooms. With the aim of aiding the personnel on duty in volcano surveillance, here we introduce a system based on an ensemble of data-driven classifiers to infer automatically the ongoing volcano status from all the available kinds of measurements. The system consists of a heterogeneous set of independent classifiers, each one built with its own data and algorithm. Each classifier gives an output about the volcanic status. The ensemble technique allows weighting the single classifier outputs to combine all the classifications into a single status that maximizes the performance. We tested the model on the Mt. Etna (Italy) case study by considering a long record of multivariate data from 2011 to 2015 and cross-validated it. Results indicate that the proposed model is effective and of great power for decision-making purposes.
Keywords: Bayesian networks, expert system, Mount Etna, volcano monitoring
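The weighted combination of heterogeneous classifiers can be illustrated with a short sketch (toy data and scikit-learn estimators chosen purely for illustration; the actual system's classifiers and weights are not described here):

```python
# Illustrative sketch (toy data, not the monitoring system): a weighted soft-voting
# ensemble of heterogeneous classifiers; in practice the weights would reflect each
# classifier's validated performance on its own data stream.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)   # stand-in for multivariate monitoring data

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000)),
                ("nb", GaussianNB())],
    voting="soft",
    weights=[2, 1, 1],                                     # assumed per-classifier weights
)
print("cross-validated accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
```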
Procedia PDF Downloads 246
780 Finite Element Analysis of the Drive Shaft and Jacking Frame Interaction in Micro-Tunneling Method: Case Study of Tehran Sewerage
Authors: B. Mohammadi, A. Riazati, P. Soltan Sanjari, S. Azimbeik
Abstract:
The ever-increasing civic demands on the one hand, and the urban constraints on establishing new infrastructure on the other hand, force engineering committees to apply non-conflicting methods in order to optimize the results. One of these optimized procedures for establishing main sewerage networks is the pipe jacking and micro-tunneling method. The raw information and research are based on the slurry micro-tunneling project of the Tehran main sewerage network executed by the KAYSON Co. The 4985-meter route of the mentioned project, which is located near Azadi Square and the most vital arteries of Tehran, has currently reached 45% physical progress. The boring machine is made by Herrenknecht, and the diameters of the concrete-polymer pipes used are 1600 and 1800 millimeters. Placing and excavating several shafts in the ground and boring the tunnel directly between the axes of these shafts is one of the requirements of micro-tunneling. The positioning of the shafts should take into account the hydraulic circumstances, civic conditions, site geography, traffic considerations, etc. The profile length has to be converted into many shortened segment lines so that the angles generated between the segments are based at the manhole centers. Each segment line between two consecutive drive and reception shafts defines the jack location, driving angle and path alignment; thus, the diversity of these angles causes a variety of jack positions in the shaft. The jacking frame fixing conditions and the associated dynamic load direction produce various patterns of stress and strain distribution, creating fatigue in the shaft wall and the soil surrounding the shaft. This pattern diversification deforms the shaft wall and causes unbalanced subsidence and alteration of the pipe jacking stress contour. This research is based on the experience of Tehran's west sewerage plan and on numerical analysis of the interaction between the soil around the shaft, the shaft walls and the jacking frame direction; finally, the suitable or unsuitable locations of the pipe jacking shaft will be determined.
Keywords: underground structure, micro-tunneling, fatigue analysis, dynamic soil–structure interaction, underground water, finite element analysis
Procedia PDF Downloads 318
779 Virtual Reality and Avatars in Education
Authors: Michael Brazley
Abstract:
Virtual Reality (VR) and 3D videos are the most current generation of learning technology today. Virtual Reality and 3D videos are now being used in professional offices and schools for marketing and education. Technology in the field of design has progressed from two-dimensional drawings to 3D models, using computers and sophisticated software. Virtual Reality is being used as a collaborative means to allow designers and others to meet and communicate inside models or VR platforms using avatars. This research proposes to teach students from different backgrounds how to take a digital model into a 3D video, then into VR, and finally into VR with multiple avatars communicating with each other in real time. The next step would be to develop the model so that people from three or more different locations can meet as avatars in real time, in the same model, and talk to each other. This research is longitudinal, studying the use of 3D videos in graduate design and Virtual Reality in XR (Extended Reality) courses. The research methodology is a combination of quantitative and qualitative methods. The qualitative methods begin with the literature review and case studies. The quantitative methods come by way of students' 3D videos, surveys, and Extended Reality (XR) coursework. The end product is to develop a VR platform with multiple avatars being able to communicate in real time. This research is important because it will allow multiple users to remotely enter a model or VR platform from any location in the world and effectively communicate in real time. This research will lead to improved learning and training using Virtual Reality and avatars, and it is generalizable because most colleges and universities, and many citizens, own VR equipment and computer labs. This research did produce a VR platform with multiple avatars having the ability to move and speak to each other in real time. Major implications of the research include, but are not limited to, improved learning, teaching, communication, marketing, designing, planning, etc. Both hardware and software played a major role in project success.
Keywords: virtual reality, avatars, education, XR
Procedia PDF Downloads 98
778 Analysis of Fuel Adulteration Consequences in Bangladesh
Authors: Mahadehe Hassan
Abstract:
In most countries manufacturing, trading and distribution of gasoline and diesel fuels belongs to the most important sectors of national economy. For Bangladesh, a robust, well-functioning, secure and smartly managed national fuel distribution chain is an essential precondition for achieving Government top priorities in development and modernization of transportation infrastructure, protection of national environment and population health as well as, very importantly, securing due tax revenue for the State Budget. Bangladesh is a developing country with complex fuel supply network, high fuel taxes incidence and – till now - limited possibilities in application of modern, automated technologies for Government national fuel market control. Such environment allows dishonest physical and legal persons and organized criminals to build and profit from illegal fuel distribution schemes and fuel illicit trade. As a result, the market transparency and the country attractiveness for foreign investments, law-abiding economic operators, national consumers, State Budget and the Government ability to finance development projects, and the country at large suffer significantly. Research shows that over 50% of retail petrol stations in major agglomerations of Bangladesh sell adulterated fuels and/or cheat customers on the real volume of the fuel pumped into their vehicles. Other forms of detected fuel illicit trade practices include misdeclaration of fuel quantitative and qualitative parameters during internal transit and selling of non-declared and smuggled fuels. The aim of the study is to recommend the implementation of a National Fuel Distribution Integrity Program (FDIP) in Bangladesh to address and resolve fuel adulteration and illicit trade problems. The program should be customized according to the specific needs of the country and implemented in partnership with providers of advanced technologies. FDIP should enable and further enhance capacity of respective Bangladesh Government authorities in identification and elimination of all forms of fuel illicit trade swiftly and resolutely. FDIP high-technology, IT and automation systems and secure infrastructures should be aimed at the following areas (1) fuel adulteration, misdeclaration and non-declaration; (2) fuel quality and; (3) fuel volume manipulation at retail level. Furthermore, overall concept of FDIP delivery and its interaction with the reporting and management systems used by the Government shall be aligned with and support objectives of the Vision 2041 and Smart Bangladesh Government programs.Keywords: fuel adulteration, octane, kerosene, diesel, petrol, pollution, carbon emissions
Procedia PDF Downloads 74
777 Computer-Aided Depression Screening: A Literature Review on Optimal Methodologies for Mental Health Screening
Authors: Michelle Nighswander
Abstract:
Suicide can be a tragic response to mental illness. It is difficult for people to disclose or discuss suicidal impulses. The stigma surrounding mental health can create a reluctance to seek help for mental illness. Patients may feel pressure to exhibit a socially desirable demeanor rather than reveal these issues, especially if they sense their healthcare provider is pressed for time or does not have an extensive history with their provider. Overcoming these barriers can be challenging. Although there are several validated depression and suicide risk instruments, varying processes used to administer these tools may impact the truthfulness of the responses. A literature review was conducted to find evidence of the impact of the environment on the accuracy of depression screening. Many investigations do not describe the environment and fewer studies use a comparison design. However, three studies demonstrated that computerized self-reporting might be more likely to elicit truthful and accurate responses due to increased privacy when responding compared to a face-to-face interview. These studies showed patients reported positive reactions to computerized screening for other stigmatizing health conditions such as alcohol use during pregnancy. Computerized self-screening for depression offers the possibility of more privacy and patient reflection, which could then send a targeted message of risk to the healthcare provider. This could potentially increase the accuracy while also increasing time efficiency for the clinic. Considering the persistent effects of mental health stigma, how these screening questions are posed can impact patients’ responses. This literature review analyzes trends in depression screening methodologies, the impact of setting on the results and how this may assist in overcoming one barrier caused by stigma.Keywords: computerized self-report, depression, mental health stigma, suicide risk
Procedia PDF Downloads 129
776 Developing an Edutainment Game for Children with ADHD Based on SAwD and VCIA Model
Authors: Bruno Gontijo Batista
Abstract:
This paper analyzes how the Socially Aware Design (SAwD) and the Value-oriented and Culturally Informed Approach (VCIA) design model can be used to develop an edutainment game for children with Attention Deficit Hyperactivity Disorder (ADHD). The SAwD approach seeks a design that considers new dimensions in human-computer interaction, such as culture, aesthetics, and the emotional and social aspects of the user's everyday experience. From this perspective, the game development was based on the VCIA model, including the users in the design process through participatory methodologies and considering their behavioral patterns, culture, and values. This is because values, beliefs, and behavioral patterns influence how technology is understood and used and the way it impacts people's lives. This model can be applied at different stages of design, from clarifying the problem and organizing the requirements to the evaluation of the prototype and the final solution. Thus, this paper aims to understand how this model can be used in the development of an edutainment game for children with ADHD. In the area of education and learning, children with ADHD have difficulties both in behavior and in school performance, as they are easily distracted, which is reflected both in classes and on tests. Therefore, they must perform tasks that are exciting or interesting for them; once the pleasure center in the brain is activated, it reinforces the attention center, leaving the child more relaxed and focused. In this context, serious games have been used as part of the treatment of ADHD in children, aiming to improve focus and attention, stimulate concentration, and serve as a tool for improving learning in areas such as math and reading, combining education and entertainment (edutainment). As a result of the research, an edutainment game prototype for a mobile platform was developed in a participatory way, applying the VCIA model, for children between 8 and 12 years old.
Keywords: ADHD, edutainment, SAwD, VCIA
Procedia PDF Downloads 190
775 Automatic Aggregation and Embedding of Microservices for Optimized Deployments
Authors: Pablo Chico De Guzman, Cesar Sanchez
Abstract:
Microservices are a software development methodology in which applications are built by composing a set of independently deployable, small, modular services. Each service runs a unique process, and it gets instantiated and deployed on one or more machines (we assume that different microservices are deployed onto different machines). Microservices are becoming the de facto standard for developing distributed cloud applications due to their reduced release cycles. In principle, the responsibility of a microservice can be as simple as implementing a single function, which can lead to the following issues: resource fragmentation due to the virtual machine boundary, and poor communication performance between microservices. Two composition techniques can be used to optimize resource fragmentation and communication performance: aggregation and embedding of microservices. Aggregation allows the deployment of a set of microservices on the same machine using a proxy server. Aggregation helps to reduce resource fragmentation and is particularly useful when the aggregated services have a similar scalability behavior. Embedding deals with communication performance by deploying on the same virtual machine those microservices that require a communication channel (localhost bandwidth is reported to be about 40 times faster than cloud vendor local networks, and it offers better reliability). Embedding can also reduce dependencies on load balancer services since the communication takes place on a single virtual machine. For example, assume that microservice A has two instances, a1 and a2, and it communicates with microservice B, which also has two instances, b1 and b2. One embedding can deploy a1 and b1 on machine m1, and a2 and b2 on a different machine m2. This deployment configuration allows each pair (a1-b1), (a2-b2) to communicate using the localhost interface without the need for a load balancer between microservices A and B. Aggregation and embedding techniques are complex since different microservices might have incompatible runtime dependencies which forbid them from being installed on the same machine. There is also a security concern, since the attack surface between microservices can be larger. Luckily, container technology allows several processes to run on the same machine in an isolated manner, solving the incompatibility of runtime dependencies and the previous security concern, thus greatly simplifying aggregation/embedding implementations by just deploying a microservice container on the same machine as the aggregated/embedded microservice container. Therefore, a wide variety of deployment configurations can be described by combining aggregation and embedding to create an efficient and robust microservice architecture. This paper presents a formal method that receives a declarative definition of a microservice architecture and proposes different optimized deployment configurations by aggregating/embedding microservices. The first prototype is based on i2kit, a deployment tool also submitted to ICWS 2018. The proposed prototype optimizes the following parameters: network/system performance, resource usage, resource costs and failure tolerance.
Keywords: aggregation, deployment, embedding, resource allocation
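A toy sketch of the embedding idea (hypothetical data structures, not i2kit's actual input format) might pair communicating replicas onto shared machines like this:

```python
# Illustrative sketch (hypothetical structures, not i2kit's actual input format):
# a declarative architecture and a trivial planner that embeds replicas of
# communicating services onto the same machine, as in the (a1-b1), (a2-b2) example.
architecture = {
    "services": {"A": {"replicas": 2}, "B": {"replicas": 2}, "C": {"replicas": 1}},
    "channels": [("A", "B")],            # A and B need a fast communication channel
}

def plan_embedding(arch):
    """Pair up replicas of services that share a channel onto the same machine."""
    machines = []
    for left, right in arch["channels"]:
        pairs = min(arch["services"][left]["replicas"],
                    arch["services"][right]["replicas"])
        for i in range(pairs):
            machines.append({f"{left.lower()}{i + 1}", f"{right.lower()}{i + 1}"})
    return machines

print(plan_embedding(architecture))      # [{'a1', 'b1'}, {'a2', 'b2'}]
```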
Procedia PDF Downloads 203
774 Big Data Applications for the Transport Sector
Authors: Antonella Falanga, Armando Cartenì
Abstract:
Today, an unprecedented amount of data coming from several sources, including mobile devices, sensors, tracking systems, and online platforms, characterizes our lives. The term “big data” not only refers to the quantity of data but also to the variety and speed of data generation. These data hold valuable insights that, when extracted and analyzed, facilitate informed decision-making. The 4Vs of big data - velocity, volume, variety, and value - highlight essential aspects, showcasing the rapid generation, vast quantities, diverse sources, and potential value addition of these kinds of data. This surge of information has revolutionized many sectors, such as business for improving decision-making processes, healthcare for clinical record analysis and medical research, education for enhancing teaching methodologies, agriculture for optimizing crop management, finance for risk assessment and fraud detection, media and entertainment for personalized content recommendations, emergency for a real-time response during crisis/events, and also mobility for the urban planning and for the design/management of public and private transport services. Big data's pervasive impact enhances societal aspects, elevating the quality of life, service efficiency, and problem-solving capacities. However, during this transformative era, new challenges arise, including data quality, privacy, data security, cybersecurity, interoperability, the need for advanced infrastructures, and staff training. Within the transportation sector (the one investigated in this research), applications span planning, designing, and managing systems and mobility services. Among the most common big data applications within the transport sector are, for example, real-time traffic monitoring, bus/freight vehicle route optimization, vehicle maintenance, road safety and all the autonomous and connected vehicles applications. Benefits include a reduction in travel times, road accidents and pollutant emissions. Within these issues, the proper transport demand estimation is crucial for sustainable transportation planning. Evaluating the impact of sustainable mobility policies starts with a quantitative analysis of travel demand. Achieving transportation decarbonization goals hinges on precise estimations of demand for individual transport modes. Emerging technologies, offering substantial big data at lower costs than traditional methods, play a pivotal role in this context. Starting from these considerations, this study explores the usefulness impact of big data within transport demand estimation. This research focuses on leveraging (big) data collected during the COVID-19 pandemic to estimate the evolution of the mobility demand in Italy. Estimation results reveal in the post-COVID-19 era, more than 96 million national daily trips, about 2.6 trips per capita, with a mobile population of more than 37.6 million Italian travelers per day. Overall, this research allows us to conclude that big data better enhances rational decision-making for mobility demand estimation, which is imperative for adeptly planning and allocating investments in transportation infrastructures and services.Keywords: big data, cloud computing, decision-making, mobility demand, transportation
Procedia PDF Downloads 62
773 Strategies and Approaches for Curriculum Development and Training of Faculty in Cybersecurity Education
Authors: Lucy Tsado
Abstract:
As cybercrime and cyberattacks continue to increase, the need to respond will follow suit. When cybercrimes occur, the duty to respond sometimes falls on law enforcement. However, criminal justice students are not taught concepts in cybersecurity and digital forensics. There is, therefore, an urgent need for many more institutions to begin teaching cybersecurity and related courses to social science students especially criminal justice students. However, many faculty in universities, colleges, and high schools are not equipped to teach these courses or do not have the knowledge and resources to teach important concepts in cybersecurity or digital forensics to criminal justice students. This research intends to develop curricula and training programs to equip faculty with the skills to meet this need. There is a current call to involve non-technical fields to fill the cybersecurity skills gap, according to experts. There is a general belief among non-technical fields that cybersecurity education is only attainable within computer science and technologically oriented fields. As seen from current calls, this is not entirely the case. Transitioning into the field is possible through curriculum development, training, certifications, internships and apprenticeships, and competitions. There is a need to identify how a cybersecurity eco-system can be created at a university to encourage/start programs that will lead to an interest in cybersecurity education as well as attract potential students. A short-term strategy can address this problem through curricula development, while a long-term strategy will address developing training faculty to teach cybersecurity and digital forensics. Therefore this research project addresses this overall problem in two parts, through curricula development for the criminal justice discipline; and training of faculty in criminal justice to teaching the important concepts of cybersecurity and digital forensics.Keywords: cybersecurity education, criminal justice, curricula development, nontechnical cybersecurity, cybersecurity, digital forensics
Procedia PDF Downloads 105
772 An Experimental Approach to the Influence of Tipping Points and Scientific Uncertainties in the Success of International Fisheries Management
Authors: Jules Selles
Abstract:
The Atlantic and Mediterranean bluefin tuna fishery has been considered the archetype of an overfished and mismanaged fishery. This crisis has demonstrated the role of public awareness and the importance of the interactions between science and management regarding scientific uncertainties. This work aims at investigating the policy-making process associated with a regional fisheries management organization. We propose a contextualized computer-based experimental approach in order to explore the effects of key factors on the cooperation process in a complex straddling stock management setting. Namely, we analyze the effects of the introduction of a socio-economic tipping point and of the uncertainty surrounding the estimation of the resource level. Our approach is based on a Gordon-Schaefer bio-economic model which explicitly represents the decision-making process. Each participant plays the role of a stakeholder of ICCAT, represents a coalition of fishing nations involved in the fishery, and unilaterally decides on a harvest policy for the coming year. The context of the experiment induces the incentives for exploitation and collaboration to achieve common sustainable harvest plans at the scale of the Atlantic bluefin tuna stock. Our rigorous framework allows testing how stakeholders who plan the exploitation of a fish stock (a common pool resource) respond to two kinds of effects: i) the inclusion of a drastic shift in the management constraints (beyond a socio-economic tipping point) and ii) an increasing uncertainty in the scientific estimation of the resource level.
Keywords: economic experiment, fisheries management, game theory, policy making, Atlantic bluefin tuna
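For readers unfamiliar with the underlying bio-economic model, a minimal Python sketch of one Gordon-Schaefer time step (generic parameter values chosen for illustration, not the experiment's calibration) looks like this:

```python
# Illustrative sketch (generic parameters, not the experiment's calibration): one
# round of the Gordon-Schaefer dynamics underlying the game. Each coalition picks
# a fishing effort; the shared stock is updated with logistic growth minus harvest.
r, K, q = 0.3, 1.0, 1.0          # intrinsic growth rate, carrying capacity, catchability
price, cost = 10.0, 2.0          # price per unit harvest, cost per unit effort

def step(biomass, efforts):
    harvests = [q * e * biomass for e in efforts]
    profits = [price * h - cost * e for h, e in zip(harvests, efforts)]
    biomass += r * biomass * (1 - biomass / K) - sum(harvests)
    return max(biomass, 0.0), profits

stock = 0.5
stock, profits = step(stock, efforts=[0.05, 0.08, 0.03])   # three coalitions' unilateral choices
print("stock:", stock, "profits:", profits)
```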
Procedia PDF Downloads 253
771 Computing Machinery and Legal Intelligence: Towards a Reflexive Model for Computer Automated Decision Support in Public Administration
Authors: Jacob Livingston Slosser, Naja Holten Moller, Thomas Troels Hildebrandt, Henrik Palmer Olsen
Abstract:
In this paper, we propose a model for human-AI interaction in public administration that involves legal decision-making. Inspired by Alan Turing’s test for machine intelligence, we propose a way of institutionalizing a continuous working relationship between man and machine that aims at ensuring both good legal quality and higher efficiency in decision-making processes in public administration. We also suggest that our model enhances the legitimacy of using AI in public legal decision-making. We suggest that case loads in public administration could be divided between a manual and an automated decision track. The automated decision track will be an algorithmic recommender system trained on former cases. To avoid unwanted feedback loops and biases, part of the case load will be dealt with by both a human case worker and the automated recommender system. In those cases an experienced human case worker will have the role of an evaluator, choosing between the two decisions. This model will ensure that the algorithmic recommender system is not compromising the quality of the legal decision making in the institution. It also enhances the legitimacy of using algorithmic decision support because it provides justification for its use by being seen as superior to human decisions when the algorithmic recommendations are preferred by experienced case workers. The paper outlines in some detail the process through which such a model could be implemented. It also addresses the important issue that legal decision making is subject to legislative and judicial changes and that legal interpretation is context sensitive. Both of these issues requires continuous supervision and adjustments to algorithmic recommender systems when used for legal decision making purposes.Keywords: administrative law, algorithmic decision-making, decision support, public law
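The dual-track arrangement described above can be sketched as simple routing logic (all names, stand-in decision rules and the audit share are hypothetical; this is not the proposed system's implementation):

```python
# Illustrative sketch (hypothetical names and rules, not the proposed system):
# cases are routed to a manual or an automated track; a share of automated cases
# is also handled by a human case worker, and an evaluator picks between the two.
import random

random.seed(0)
AUDIT_SHARE = 0.2          # assumed fraction of automated cases double-handled by a human

def decide_case(case, recommender, case_worker, evaluator, automated=True):
    if not automated:
        return case_worker(case), "manual"
    machine_decision = recommender(case)
    if random.random() < AUDIT_SHARE:
        human_decision = case_worker(case)
        return evaluator(case, machine_decision, human_decision), "audited"
    return machine_decision, "automated"

# toy stand-ins for the real components
recommender = lambda c: "grant" if c["score"] > 0.5 else "deny"
case_worker = lambda c: "grant" if c["income"] < 30000 else "deny"
evaluator = lambda c, machine, human: human      # e.g. the experienced worker's pick prevails
print(decide_case({"score": 0.7, "income": 45000}, recommender, case_worker, evaluator))
```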
Procedia PDF Downloads 217
770 Assessing the Spatial Distribution of Urban Parks Using Remote Sensing and Geographic Information Systems Techniques
Authors: Hira Jabbar, Tanzeel-Ur Rehman
Abstract:
Urban parks and open spaces play a significant role in improving the physical and mental health of citizens, strengthen societies and make cities more attractive places to live and work. As the world's cities continue to grow, continuing to value green space in cities is vital but is also a challenge, particularly in developing countries where there is pressure for space, resources, and development. Offering an equal opportunity of accessibility to parks is one of the important issues of park distribution. The distribution of parks should allow all inhabitants to have a park in close proximity to their residence. Remote sensing and geographic information systems (GIS) can provide decision makers with enormous opportunities to improve the planning and management of park facilities. This study exhibits the capability of GIS and RS techniques to provide baseline knowledge about the distribution of parks and the level of accessibility, and to help in the identification of potential areas for such facilities. For this purpose, Landsat OLI imagery for the year 2016 was acquired from the USGS Earth Explorer. Preprocessing models were applied using Erdas Imagine 2014v for the atmospheric correction, and an NDVI model was developed and applied to quantify the land use/land cover classes, including built-up, barren land, water, and vegetation. The parks amongst the total public green spaces were selected based on their signature in the remote sensing image and their distribution. Percentages of total green space and park green space were calculated for each town of Lahore City, and the results were then compared with the recommended standards. The ANGSt model was applied to calculate accessibility to parks. Service area analysis was performed using the Network Analyst tool. The serviceability of these parks has been evaluated by employing statistical indices like service area, service population and park area per capita. Findings of the study may help town planners understand the distribution of parks, the demand for new parks and potential areas which are deprived of parks. The purpose of the present study is to provide the necessary information to planners, policy makers and scientific researchers in the process of decision making for the management and improvement of urban parks.
Keywords: accessible natural green space standards (ANGSt), geographic information systems (GIS), remote sensing (RS), United States Geological Survey (USGS)
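To illustrate the NDVI step, a short Python sketch (synthetic reflectance arrays rather than the Lahore Landsat scene; the 0.3 vegetation threshold is an assumption) is shown below:

```python
# Illustrative sketch (synthetic arrays, not the Lahore Landsat scene): NDVI from
# Landsat 8 OLI red (band 4) and near-infrared (band 5) reflectance rasters.
import numpy as np

rng = np.random.default_rng(7)
red = rng.uniform(0.02, 0.30, size=(100, 100))     # placeholder reflectance rasters
nir = rng.uniform(0.05, 0.60, size=(100, 100))

ndvi = (nir - red) / (nir + red + 1e-9)            # NDVI = (NIR - Red) / (NIR + Red)
vegetation_mask = ndvi > 0.3                       # assumed threshold for green cover
print("vegetation fraction:", vegetation_mask.mean())
```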
Procedia PDF Downloads 339
769 Theta-Phase Gamma-Amplitude Coupling as a Neurophysiological Marker in Neuroleptic-Naive Schizophrenia
Authors: Jun Won Kim
Abstract:
Objective: Theta-phase gamma-amplitude coupling (TGC) was used as a novel evidence-based tool to reflect the dysfunctional cortico-thalamic interaction in patients with schizophrenia. However, to the best of our knowledge, no studies have reported the diagnostic utility of the TGC in the resting-state electroencephalogram (EEG) of neuroleptic-naive patients with schizophrenia compared to healthy controls. Thus, the purpose of this EEG study was to understand the underlying mechanisms in patients with schizophrenia by comparing the TGC at rest between the two groups and to evaluate the diagnostic utility of TGC. Method: The subjects included 90 patients with schizophrenia and 90 healthy controls. All patients were diagnosed with schizophrenia according to the criteria of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) by two independent psychiatrists using semi-structured clinical interviews. Because patients were either drug-naïve (first episode) or had not been taking psychoactive drugs for one month before the study, we could exclude the influence of medications. Six frequency bands were defined for the spectral analyses: delta (1–4 Hz), theta (4–8 Hz), slow alpha (8–10 Hz), fast alpha (10–13.5 Hz), beta (13.5–30 Hz), and gamma (30–80 Hz). The spectral power of the EEG data was calculated with fast Fourier transformation using the 'spectrogram.m' function of the signal processing toolbox in Matlab. An analysis of covariance (ANCOVA) was performed to compare the TGC results between the groups, with the results adjusted using a Bonferroni correction (P < 0.05/19 = 0.0026). Receiver operating characteristic (ROC) analysis was conducted to examine the discriminating ability of the TGC data for schizophrenia diagnosis. Results: The patients with schizophrenia showed a significant increase in the resting-state TGC at all electrodes. The delta, theta, slow alpha, fast alpha, and beta powers showed low accuracies of 62.2%, 58.4%, 56.9%, 60.9%, and 59.0%, respectively, in discriminating the patients with schizophrenia from the healthy controls. The ROC analysis performed on the TGC data generated the most accurate result among the EEG measures, displaying an overall classification accuracy of 92.5%. Conclusion: As TGC includes phase, which contains information about neuronal interactions from the EEG recording, TGC is expected to be useful for understanding the mechanisms of the dysfunctional cortico-thalamic interaction in patients with schizophrenia. The resting-state TGC value was increased in the patients with schizophrenia compared to that in the healthy controls and had a higher discriminating ability than the other parameters. These findings may be related to the compensatory hyper-arousal patterns of the dysfunctional default-mode network (DMN) in schizophrenia. Further research exploring the association between TGC and medical or psychiatric conditions that may confound EEG signals will help clarify the potential utility of TGC.
Keywords: quantitative electroencephalography (QEEG), theta-phase gamma-amplitude coupling (TGC), schizophrenia, diagnostic utility
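As an aside on how such a coupling measure can be computed, here is a minimal Python sketch (synthetic signal and a mean-vector-length estimator; the study's exact TGC algorithm is not reproduced):

```python
# Illustrative sketch (synthetic signal; the study's exact algorithm is not reproduced):
# a mean-vector-length estimate of theta-phase gamma-amplitude coupling.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500
t = np.arange(0, 20, 1 / fs)
theta_wave = np.sin(2 * np.pi * 6 * t)
noise = 0.5 * np.random.default_rng(3).normal(size=t.size)
# gamma bursts locked to the theta peaks -> non-zero coupling by construction
eeg = theta_wave + (1 + theta_wave) * 0.3 * np.sin(2 * np.pi * 50 * t) + noise

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

theta_phase = np.angle(hilbert(bandpass(eeg, 4, 8, fs)))
gamma_amp = np.abs(hilbert(bandpass(eeg, 30, 80, fs)))
tgc = np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase)))   # mean vector length
print("TGC estimate:", tgc)
```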
Procedia PDF Downloads 143
768 Metabolic Changes during Reprogramming of Wheat and Triticale Microspores
Authors: Natalia Hordynska, Magdalena Szechynska-Hebda, Miroslaw Sobczak, Elzbieta Rozanska, Joanna Troczynska, Zofia Banaszak, Maria Wedzony
Abstract:
Albinism is a common problem encountered in wheat and triticale breeding programs, which require in vitro culture steps, e.g. the generation of doubled haploids via the androgenesis process. Genetic factors are a major determinant of albinism; however, environmental conditions such as temperature and media composition influence the frequency of albino plant formation. Cold incubation of wheat and triticale spikes induced a switch from gametophytic to sporophytic development. Further, androgenic structures formed from anthers of the genotypes susceptible to androgenesis or treated with cold stress had a pool of structurally primitive plastids, with small starch granules or swollen thylakoids. High temperature was a factor inducing androgenesis of wheat and triticale, but at the same time, it was a factor favoring the formation of albino plants. In genotypes susceptible to albinism or after heat stress conditions, cells formed from anthers were vacuolated, and plastids were eliminated. Partial or complete loss of chlorophyll pigments and incomplete differentiation of chloroplast membranes result in the formation of tissues or whole plants unable to perform photosynthesis. Indeed, susceptibility to the androgenesis process was associated with an increase in the total concentration of photosynthetic pigments in anthers, spikes and regenerated plants. The proper balance of the synthesis of the various pigments was the starting point for their proper incorporation into photosynthetic membranes. In contrast, genotypes resistant to the androgenesis process and those treated with heat contained a 100 times lower content of photosynthetic pigments. In particular, the synthesis of violaxanthin, zeaxanthin, lutein and chlorophyll b was limited. Furthermore, deregulation of starch and lipid synthesis, which led to the formation of very complex starch granules and an increased number of oleosomes, respectively, correlated with a reduction in the efficiency of androgenesis. The content of other sugars varied depending on the genotype and the type of stress. The highest content of various sugars was found for genotypes susceptible to androgenesis, and a highly reduced content for genotypes resistant to androgenesis. The most important sugars seem to be glucose and fructose. They are involved in sugar sensing and signaling pathways, which affect the expression of various genes and regulate plant development. Sucrose, on the other hand, seems to have a minor effect at each stage of androgenesis. The sugar metabolism was related to the metabolic activity of the microspores. The genotypes susceptible to the androgenesis process had much faster mitochondrion- and chloroplast-dependent energy conversion and higher heat production by tissues. Thus, the effectiveness of metabolic processes, their balance and their flexibility under stress were factors determining the direction of microspore development and, in the later stages of the androgenesis process, factors supporting the induction of androgenic structures, chloroplast formation and the regeneration of green plants. The work was financed by the Ministry of Agriculture and Rural Development within the program 'Biological Progress in Plant Production', project no. HOR.hn.802.15.2018.
Keywords: androgenesis, chloroplast, metabolism, temperature stress
Procedia PDF Downloads 260
767 Teachers' Disability Disclosure: A Multiple Perspective
Authors: N. Tal-Alon, O. Shapira-Lishchinsky
Abstract:
Disability disclosure is one of the most complicated dilemmas that people with invisible disabilities face. There are only a few research studies that have focused on the difficulties and dilemmas of teachers who have different disabilities. In addition, there are currently no research studies focusing specifically on the different aspects of disability disclosure, which are unique to teachers. This research has, therefore, broadened the knowledge base and understanding of the dilemma of disability disclosure among teachers with invisible physical disabilities. In addition, it has shed light on the ways this issue is perceived by different groups: the perspective of school principals, the perspective of colleagues, and the perspective of teachers with physical disabilities themselves. The study sample included 12 teachers with invisible physical disabilities, 10 school principals who employ at least one teacher with an invisible physical disability, and 10 professional colleagues of at least one teacher with an invisible physical disability. This particular research study was conducted using a qualitative approach through the Narralizer computer program based on a series of in-depth interviews. The data analysis was carried out by grouping major points of interest into specific categories and sub-categories. The findings of this research suggest that teachers with disabilities struggle with the dilemma of whether or not to reveal their disability to the school staff and to their students. It was found that there were considerable differences between the issues that faculty members considered regarding this dilemma and the ones that teachers with disabilities considered. While the principals and professional colleagues focused solely on their own interests, the teachers with a disability emphasized more on the ways that they might have a positive influence on their students, as well as their own individual interests. In addition, school principals on a whole tended to view negatively the option of disclosing the disability to the students and were often critical towards teachers who concealed their disability from the school staff. The importance of this research is in its potential to influence policy decisions that can be implemented by the Ministry of Education regarding the support system for teachers with invisible physical disabilities.Keywords: education, employment, invisible disabilities, teachers
Procedia PDF Downloads 102
766 Isolation of Bacterial Species with Potential Capacity for Siloxane Removal in Biogas Upgrading
Authors: Ellana Boada, Eric Santos-Clotas, Alba Cabrera-Codony, Maria Martin, Lluis Baneras, Frederic Gich
Abstract:
Volatile methylsiloxanes (VMS) are a group of man-made silicone compounds widely used in household and industrial applications that end up in the biogas produced through the anaerobic digestion of organic matter in landfills and wastewater treatment plants. The presence of VMS during biogas energy conversion can cause damage to the engines, reducing the efficiency of this renewable energy source. Non-regenerative adsorption onto activated carbon is the most widely used technology to remove siloxanes from biogas, while new trends point out that biotechnology offers a low-cost and environmentally friendly alternative to conventional technologies. The first objective of this research was to enrich, isolate and identify bacterial species able to grow using siloxane molecules as a sole carbon source: anoxic wastewater sludge was used as the initial inoculum in liquid anoxic enrichments, adding D4 (as a representative siloxane compound) previously adsorbed on activated carbon. After several months of acclimatization, the liquid enrichments were plated onto solid media containing D4, and thirty-four bacterial isolates were obtained. 16S rRNA gene sequencing allowed the identification of strains belonging to the following species: Ciceribacter lividus, Alicycliphilus denitrificans, Pseudomonas aeruginosa and Pseudomonas citronellolis, which are described as capable of degrading toxic volatile organic compounds. Kinetic assays with 8 representative strains revealed higher cell growth in the presence of D4 compared to the control. Our second objective was to characterize the composition and diversity of the microbial community present in the enrichments and to elucidate whether the isolated strains were representative members of the community or not. DNA samples were extracted, the 16S rRNA gene was amplified (515F & 806R primer pair), and the microbiome was analyzed from sequences obtained with a MiSeq PE250 platform. Results showed that the retrieved isolates represented only a minor fraction of the microorganisms present in the enrichment samples, in which Alpha-, Beta-, and Gammaproteobacteria were the dominant groups at the class level, thus suggesting that other microbial species and/or consortia may be important for D4 biodegradation. These results highlight the need for additional protocols for the isolation of relevant D4 degraders. Currently, we are developing molecular tools targeting key genes involved in siloxane biodegradation to identify and quantify the capacity of the isolates to metabolize D4 in batch cultures supplied with a synthetic gas stream of air containing 60 mg m⁻³ of D4 together with other volatile organic compounds found in the biogas mixture (i.e. toluene, hexane and limonene). The isolates were used as inoculum in a biotrickling filter containing lava rocks and activated carbon to assess their capacity for siloxane removal. Preliminary results of the biotrickling filter performance showed 35% siloxane biodegradation at a contact time of 14 minutes, denoting that biological siloxane removal is a promising technology for biogas upgrading.
Keywords: bacterial cultivation, biogas upgrading, microbiome, siloxanes
Procedia PDF Downloads 258765 ESRA: An End-to-End System for Re-identification and Anonymization of Swiss Court Decisions
Authors: Joel Niklaus, Matthias Sturmer
Abstract:
The publication of judicial proceedings is a cornerstone of many democracies. It enables the court system to be held accountable by ensuring that justice is administered in accordance with the law. Equally important is privacy, as a fundamental human right (Article 12 of the Universal Declaration of Human Rights). Therefore, it is important that the parties (especially minors, victims, or witnesses) involved in these court decisions be anonymized securely. Today, the anonymization of court decisions in Switzerland is performed either manually or semi-automatically using primitive software. While much research has been conducted on anonymization for tabular data, the literature on anonymization for unstructured text documents is thin and virtually non-existent for court decisions. In 2019, it was shown that manual anonymization is not secure enough: in 21 of 25 attempted Swiss federal court decisions related to pharmaceutical companies, the pharmaceuticals and legal parties involved could be manually re-identified. This was achieved by linking the decisions with external databases using regular expressions. An automated re-identification system serves as an automated test of the safety of existing anonymizations and thus promotes the right to privacy. Manual anonymization is also very expensive (recurring annual costs of over CHF 20M in Switzerland alone, according to one estimate). Consequently, many Swiss courts only publish a fraction of their decisions. An automated anonymization system reduces these costs substantially, leading to more capacity for publishing court decisions much more comprehensively. For the re-identification system, topic modeling with latent Dirichlet allocation is used to cluster over 500K Swiss court decisions into meaningful related categories. A comprehensive knowledge base with publicly available data (such as social media, newspapers, government documents, geographical information systems, business registers, online address books, obituary portals, web archives, etc.) is constructed to serve as an information hub for re-identifications. For the actual re-identification, a general-purpose language model is fine-tuned on the respective part of the knowledge base for each category of court decisions separately. The input to the model is the court decision to be re-identified, and the output is a probability distribution over named entities constituting possible re-identifications. For the anonymization system, named entity recognition (NER) is used to recognize the tokens that need to be anonymized. Since the focus lies on Swiss court decisions in German, a corpus of Swiss legal texts will be built for training the NER model. The recognized named entities are replaced by the category determined by the NER model and an identifier to preserve context. This work is part of an ongoing research project conducted by an interdisciplinary research consortium. Both a legal analysis and the implementation of the proposed system design ESRA will be performed within the next three years. This study introduces the system design of ESRA, an end-to-end system for re-identification and anonymization of Swiss court decisions. Firstly, the re-identification system tests the safety of existing anonymizations and thus promotes privacy. 
Secondly, the anonymization system substantially reduces the costs of manual anonymization of court decisions and thus introduces a more comprehensive publication practice.Keywords: artificial intelligence, courts, legal tech, named entity recognition, natural language processing, privacy, topic modeling
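As a concrete illustration of the anonymization step described in this abstract (recognized named entities replaced by their category plus an identifier to preserve context), the following Python sketch applies the same substitution logic with an off-the-shelf German spaCy model as a stand-in. The actual ESRA system relies on an NER model trained on a Swiss legal corpus, which is not reproduced here, so this should be read as a minimal illustration of the replacement scheme rather than the project's implementation.

```python
# Minimal sketch of NER-based anonymization: each recognized entity is
# replaced by its category and a stable identifier so that repeated mentions
# of the same party remain linkable. The generic German spaCy model is a
# placeholder; ESRA trains a domain-specific model on Swiss legal texts.
import spacy

nlp = spacy.load("de_core_news_sm")  # generic German model, not legal-domain

def anonymize(text: str) -> str:
    doc = nlp(text)
    ids: dict[str, str] = {}          # entity surface form -> placeholder
    out, last = [], 0
    for ent in doc.ents:
        if ent.text not in ids:
            ids[ent.text] = f"[{ent.label_}_{len(ids) + 1}]"
        out.append(text[last:ent.start_char])   # text before the entity
        out.append(ids[ent.text])               # consistent placeholder
        last = ent.end_char
    out.append(text[last:])
    return "".join(out)

print(anonymize("A. Muster klagte gegen die Beispiel AG vor dem Bundesgericht."))
```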
Procedia PDF Downloads 148764 A Comprehensive Theory of Communication with Biological and Non-Biological Intelligence for a 21st Century Curriculum
Authors: Thomas Schalow
Abstract:
It is commonly recognized that our present curriculum is not preparing students to function in the 21st century. This is particularly true in regard to communication needs across cultures, both human and non-human. In this paper, a comprehensive theory of communication based on communication with non-human cultures and intelligences is presented to meet the following three imminent contingencies: communicating with sentient biological intelligences, communicating with extraterrestrial intelligences, and communicating with artificial super-intelligences. The paper begins with the argument that we need to become much more serious about communicating with the non-human, intelligent life forms that already exist around us here on Earth. We need to broaden our definition of communication and reach out to other sentient life forms in order to provide humanity with a better perspective of its place within our ecosystem. The paper next examines the science and philosophy behind CETI (communication with extraterrestrial intelligences) and how it could prove useful even in the absence of contact with alien life. However, CETI’s assumptions and methodology need to be revised in accordance with the communication theory proposed in this paper if we are truly serious about finding and communicating with life beyond Earth. The final theme explored in this paper is communication with non-biological super-intelligences. Humanity has never been truly compelled to converse with other species, and our failure to seriously consider such intercourse has left us largely unprepared to deal with communication in a future that will be mediated and controlled by computer algorithms. Fortunately, our experience dealing with other cultures can provide us with a framework for this communication. The basic concepts behind intercultural communication can be applied to the three types of communication envisioned in this paper if we are willing to recognize that we are in fact dealing with other cultures when we interact with other species, alien life, and artificial super-intelligence. The ideas considered in this paper will require a new mindset for humanity, but a new disposition will yield substantial gains. A curriculum that is truly ready for the 21st century needs to be aligned with this new theory of communication.Keywords: artificial intelligence, CETI, communication, language
Procedia PDF Downloads 364763 Crude Glycerol Affects Canine Spermatozoa Motility: Computer Assisted Semen Analysis in Vitro
Authors: P. Massanyi, L. Kichi, T. Slanina, E. Kolesar, J. Danko, N. Lukac, E. Tvrda, R. Stawarz, A. Kolesarova
Abstract:
The aim of this study was to analyze the impact of crude glycerol on canine spermatozoa motility, morphology, viability, and membrane integrity. The experiments were performed in vitro. Semen from 5 large dog breeds was used; the dogs were typical representatives of large breeds, came from healthy rearing, were regularly vaccinated, and were integrated into further breeding. Semen was collected at the owners' premises and in the veterinary clinic, and the experiments were subsequently carried out at the Department of Animal Physiology of the SUA in Nitra. Spermatozoa motility was evaluated using a CASA analyzer (SpermVisionTM, Minitub, Germany) at temperatures of 5 and 37°C for 5 hours, and 13 motility parameters were evaluated. Overall, crude glycerol had a negative effect on spermatozoa motility. Morphological analysis was carried out using Hancock staining, and the preparations were evaluated at 1000x magnification using classification tables of morphologically changed spermatozoa. The data clearly showed the highest number of morphologically changed spermatozoa in the experimental groups (twisted tails, tail torsion, and tail coiling). Regarding acrosome alterations, swollen acrosomes, removed acrosomes, and acrosomes with undulated membranes were detected. The effect of crude glycerol on spermatozoa membrane integrity was also analyzed: the highest crude glycerol concentration significantly affected spermatozoa integrity. The results of this study show that crude glycerol has an effect on spermatozoa motility, viability, and membrane integrity, and that the detected changes are related to crude glycerol concentration, temperature, as well as time of incubation.Keywords: dog, semen, spermatozoa, acrosome, glycerol, CASA, viability
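The abstract does not detail how the SpermVision software derives its 13 motility parameters, but the basic CASA descriptors are computed from tracked sperm head coordinates in a standard way. The Python sketch below illustrates three of them, curvilinear velocity (VCL), straight-line velocity (VSL), and linearity (LIN), using made-up track coordinates; it is a generic illustration, not the vendor's algorithm.

```python
# Illustrative derivation of basic CASA motility descriptors from a tracked
# sperm head path (x, y in micrometres, sampled at a fixed frame rate).
# VCL, VSL and LIN are standard CASA parameters; the track below is invented.
import math

def casa_parameters(track: list[tuple[float, float]], fps: float) -> dict[str, float]:
    duration = (len(track) - 1) / fps                                   # seconds
    curvilinear = sum(math.dist(a, b) for a, b in zip(track, track[1:]))  # path length
    straight = math.dist(track[0], track[-1])                           # start-to-end
    vcl = curvilinear / duration            # curvilinear velocity, um/s
    vsl = straight / duration               # straight-line velocity, um/s
    lin = 100.0 * vsl / vcl if vcl else 0.0 # linearity, %
    return {"VCL": vcl, "VSL": vsl, "LIN": lin}

track = [(0.0, 0.0), (2.0, 1.5), (4.5, 0.5), (7.0, 2.0), (9.0, 1.0)]
print(casa_parameters(track, fps=60.0))
```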
Procedia PDF Downloads 319762 Smart Contracts: Bridging the Divide Between Code and Law
Authors: Abeeb Abiodun Bakare
Abstract:
The advent of blockchain technology has birthed a revolutionary innovation: smart contracts. These self-executing contracts, encoded within the immutable ledger of a blockchain, hold the potential to transform the landscape of traditional contractual agreements. This research paper embarks on a comprehensive exploration of the legal implications surrounding smart contracts, delving into their enforceability and their profound impact on traditional contract law. The first section of this paper delves into the foundational principles of smart contracts, elucidating their underlying mechanisms and technological intricacies. By harnessing the power of blockchain technology, smart contracts automate the execution of contractual terms, eliminating the need for intermediaries and enhancing efficiency in commercial transactions. However, this technological marvel raises fundamental questions regarding legal enforceability and compliance with traditional legal frameworks. Moving beyond the realm of technology, the paper proceeds to analyze the legal validity of smart contracts within the context of traditional contract law. Drawing upon established legal principles, such as offer, acceptance, and consideration, we examine the extent to which smart contracts satisfy the requirements for forming a legally binding agreement. Furthermore, we explore the challenges posed by jurisdictional issues as smart contracts transcend physical boundaries and operate within a decentralized network. Central to this analysis is the examination of the role of arbitration and dispute resolution mechanisms in the context of smart contracts. While smart contracts offer unparalleled efficiency and transparency in executing contractual terms, disputes inevitably arise, necessitating mechanisms for resolution. We investigate the feasibility of integrating arbitration clauses within smart contracts, exploring the potential for decentralized arbitration platforms to streamline dispute resolution processes. Moreover, this paper explores the implications of smart contracts for traditional legal intermediaries, such as lawyers and judges. As smart contracts automate the execution of contractual terms, the role of legal professionals in contract drafting and interpretation may undergo significant transformation. We assess the implications of this paradigm shift for legal practice and the broader legal profession. In conclusion, this research paper provides a comprehensive analysis of the legal implications surrounding smart contracts, illuminating the intricate interplay between code and law. While smart contracts offer unprecedented efficiency and transparency in commercial transactions, their legal validity remains subject to scrutiny within traditional legal frameworks. By navigating the complex landscape of smart contract law, we aim to provide insights into the transformative potential of this groundbreaking technology.Keywords: smart-contracts, law, blockchain, legal, technology
Procedia PDF Downloads 45761 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison
Authors: Xiangtuo Chen, Paul-Henry Cournéde
Abstract:
Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict the yield of corn based on meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their different modeling methodologies. The model-driven approaches are based on crop mechanistic modeling; they describe crop growth in interaction with the environment as dynamical systems. However, the calibration of such a dynamical system is difficult, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data and final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach to yield prediction is free of the complex biophysical process, but it places strict requirements on the dataset. A second contribution of the paper is the comparison of these model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso Regression, Principal Components Regression, and Partial Least Squares Regression) and machine learning methods (Random Forest, k-Nearest Neighbor, Artificial Neural Network, and SVM regression). The dataset consists of 720 records of corn yield at the county scale provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, the root mean square error of prediction (RMSEP) and the mean absolute error of prediction (MAEP), were used to evaluate the crop prediction capacity. The results show that among the data-driven approaches, Random Forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the ability to calibrate the mechanistic model from easily accessible datasets offers several side perspectives: the mechanistic model can potentially help to highlight the stresses suffered by the crop or to identify biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest
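The evaluation protocol described here, a random forest regressor assessed by 5-fold cross-validation with RMSEP and MAEP, is straightforward to reproduce in outline. The sketch below uses scikit-learn with synthetic data standing in for the 720 USDA county-scale records, which are not reproduced; it illustrates the protocol only, not the reported 4.27% result.

```python
# Sketch of the data-driven evaluation protocol: random forest regression
# scored by 5-fold cross-validation with RMSEP and MAEP. The features and
# yields below are synthetic placeholders for the USDA climate/yield data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
X = rng.normal(size=(720, 10))                                         # placeholder climatic predictors
y = 5.0 + X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=720)    # placeholder yields

rmsep, maep = [], []
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X[train], y[train])
    pred = model.predict(X[test])
    rmsep.append(mean_squared_error(y[test], pred) ** 0.5)   # RMSEP per fold
    maep.append(mean_absolute_error(y[test], pred))          # MAEP per fold

print(f"RMSEP = {np.mean(rmsep):.3f}, MAEP = {np.mean(maep):.3f}")
```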
Procedia PDF Downloads 231760 Predicting Polyethylene Processing Properties Based on Reaction Conditions via a Coupled Kinetic, Stochastic and Rheological Modelling Approach
Authors: Kristina Pflug, Markus Busch
Abstract:
Being able to predict polymer properties and processing behavior based on the applied operating reaction conditions is one of the key challenges in modern polymer reaction engineering. Especially for cost-intensive processes with high safety requirements, such as the high-pressure polymerization of low-density polyethylene (LDPE), the need for simulation-based process optimization and product design is high. A multi-scale modelling approach was set up and validated via a series of high-pressure mini-plant autoclave reactor experiments. The approach starts with the numerical modelling of the complex reaction network of the LDPE polymerization, taking into consideration the actual reaction conditions. While this gives average product properties, the complex polymeric microstructure, including random short- and long-chain branching, is calculated via a hybrid Monte Carlo approach. Finally, the processing behavior of LDPE, i.e., its melt flow behavior, is determined as a function of the previously determined polymeric microstructure using the branch-on-branch algorithm for randomly branched polymer systems. All three steps of the multi-scale modelling approach can be independently validated against analytical data. A triple-detector GPC containing an IR, a viscosimetry, and a multi-angle light scattering detector is applied; it serves to determine molecular weight distributions as well as chain-length-dependent short- and long-chain branching frequencies. 13C-NMR measurements give average branching frequencies, and rheological measurements in shear and extension serve to characterize the polymeric flow behavior. The agreement between experimental and modelled results was found to be remarkable, especially taking into consideration that the applied multi-scale modelling approach does not involve parameter fitting to the data. This validates the suggested approach and proves its universality at the same time. In the next step, the modelling approach can be applied to other reactor types, such as tubular reactors, or to industrial scale. Moreover, sensitivity analyses for systematically varied process conditions are easily feasible. The developed multi-scale modelling approach finally offers the opportunity to predict and design LDPE processing behavior simply based on process conditions such as feed streams and inlet temperatures and pressures.Keywords: low-density polyethylene, multi-scale modelling, polymer properties, reaction engineering, rheology
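As a rough illustration of the hybrid Monte Carlo idea sketched in this abstract, individual chains can be sampled from average quantities so that chain-length-dependent branching statistics emerge from the ensemble. The Python snippet below does this with an assumed mean chain length and an assumed long-chain-branching frequency; the actual approach samples the full kinetic reaction history and couples the result to the branch-on-branch rheology algorithm, none of which is reproduced here.

```python
# Highly simplified Monte Carlo sampling of branched chains. Both parameters
# below are assumed values for illustration, not results from the paper.
import random

random.seed(1)
MEAN_LENGTH = 2000        # assumed number-average backbone carbons per chain
LCB_PER_1000C = 2.0       # assumed long-chain branches per 1000 carbons

def sample_chain() -> tuple[int, int]:
    # chain length drawn from a Flory-like (exponential) distribution
    n_carbons = max(1, int(random.expovariate(1.0 / MEAN_LENGTH)))
    # each backbone carbon carries a branch point with a small probability
    branches = sum(random.random() < LCB_PER_1000C / 1000.0 for _ in range(n_carbons))
    return n_carbons, branches

chains = [sample_chain() for _ in range(2000)]
short = [b for n, b in chains if n <= MEAN_LENGTH]
long_ = [b for n, b in chains if n > MEAN_LENGTH]
print(f"mean LCB per chain, length <= {MEAN_LENGTH} C: {sum(short) / len(short):.2f}")
print(f"mean LCB per chain, length >  {MEAN_LENGTH} C: {sum(long_) / len(long_):.2f}")
```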
Procedia PDF Downloads 124759 Central Nervous System Lesion Differentiation in the Emergency Radiology Department
Authors: Angelis P. Barlampas
Abstract:
An 89-year-old woman came to the emergency department complaining of long-lasting headaches and nausea. A CT examination was performed, and a homogeneous midline anterior cranial fossa lesion was revealed, which was situated near the base and measured 2.4 cm in diameter. The patient was allergic, so an intravenous contrast injection could not be given on the spot, nor could an MRI examination be performed because of metallic implants. How could someone narrow down the differential diagnosis? The interhemispheric meningioma is usually a silent midline lesion with no edema and most often presents as a homogeneous, solid, isodense or slightly hyperdense mass (usually the smallest lesions, as in this case). Of them, 20-30% show some calcifications. Hyperostosis is typical for meningiomas that abut the base of the skull but is absent in the current case, presumably because of a more cephalad location, borderline away from the bone. Because further investigation could not be done, as the patient was allergic to the contrast media, other differential options had to be considered. Regarding the site of the lesion, the most common other entities to keep in mind are metastasis, tumor of the skull base, abscess, primary brain tumors, meningioma, giant aneurysm of the anterior cerebral artery, and olfactory neuroblastoma. A giant aneurysm of the anterior cerebral artery is a midline lesion whose appearance depends on whether it is non-thrombosed, partially thrombosed, or completely thrombosed; on non-contrast CT it appears as a slightly hyperdense, well-defined, round extra-axial mass that may demonstrate a peripheral calcified rim. Olfactory neuroblastoma is another midline lesion; the mass is of soft tissue attenuation and relatively homogeneous, focal calcifications are occasionally present, and when an intracranial extension is present, peritumoral cysts between it and the overlying brain are often seen. The final diagnosis was interhemispheric meningioma (known from the patient’s previous history). Meningiomas arise from the meningocytes or the arachnoid cells of the meninges. They are usually found incidentally, have an indolent course, and their most common location is extra-axial, parasagittal, and supratentorial. Other locations include the sphenoid ridge, olfactory groove, juxtasellar region, infratentorial and intraventricular locations, the pineal gland area, and the optic nerve. They are clinically silent entities, except for large ones, which can present with headaches, changes in personality status, paresis, or symptomatology according to their specific site, and may cause edema of the surrounding brain tissue. Imaging findings include the presence of calcifications, the CSF cleft sign, hyperostosis of adjacent bone, the dural tail, and the white matter buckling sign. After intravenous contrast injection, they enhance brightly and homogeneously, except for large ones, which may exhibit necrotic areas or may be heavily calcified. Malignant or cystic variants demonstrate more heterogeneity and less intense enhancement. Sometimes, it is inevitable that the needed CT protocol cannot be performed, especially in the emergency department. 
In these cases, the radiologist must focus on the characteristic imaging features of the unenhanced lesion, as well as on previous examinations or a known lesion history, in order to reach the correct conclusion in the report.Keywords: computed tomography, emergency radiology, metastasis, tumor of skull base, abscess, primary brain tumors, meningioma, giant aneurysm of the anterior cerebral artery, olfactory neuroblastoma, interhemispheric meningioma
Procedia PDF Downloads 69