Search results for: edge application framework
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12959

12569 Public Economic Efficiency and Case-Based Reasoning: A Theoretical Framework to Police Performance

Authors: Javier Parra-Domínguez, Juan Manuel Corchado

Abstract:

At present, public efficiency is a concept that aims to maximize the return on public investment by minimizing the use of resources and maximizing the outputs. The concept takes into account statistical criteria drawn up according to techniques such as DEA (Data Envelopment Analysis). The purpose of the current work is to consider, more precisely, the theoretical application of CBR (Case-Based Reasoning) from economics and computer science, as a preliminary step towards improving the efficiency of law enforcement agencies (public sector). With the aim of increasing the efficiency of the public sector, we have entered a phase whose main objective is the implementation of new technologies. Our main conclusion is that the application of computer techniques, such as CBR, has become key to the efficiency of the public sector, which continues to require economic valuation based on methodologies such as DEA. As a theoretical result and conclusion, the incorporation of CBR systems will reduce the number of inputs and increase, theoretically, the number of outputs generated, based on previous computational knowledge.

Keywords: case-based reasoning, knowledge, police, public efficiency

Procedia PDF Downloads 100
12568 Legal Theories Underpinning Access to Justice for Victims of Sexual Violence in Refugee Camps in Africa

Authors: O. E. Eberechi, G. P. Stevens

Abstract:

Legal theory has been referred to as the explanation of why things do or do not happen. It also describes situations and why they ensue. It provides a normative framework by which things are regulated and a foundation for the establishment of legal mechanisms/institutions that can bring about a desired change in a society. Furthermore, it offers recommendations in resolving practical problems and describes what the law is, what the law ought to be and defines the legal landscape generally. Some legal theories provide a universal standard, e.g. human rights, while others are capable of organizing and streamlining the collective use, and, by extension, bring order to society. Legal theory is used to explain how the world works and how it does not work. This paper will argue for the application of the principles of legal theory in the achievement of access to justice for female victims of sexual violence in refugee camps in Africa through the analysis of legal theories underpinning the access to justice for these women. It is a known fact that female refugees in camps in Africa often experience some form of sexual violation. The perpetrators of these incidents may never be apprehended, prosecuted, convicted or sentenced. Where prosecution does occur, the perpetrators are either acquitted as a result of poor investigation, inept prosecution or a lack of evidence, or the case may be dismissed owing to tardiness on the part of the prosecutor, which accounts for the culture of impunity in refugee camps. In other words, victims do not have access to the justice that could ameliorate their plight. There is, thus, a need for a legal framework that will facilitate access to justice for these victims. This paper will start with an introduction, followed by the definition of legal theory, its functions and its application in law. Secondly, it will provide a brief explanation of the problems faced by female refugees who are victims of sexual violence in refugee camps in Africa. Thirdly, it will embark on an analysis of theories that will help in understanding the precarious situation of female refugees, why they are violated, the need for access to justice for these victims, and the usefulness of the principles of legal theory in resolving access to justice for these victims.

Keywords: access to justice, underpinning legal theory, refugee, sexual violence

Procedia PDF Downloads 401
12567 Performance Evaluation of Routing Protocols for Video Conference over MPLS VPN Network

Authors: Abdullah Al Mamun, Tarek R. Sheltami

Abstract:

Video conferencing is in high demand nowadays because of its real-time characteristics, and fast communication is the primary requirement of this technology. Multi Protocol Label Switching (MPLS) IP Virtual Private Network (VPN) addresses this problem and can make communication faster than other techniques. This paper studies the performance of video traffic under two routing protocols, namely the Enhanced Interior Gateway Routing Protocol (EIGRP) and Open Shortest Path First (OSPF). The combination of traditional routing and MPLS improves the forwarding mechanism, scalability, and overall network performance. GNS3 and OPNET Modeler 14.5 are used to simulate many different scenarios, and metrics such as delay, jitter, and mean opinion score (MOS) are measured. The simulation results show that OSPF with BGP-MPLS VPN offers the best performance for video conferencing applications.
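
As an illustration of the measured metrics (not the authors' GNS3/OPNET models), the following sketch computes average one-way delay, an RFC 3550-style interarrival jitter estimate, and maps a hypothetical R factor to MOS using the standard E-model formula; all timestamps and values are made up.

```python
# Illustrative sketch (not the authors' GNS3/OPNET models): one-way delay,
# RFC 3550-style interarrival jitter, and the E-model mapping from an R factor to MOS.

def delay_and_jitter(send_ts, recv_ts):
    """Average one-way delay and RFC 3550 running jitter estimate (ms)."""
    delays = [r - s for s, r in zip(send_ts, recv_ts)]
    jitter = 0.0
    for i in range(1, len(delays)):
        jitter += (abs(delays[i] - delays[i - 1]) - jitter) / 16.0
    return sum(delays) / len(delays), jitter

def mos_from_r(r):
    """ITU-T G.107 E-model mapping from the transmission rating factor R to MOS."""
    if r < 0:
        return 1.0
    if r > 100:
        return 4.5
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

send = [0, 20, 40, 60, 80]          # hypothetical send timestamps (ms)
recv = [35, 58, 82, 101, 127]       # hypothetical receive timestamps (ms)
avg_delay, jitter = delay_and_jitter(send, recv)
print(f"avg delay = {avg_delay:.1f} ms, jitter = {jitter:.2f} ms")
print(f"MOS for an illustrative R factor of 80: {mos_from_r(80):.2f}")
```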

Keywords: OSPF, BGP, EIGRP, MPLS, Video conference, Provider router, edge router, layer3 VPN

Procedia PDF Downloads 315
12566 General Framework for Price Regulation of Container Terminals

Authors: Murat Yildiz, Burcu Yildiz

Abstract:

Price Cap Regulation is a form of economic regulation designed in the 1980s in the United Kingdom. Price cap regulation sets a cap on the price that the utility provider can charge. The cap is set according to several economic factors, such as the price cap index, expected efficiency savings and inflation. It has been used by several countries as a regulatory regime in several sectors. Container port privatization is still in its early stages in some countries, and the lack of a general framework can be an impediment to privatization. This paper aims at a general framework comprising the decisions to be made on the relevant variables, so that it can accommodate the variety of container terminals, as well as the several approaches that may be needed and the passage between those approaches.
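
A minimal sketch of the price-cap mechanism described above: the allowed tariff is updated each period with inflation minus an expected efficiency saving (the RPI-X form). All figures below are purely illustrative and not drawn from the paper or any regulator.

```python
# Minimal RPI-X style price cap update, as described in the abstract: the allowed
# tariff grows with inflation minus an expected efficiency saving (X).
# Figures are illustrative only.

def next_price_cap(current_cap, inflation, efficiency_saving_x):
    """Return the allowed tariff for the next regulatory period."""
    return current_cap * (1 + inflation - efficiency_saving_x)

cap = 100.0  # hypothetical current allowed handling charge per container
for year, (rpi, x) in enumerate([(0.04, 0.02), (0.03, 0.02), (0.05, 0.03)], start=1):
    cap = next_price_cap(cap, rpi, x)
    print(f"year {year}: allowed cap = {cap:.2f}")
```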

Keywords: Price Cap Regulation, ports privatization, container terminal price regime, earning sharing

Procedia PDF Downloads 331
12565 Deep-Learning to Generation of Weights for Image Captioning Using Part-of-Speech Approach

Authors: Tiago do Carmo Nogueira, Cássio Dener Noronha Vinhal, Gélson da Cruz Júnior, Matheus Rudolfo Diedrich Ullmann

Abstract:

Generating automatic image descriptions through natural language is a challenging task. Image captioning is a task that consistently describes an image by combining computer vision and natural language processing techniques. To accomplish this task, cutting-edge models use encoder-decoder structures. Thus, Convolutional Neural Networks (CNN) are used to extract the characteristics of the images, and Recurrent Neural Networks (RNN) generate the descriptive sentences of the images. However, cutting-edge approaches still suffer from problems of generating incorrect captions and accumulating errors in the decoders. To solve this problem, we propose a model based on the encoder-decoder structure, introducing a module that generates the weights according to the importance of the word to form the sentence, using the part-of-speech (PoS). Thus, the results demonstrate that our model surpasses state-of-the-art models.
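
A minimal numpy sketch of the core idea (not the authors' full encoder-decoder model): each word's cross-entropy loss is weighted by a factor derived from its part-of-speech tag, so that sentence-forming words count more during training. The tag-to-weight table and the toy caption are illustrative assumptions.

```python
# Sketch of PoS-weighted caption loss; the real weights would be generated by the
# proposed module, not looked up in a fixed table.
import numpy as np

# Hypothetical PoS-to-weight table (assumption for illustration).
POS_WEIGHT = {"NOUN": 1.5, "VERB": 1.3, "ADJ": 1.1, "DET": 0.7, "ADP": 0.7}

def pos_weighted_loss(probs, target_ids, pos_tags):
    """probs: (T, V) predicted word distributions; target_ids: gold word indices;
    pos_tags: PoS tag of each gold word."""
    losses = -np.log(probs[np.arange(len(target_ids)), target_ids] + 1e-12)
    weights = np.array([POS_WEIGHT.get(t, 1.0) for t in pos_tags])
    return float(np.mean(weights * losses))

# Toy example: vocabulary of 5 words, caption of 3 words.
probs = np.full((3, 5), 0.1)
probs[[0, 1, 2], [2, 0, 4]] = 0.6
print(pos_weighted_loss(probs, [2, 0, 4], ["DET", "NOUN", "VERB"]))
```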

Keywords: gated recurrent units, caption generation, convolutional neural network, part-of-speech

Procedia PDF Downloads 73
12564 Design and Implementation of Testable Reversible Sequential Circuits with Optimized Power

Authors: B. Manikandan, A. Vijayaprabhu

Abstract:

Conservative reversible gates are used to design reversible sequential circuits, namely flip-flops and latches. The conservative logic gates used are the Feynman, Toffoli, and Fredkin gates. This work presents the design of two-vector testable sequential circuits based on conservative logic gates: any sequential circuit built from these gates can be tested for classical unidirectional stuck-at faults using only two test vectors, all 1s and all 0s. Designs of two-vector testable latches, master-slave flip-flops, and double edge-triggered (DET) flip-flops are presented. We also show how the proposed approach achieves 100% fault coverage for single missing/additional cell defects in the quantum-dot cellular automata (QCA) layout of the Fredkin gate. The conservative logic gates are compared in terms of complexity, speed, and area.
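
The two-vector testing idea can be illustrated with a short sketch: the Fredkin gate is conservative (it preserves the number of 1s), so applying the all-0s and all-1s vectors and checking the 1-count gives a simple fault check. This is only an illustrative gate-level model, not the QCA layout used in the paper.

```python
# Minimal illustration of two-vector testing of a conservative gate.

def fredkin(c, a, b):
    """Conservative, reversible Fredkin gate: swap a and b when control c is 1."""
    return (c, b, a) if c == 1 else (c, a, b)

def check_two_vectors(gate, arity):
    """Apply the two test vectors (all 0s, all 1s) and confirm the 1-count is preserved."""
    for vec in ((0,) * arity, (1,) * arity):
        out = gate(*vec)
        assert sum(out) == sum(vec), f"number of 1s changed for input {vec}"
        print(f"{gate.__name__}{vec} -> {out}")

check_two_vectors(fredkin, 3)
```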

Keywords: DET, QCA, reversible logic gates, POS, SOP, latches, flip flops

Procedia PDF Downloads 280
12563 Competition in Kenya: The Legal and Institutional Framework and an Appraisal of Key Market Players

Authors: Edwin Njoroge Kimani, Alan M. Munyao

Abstract:

Despite Kenya’s status as a regional economic powerhouse, it struggles with economic shocks that leave consumers exposed. This, however, seems not to affect major corporations such as those in the telecommunications and energy sectors. Through their operations, they have not only been able to vary prices at will but have also been accused of preventing their rivals from penetrating the market. This study, through a literature review of the legal and institutional framework, reports, and publications, interrogates the law and uncovers the following: i) the failure of the legal framework to define market dominance and the abuse of such positions; ii) the participation of the state; iii) the inertia of the government in prosecuting corporations that abuse their market dominance; and iv) the role of the state as both a market player and a regulator through the Competition Authority of Kenya. The study concludes that the market distortion results from a weak legal and institutional framework as well as a conflict of interest on the part of the government. Little research has been conducted on competition law in the greater East African region; this research is intended to form part of the growing work in the field and to inform legal reform.

Keywords: competition law, economic power, dominance, Kenya

Procedia PDF Downloads 187
12562 Mean Shift-Based Preprocessing Methodology for Improved 3D Buildings Reconstruction

Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour

Abstract:

In this work we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data acquired by airborne scanners over densely built urban areas. On the one hand, high resolution image data corrupted by noise caused by lossy compression techniques are appropriately smoothed while preserving the optical edges; on the other, low resolution LiDAR data in the form of a normalized Digital Surface Map (nDSM) are upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and upsampling capabilities using synthetic RGB-z data show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology to the 3D reconstruction of buildings in a pilot region of Athens, Greece, results in a significant visual improvement of the 3D building block model.
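
A minimal grayscale sketch of edge-preserving mean shift filtering with a flat spatial/range kernel is shown below; the joint upsampling of the nDSM data is omitted, and the bandwidths and toy image are illustrative assumptions.

```python
# Flat-kernel approximation of mean shift filtering for edge-preserving smoothing.
import numpy as np

def mean_shift_filter(img, hs=2, hr=10.0, n_iter=5):
    """hs: spatial radius (pixels), hr: range (intensity) bandwidth."""
    out = img.astype(float).copy()
    rows, cols = img.shape
    for _ in range(n_iter):
        new = out.copy()
        for i in range(rows):
            for j in range(cols):
                i0, i1 = max(0, i - hs), min(rows, i + hs + 1)
                j0, j1 = max(0, j - hs), min(cols, j + hs + 1)
                window = out[i0:i1, j0:j1]
                mask = np.abs(window - out[i, j]) <= hr   # range kernel (flat)
                new[i, j] = window[mask].mean()           # shift toward the local mode
        out = new
    return out

# Toy image: a noisy step edge; smoothing flattens each side without blurring the edge.
img = np.hstack([np.full((8, 8), 50.0), np.full((8, 8), 200.0)])
img += np.random.default_rng(0).normal(0, 5, img.shape)
print(np.round(mean_shift_filter(img)[4, 4:12], 1))
```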

Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift

Procedia PDF Downloads 293
12561 Assessing the Efficacy of Artificial Intelligence Integration in the FLO Health Application

Authors: Reema Alghamdi, Rasees Aleisa, Layan Sukkar

Abstract:

The primary objective of this research is to conduct an examination of the Flo menstrual cycle application. We do this by evaluating the user experience and user satisfaction with the integrated AI features. The study gathers data from primary sources, primarily through surveys, to obtain insights about the application, such as its usability and functionality, in addition to overall user satisfaction. The focus of our project is directed particularly towards the impact and user perspectives regarding the integration of artificial intelligence features within the application, contributing to an understanding of the holistic user experience.

Keywords: period, women health, machine learning, AI features, menstrual cycle

Procedia PDF Downloads 40
12560 Application of Response Surface Methodology (RSM) for Optimization of Fluoride Removal by Using Banana Peel

Authors: Pallavi N., Gayatri Jadhav

Abstract:

Good quality water is of prime importance for healthy living. Fluoride is one such mineral present in water which causes many health problems in humans, especially in children. Fluoride is said to be a double-edged sword because both lower and higher concentrations of fluoride in drinking water can cause dental and skeletal fluorosis. Fluoride is one of the important minerals usually present at a higher concentration in ground water. Much research is being carried out on defluoridation methods. In the present research, fluoride removal is demonstrated using banana peel, a biowaste, as a biocoagulant. Response Surface Methodology (RSM) is a statistical design tool which is used to design the experiment. Central Composite Design (CCD) was used to determine the influence of the pH and dosage of the coagulant on the optimal removal of fluoride from a simulated water sample. A fluoride removal of 895 was obtained in an acidic pH range of 4 – 9 and a biocoagulant dosage of 18 – 20 mg/L.
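
The RSM step can be sketched as fitting a second-order response surface for removal as a function of pH and dosage and searching it for the optimum; the data points and coefficients below are hypothetical, not the study's measurements.

```python
# Illustrative quadratic (CCD-style) response surface fit for removal vs pH and dosage.
import numpy as np

ph      = np.array([4, 4, 6.5, 6.5, 9, 9, 6.5])
dose    = np.array([18, 20, 18, 20, 18, 20, 19])
removal = np.array([70, 75, 85, 88, 72, 74, 89])   # % removal (hypothetical values)

# Design matrix: y = b0 + b1*pH + b2*dose + b3*pH^2 + b4*dose^2 + b5*pH*dose
X = np.column_stack([np.ones_like(ph), ph, dose, ph**2, dose**2, ph * dose])
coef, *_ = np.linalg.lstsq(X, removal, rcond=None)

grid_ph, grid_dose = np.meshgrid(np.linspace(4, 9, 51), np.linspace(18, 20, 21))
G = np.column_stack([np.ones(grid_ph.size), grid_ph.ravel(), grid_dose.ravel(),
                     grid_ph.ravel()**2, grid_dose.ravel()**2,
                     grid_ph.ravel() * grid_dose.ravel()])
best = np.argmax(G @ coef)
print(f"predicted optimum: pH = {grid_ph.ravel()[best]:.1f}, "
      f"dose = {grid_dose.ravel()[best]:.1f} mg/L")
```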

Keywords: Fluoride, Response Surface Methodology, Dosage, banana peel

Procedia PDF Downloads 137
12559 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework

Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe

Abstract:

This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway and presents the design of a framework for the data privacy model and for real-time data analytics using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results obtained on the data privacy model show that when two or more data privacy models are combined, data privacy becomes stronger. The fog storage gateway also has several advantages over traditional cloud storage: the results show that fog storage reduces latency/delay, bandwidth consumption, and energy usage compared with cloud storage, and therefore helps to lessen excessive cost. The paper dwells on the system descriptions, with a focus on the research design and framework design for the data privacy model, data storage, and real-time analytics. It also presents the major system components and their framework specifications and, lastly, the overall system architecture, its structure, and its interrelationships.

Keywords: IoT, fog, cloud, data analysis, data privacy

Procedia PDF Downloads 71
12558 English Learning Speech Assistant Speak Application in Artificial Intelligence

Authors: Albatool Al Abdulwahid, Bayan Shakally, Mariam Mohamed, Wed Almokri

Abstract:

Artificial intelligence has infiltrated every part of our lives and every field we can think of. With technical developments, artificial intelligence applications are becoming more prevalent. We chose ELSA Speak because it is a magnificent example of an artificial intelligence application. ELSA Speak is a smartphone application that is free to download on both iOS and Android smartphones. It utilizes artificial intelligence to help non-native English speakers pronounce words and phrases similarly to a native speaker, as well as enhance their English skills. It employs speech-recognition technology that helps the application refine the pronunciation of its users. This remarkable feature distinguishes ELSA from other voice recognition algorithms and increases the efficiency of the application. This study focused on evaluating the ELSA Speak application by testing its degree of effectiveness based on survey questions. The results of the questionnaire were variable. The majority of the participants strongly agreed that ELSA has helped them enhance their pronunciation skills. However, a few participants were unconfident about the application’s ability to assist them in their learning journey.

Keywords: ELSA speak application, artificial intelligence, speech-recognition technology, language learning, english pronunciation

Procedia PDF Downloads 77
12557 An Efficient Encryption Scheme Using DWT and Arnold Transforms

Authors: Ali Abdrhman M. Ukasha

Abstract:

Data security is needed in data transmission, storage, and communication. The color image is decomposed into red, green, and blue channels. The blue and green channels are compressed using a 3-level discrete wavelet transform. The Arnold transform is used to change the locations of the red channel pixels as an image scrambling process. All channels are then encrypted separately using a key image of the same size as the original, generated using private keys and modulo operations. XOR and modulo operations are then performed between the encrypted channel images to change the image pixel values. The extracted contours of the recovered color image can be obtained with an accepted level of distortion using the Canny edge detector. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D color image and completely reconstruct it without any distortion, showing that the color image can be protected with a higher security level. The presented method is easy to implement in hardware and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
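
A minimal sketch of the Arnold scrambling step applied to the red channel is shown below; the DWT compression, key-image generation, and XOR/modulo stages are omitted, and the toy channel is illustrative.

```python
# Arnold (cat map) scrambling of a square channel: (x, y) -> (x + y, x + 2y) mod N.
import numpy as np

def arnold_scramble(channel, iterations=1):
    """Permute the pixel locations of a square channel with the Arnold transform."""
    n = channel.shape[0]
    out = channel.copy()
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = nxt
    return out

red = np.arange(64, dtype=np.uint8).reshape(8, 8)   # toy 8x8 red channel
scrambled = arnold_scramble(red, iterations=3)
print(scrambled[0, :4])
# The map is periodic, so enough further iterations recover the original exactly.
```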

Keywords: color image, wavelet transform, edge detector, Arnold transform, lossy image encryption

Procedia PDF Downloads 453
12556 Evaluation of Sustainable Blue Economy Development Performance: Method and Case

Authors: Mingbao Chen

Abstract:

After Rio+20, the blue economy has risen all over the world and has become a focus of national development. At present, the blue economy has become a new growth point in the global economy and the direction of ‘green’ development in the ocean. However, the key factors affecting the development of the blue economy have not been explored in depth, and the development policies and performance of the blue economy have not been scientifically evaluated. This cannot provide useful guidance for the development of the blue economy. Therefore, it is urgent to establish a quantitative evaluation framework to measure the performance of blue economic development. Based on a full understanding of the connotation and elements of the blue economy, and a study of the literature, this article builds a universal and operable evaluation index system, including ecological environment, social justice, sustainable growth, policy measures, and so on. It also establishes a sound evaluation framework for blue economic development performance. At the same time, the article takes China as a sample to test the adaptability of the framework and to assess the performance of China's blue economy.

Keywords: Blue economy, development performance, evaluation framework, assess method

Procedia PDF Downloads 224
12555 Optimisation of Structural Design by Integrating Genetic Algorithms in the Building Information Modelling Environment

Authors: Tofigh Hamidavi, Sepehr Abrishami, Pasquale Ponterosso, David Begg

Abstract:

Structural design and analysis is an important and time-consuming process, particularly at the conceptual design stage. Decisions made at this stage can have an enormous effect on the entire project, as it becomes ever costlier and more difficult to alter the choices made early on in the construction process. Hence, optimisation of the early stages of structural design can provide important efficiencies in terms of cost and time. This paper suggests a structural design optimisation (SDO) framework in which Genetic Algorithms (GAs) may be used to semi-automate the production and optimisation of early structural design alternatives. This framework has the potential to leverage conceptual structural design innovation in Architecture, Engineering and Construction (AEC) projects. Moreover, this framework improves the collaboration between the architectural stage and the structural stage. It will be shown that this SDO framework can make this achievable by generating the structural model based on the extracted data from the architectural model. At the moment, the proposed SDO framework is in the process of validation, involving the distribution of an online questionnaire among structural engineers in the UK.
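
A minimal GA sketch of the selection/crossover/mutation loop that such an SDO framework would semi-automate; the fitness function here is a stand-in (material cost plus a stress-like penalty), not the BIM-driven structural model of the paper.

```python
# Toy genetic algorithm: each gene is a member cross-section size to be optimised.
import random

def fitness(genes):
    area = sum(genes)                                        # proxy for material cost
    penalty = sum(max(0.0, 1.0 / g - 1.0) for g in genes)    # penalise undersized members
    return -(area + 10 * penalty)                            # higher is better

def evolve(pop_size=30, n_genes=6, generations=50):
    pop = [[random.uniform(0.5, 5.0) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                       # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_genes)               # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                        # mutation
                child[random.randrange(n_genes)] = random.uniform(0.5, 5.0)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print([round(g, 2) for g in evolve()])
```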

Keywords: building information, modelling, BIM, genetic algorithm, GA, architecture-engineering-construction, AEC, optimisation, structure, design, population, generation, selection, mutation, crossover, offspring

Procedia PDF Downloads 211
12554 Research and Development of Methodology, Tools, Techniques and Methods to Analyze and Design Interface, Media, Pedagogy for Educational Topics to be Delivered via Mobile Technology

Authors: Shimaa Nagro, Russell Campion

Abstract:

Mobile devices are becoming ever more widely available, with growing functionality, and they are increasingly used as enabling technology to give students access to educational material anytime and anywhere. However, the design of educational material's user interfaces for mobile devices is beset by many unresolved research problems such as those arising from constraints associated with mobile devices or from issues linked to effective learning. The proposed research aims to produce: (i) a method framework for the design and evaluation of educational material’s interfaces to be delivered on mobile devices, in multimedia form, based on Human Computer Interaction strategies; and (ii) a software tool implemented as a fast-track alternative to using the method framework in full. The investigation will combine qualitative and quantitative methods, including interviews and questionnaires for data collection and three case studies for validating the method framework. The method framework enables an educational designer to effectively and efficiently create educational multimedia interfaces to be used on mobile devices by following a particular methodology that contains practical and usable tools and techniques. The framework accepts any educational material in its final lesson-plan form and treats this plan as a static element: it does not suggest changes to any information given in the lesson plan, but it helps the instructor design the final lesson plan in a multimedia format to be presented on mobile devices.

Keywords: mobile learning, M-Learn, HCI, educational multimedia, interface design

Procedia PDF Downloads 346
12553 Runtime Monitoring Using Policy-Based Approach to Control Information Flow for Mobile Apps

Authors: Mohamed Sarrab, Hadj Bourdoucen

Abstract:

Mobile applications are verified to check their correctness or evaluated to check their performance with respect to specific security properties such as availability, integrity, and confidentiality. Ensuring these properties once the applications are made available to end users is achievable only to a limited degree using static software engineering verification techniques. The more sensitive the information being processed by a mobile application, such as credit card data, personal medical information or personal emails, the more important it is to ensure the confidentiality of this information. Monitoring a non-trusted mobile application during execution in an environment where sensitive information is present is difficult and unnerving. This paper addresses the issue of monitoring and controlling the flow of confidential information during non-trusted mobile application execution. The approach concentrates on providing a dynamic and usable information security solution by interacting with the mobile user during the run-time of the mobile application in response to information flow events.
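
A minimal sketch of the run-time idea: sensitive values are tagged, an outgoing send event is intercepted, and the user is asked before the data is released. The class and function names are illustrative assumptions, not the paper's implementation.

```python
# Toy run-time information-flow check with user interaction.

class Tagged:
    """A value carrying a confidentiality tag."""
    def __init__(self, value, confidential=False):
        self.value, self.confidential = value, confidential

def ask_user(event):
    # Stand-in for the interactive prompt shown to the mobile user at run-time.
    return input(f"Allow the app to {event}? [y/N] ").strip().lower() == "y"

def send_over_network(data: Tagged, destination: str):
    if data.confidential and not ask_user(f"send confidential data to {destination}"):
        raise PermissionError("information flow blocked by user policy")
    print(f"sent {data.value!r} to {destination}")

card = Tagged("4111-XXXX-XXXX-1234", confidential=True)
send_over_network(card, "analytics.example.com")   # triggers the run-time prompt
```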

Keywords: mobile application, run-time verification, usable security, direct information flow

Procedia PDF Downloads 358
12552 Client Hacked Server

Authors: Bagul Abhijeet

Abstract:

Background: The client-server model is the backbone of today’s internet communication, in which a normal user cannot have control over a particular website or server. Using the same processing model, however, one can gain unauthorized access to a particular server. In this paper, we discuss an application scenario of hacking a simple website or server, consisting of an unauthorized way to access the server database. This application autonomously takes direct access to a simple website or server and retrieves all essential information maintained by the administrator. In this system, the IP address of the server is given as input to retrieve the user-id and password of the server. This breaks the administrative security of the server and acquires control of the server database, whereas a virus helps to escape the server security by crashing the whole server. Objective: To control malicious attacks, protect government websites, and uncover illegal hacker activity. Results: After implementing different hacking as well as non-hacking techniques, the system hacks simple websites with normal security credentials. It provides access to the server database and allows an attacker to perform database operations from a client machine. The experimental results of this application on different servers are satisfactory. Conclusion: In this paper, we have presented a view of how to hack a server using some hacking as well as non-hacking methods. These algorithms and methods provide an efficient way to hack a server database. Breaking the network security in this way allows a new and better security framework to be introduced. The term “hacking” should not only be considered for its illegal activities; it should also be used to strengthen our global network.

Keywords: Hacking, Vulnerabilities, Dummy request, Virus, Server monitoring

Procedia PDF Downloads 229
12551 An Analysis of a Canadian Personalized Learning Curriculum

Authors: Ruthanne Tobin

Abstract:

The shift to a personalized learning (PL) curriculum in Canada represents an innovative approach to teaching and learning that is also evident in various initiatives across the 32-nation OECD. The premise behind PL is that empowering individual learners to have more input into how they access and construct knowledge, and express their understanding of it, will result in more meaningful school experiences and academic success. In this paper presentation, the author reports on a document analysis of the new curriculum in the province of British Columbia. Three theoretical frameworks are used to analyze the new curriculum. Framework 1 focuses on five dominant aspects (FDA) of PL at the classroom level. Framework 2 focuses on conceptualizing and enacting personalized learning (CEPL) within three spheres of influence. Framework 3 focuses on the integration of three types of knowledge (content, technological, and pedagogical). Analysis is ongoing, but preliminary findings suggest that the new curriculum addresses framework 1 quite well, which identifies five areas of personalized learning: 1) assessment for learning; 2) effective teaching and learning; 3) curriculum entitlement (choice); 4) school organization; and 5) “beyond the classroom walls” (learning in the community). Framework 2 appears to be less well developed in the new curriculum. This framework speaks to the dynamics of PL within three spheres of interaction: 1) nested agency, comprised of overarching constraints [and enablers] from policy makers, school administrators and community; 2) relational agency, which refers to a capacity for professionals to develop a network of expertise to serve shared goals; and 3) students’ personalized learning experience, which integrates differentiation with self-regulation strategies. Framework 3 appears to be well executed in the new PL curriculum, as it employs the theoretical model of technological, pedagogical content knowledge (TPACK) in which there are three interdependent bodies of knowledge. Notable within this framework is the emphasis on the pairing of technologies with excellent pedagogies to significantly assist students and teachers. This work will be of high relevance to educators interested in innovative school reform.

Keywords: curriculum reform, K-12 school change, innovations in education, personalized learning

Procedia PDF Downloads 255
12550 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm

Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy

Abstract:

IoT networks today solve various consumer problems, from home automation systems to aiding in driving autonomous vehicles, with the exploration of multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to aid the vehicle in reaching its destination safely and on time. IoT systems are predominantly dependent on the cloud environment for data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing the latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they might also store sensitive information required for a certain application. Data management in fog nodes is strenuous because fog networks are dynamic in terms of their availability and hardware capability. It becomes more challenging when the nodes in the network also have a short lifespan, detaching and joining frequently. When an end user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary to access and manage the data stored in the fog devices, as a conventional static way of managing the data does not work in fog networks. The proposed solution discusses a protocol that acts by defining sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the fog node by applying reinforcement learning, so that access to the data is determined dynamically based on the requests.
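
A minimal sketch of the sensitivity-level part of such a protocol: each record written to a fog node carries a sensitivity level, and a read is served only if the requester's clearance meets it. The reinforcement-learning policy that adapts these decisions over time is not shown here, and all names and levels are illustrative assumptions.

```python
# Toy fog-node store with sensitivity-level access checks.

LEVELS = {"public": 0, "internal": 1, "sensitive": 2}

class FogNode:
    def __init__(self):
        self.store = {}                                  # key -> (value, sensitivity level)

    def write(self, key, value, sensitivity="public"):
        self.store[key] = (value, LEVELS[sensitivity])

    def read(self, key, requester_clearance):
        value, level = self.store[key]
        if LEVELS[requester_clearance] < level:
            raise PermissionError(f"clearance '{requester_clearance}' below data level")
        return value

node = FogNode()
node.write("road_temp", 4.2, sensitivity="public")
node.write("vehicle_owner", "Alice", sensitivity="sensitive")
print(node.read("road_temp", "public"))                  # allowed
# node.read("vehicle_owner", "internal")                 # would raise PermissionError
```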

Keywords: IoT, fog networks, data stewardship, dynamic access policy

Procedia PDF Downloads 32
12549 Cyber Security Enhancement via Software Defined Pseudo-Random Private IP Address Hopping

Authors: Andre Slonopas, Zona Kostic, Warren Thompson

Abstract:

Obfuscation is one of the most useful tools to prevent network compromise. Previous research focused on the obfuscation of the network communications between external-facing edge devices. This work proposes the use of two edge devices, external and internal facing, which communicate via private IPv4 addresses in a software-defined pseudo-random IP hopping. This methodology does not require additional IP addresses and/or resources to implement. Statistical analyses demonstrate that the hopping surface must be at least 1e3 IP addresses in size with a broad standard deviation to minimize the possibility of coincidence of monitored and communication IPs. The probability of breaking the hopping algorithm requires a collection of at least 1e6 samples, which for large hopping surfaces will take years to collect. The probability of dropped packets is controlled via memory buffers and the frequency of hops and can be reduced to levels acceptable for video streaming. This methodology provides an impenetrable layer of security ideal for information and supervisory control and data acquisition systems.
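
A minimal sketch of the hopping idea: both edge devices derive the same time-slotted address from a shared secret over a private address block, so the communication IP changes every interval without exchanging extra messages. The secret, interval, and address block are illustrative assumptions.

```python
# Software-defined pseudo-random private IP hopping derived from a shared secret.
import hashlib
import ipaddress
import time

SHARED_SECRET = b"example-shared-secret"          # assumed pre-shared between the two edges
HOP_INTERVAL_S = 10
SURFACE = ipaddress.ip_network("10.0.0.0/20")     # 4096 private addresses (>= 1e3, per the abstract)

def current_hop_ip(secret=SHARED_SECRET, now=None):
    slot = int((now or time.time()) // HOP_INTERVAL_S)
    digest = hashlib.sha256(secret + slot.to_bytes(8, "big")).digest()
    offset = int.from_bytes(digest[:4], "big") % SURFACE.num_addresses
    return SURFACE[offset]

# Both peers compute the same address for the same time slot.
print(current_hop_ip(now=1_700_000_000))
print(current_hop_ip(now=1_700_000_000 + HOP_INTERVAL_S))   # next hop
```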

Keywords: moving target defense, cybersecurity, network security, hopping randomization, software defined network, network security theory

Procedia PDF Downloads 163
12548 Determinants of Success of University Industry Collaboration in the Science Academic Units at Makerere University

Authors: Mukisa Simon Peter Turker, Etomaru Irene

Abstract:

This study examined factors determining the success of University-Industry Collaboration (UIC) in the science academic units (SAUs) at Makerere University. This was prompted by concerns about weak linkages between industry and the academic units at Makerere University. The study examined institutional, relational, output, and framework factors determining the success of UIC in the science academic units at Makerere University. The study adopted a predictive cross-sectional survey design. Data was collected using a questionnaire survey from 172 academic staff from the six SAUs at Makerere University. Stratified, proportionate, and simple random sampling techniques were used to select the samples. The study used descriptive statistics and linear multiple regression analysis to analyze data. The study findings reveal a coefficient of determination (R-square) of 0.403 at a significance level of 0.000, suggesting that UIC success was 40.3% at a standardized error of estimate of 0.60188. The strength of association between institutional factors, relational factors, output factors, and framework factors, taking into consideration all interactions among the study variables, was 64% (R = 0.635). Institutional, relational, output and framework factors accounted for 34% of the variance in the level of UIC success (adjusted R2 = 0.338). The remaining variance of 66% is explained by factors other than institutional, relational, output, and framework factors. The standardized coefficient statistics revealed that relational factors (β = 0.454, t = 5.247, p = 0.000) and framework factors (β = 0.311, t = 3.770, p = 0.000) are the only statistically significant determinants of the success of UIC in the SAUs at Makerere University. Output factors (β = 0.082, t = 1.096, p = 0.275) and institutional factors (β = 0.023, t = 0.292, p = 0.771) turned out to be statistically insignificant determinants of the success of UIC in the science academic units at Makerere University. The study concludes that relational factors and framework factors positively and significantly determine the success of UIC, but output factors and institutional factors are not statistically significant determinants of UIC in the SAUs at Makerere University. The study recommends strategies to consolidate relational and framework factors to enhance UIC at Makerere University and further research on the effects of institutional and output factors on the success of UIC in universities.
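
For illustration, the kind of analysis reported above (standardised betas and R² from a linear multiple regression) can be sketched on synthetic data as follows; the generated numbers are not the survey responses.

```python
# Standardised multiple regression on synthetic data mimicking the four factor scores.
import numpy as np

rng = np.random.default_rng(1)
n = 172                                              # sample size reported in the abstract
institutional, relational = rng.normal(size=n), rng.normal(size=n)
output, framework = rng.normal(size=n), rng.normal(size=n)
uic_success = 0.45 * relational + 0.31 * framework + rng.normal(scale=0.8, size=n)

X = np.column_stack([institutional, relational, output, framework])
Xz = (X - X.mean(0)) / X.std(0)                      # z-score the predictors
yz = (uic_success - uic_success.mean()) / uic_success.std()

design = np.column_stack([np.ones(n), Xz])
beta, *_ = np.linalg.lstsq(design, yz, rcond=None)
pred = design @ beta
r2 = 1 - np.sum((yz - pred) ** 2) / np.sum((yz - yz.mean()) ** 2)
print("standardised betas:", np.round(beta[1:], 3), " R^2:", round(float(r2), 3))
```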

Keywords: university-industry collaboration, output factors, relational factors, framework factors, institutional factors

Procedia PDF Downloads 29
12547 Developing a Simulation-Based Optimization Framework to Perform Energy Simulation for Indian Buildings

Authors: Sujoy Anirudha Das, Albert Thomas

Abstract:

The building sector is a major consumer of energy globally, and it has corresponding effects on the environment with respect to carbon emissions. Given the fact that India is expected to add 40 billion square meters of new buildings by 2050, we need frameworks that help in reducing the overall energy consumption in the building sector. Even though several simulation-based frameworks that help in analyzing building energy consumption have been developed globally, in the Indian context, to the best of our knowledge, there is a lack of a comprehensive yet user-friendly framework to simulate and optimize the effects of various energy influencing factors, specifically for Indian buildings. Therefore, this study is aimed at developing a simulation-based optimization framework to model the energy interactions in different types of Indian buildings by considering the dynamic nature of various energy influencing factors. This comprehensive framework can be used by various building stakeholders to test the energy effects of different factors such as, but not limited to, the building materials, the orientation, the weather fluctuations, occupancy changes and the type of building (e.g., office, residential). The results from a case study involving several building types would help in gaining insights to build new energy-efficient buildings as well as to retrofit existing structures in a more convenient way so that they consume less energy, exclusively for an Indian scenario.

Keywords: building energy consumption, building energy simulations, energy efficient buildings, optimization framework

Procedia PDF Downloads 143
12546 Plasma Actuator Application to Control Surfaces of a Model Aircraft

Authors: Yuta Moriyama, Etsuo Morishita

Abstract:

A plasma actuator is very effective at recovering stalled flow over the upper surface of an airfoil. We first manufacture the actuator, test the stability of the device on a trial-and-error basis and find the conditions for steady operation. We visualize the flow around an airfoil in a smoke tunnel and observe the stall recovery. The plasma actuator is a stationary device with no moving parts, and it might be an ideal device to control a model aircraft. We can use the actuator not only as a stall recovery device but also as a spoiler. We put the actuator near the leading edge of an elevator of a model aircraft as a spoiler, and measure the aerodynamic forces with a three-component balance. We observe the effect of the plasma actuator on the aerodynamic forces and find that the device effectiveness changes depending on whether the angle of attack is positive or negative. We also visualize the flow caused by the plasma actuator using desktop Schlieren photography, which is otherwise very difficult in a low-speed wind tunnel experiment.

Keywords: aerodynamics, plasma actuator, model aircraft, wind tunnel

Procedia PDF Downloads 341
12545 Framework for Socio-Technical Issues in Requirements Engineering for Developing Resilient Machine Vision Systems Using Levels of Automation through the Lifecycle

Authors: Ryan Messina, Mehedi Hasan

Abstract:

This research examines the impacts of using data to generate performance requirements for automation in visual inspections using machine vision. These situations are intended for design and concern how projects can smooth the transfer of tacit knowledge to an algorithm. We propose a framework for specifying machine vision systems. This framework utilizes varying levels of automation as contingency planning to reduce data processing complexity. Using data assists in extracting tacit knowledge from those who can perform the manual tasks, to assist in designing the system; this means that real data from the system is always referenced, which minimizes errors between participating parties. We propose using three indicators to know whether the project has a high risk of failing to meet requirements related to accuracy and reliability. All systems tested achieved a better integration into operations after applying the framework.

Keywords: automation, contingency planning, continuous engineering, control theory, machine vision, system requirements, system thinking

Procedia PDF Downloads 175
12544 Edge Enhancement Visual Methodology for Fat Amount and Distribution Assessment in Dry-Cured Ham Slices

Authors: Silvia Grassi, Stefano Schiavon, Ernestina Casiraghi, Cristina Alamprese

Abstract:

Dry-cured ham is an uncooked meat product particularly appreciated for its peculiar sensory traits, among which the lipid component plays a key role in defining quality and, consequently, consumers’ acceptability. Usually, fat content and distribution are chemically determined by expensive, time-consuming, and destructive analyses. Moreover, different sensory techniques are applied to assess product conformity to desired standards. In this context, visual systems are getting a foothold in the meat market, envisioning more reliable and time-saving assessment of food quality traits. The present work aims at developing a simple but systematic and objective visual methodology to assess the fat amount of dry-cured ham slices, in terms of total, intermuscular and intramuscular fractions. To this aim, 160 slices from 80 PDO dry-cured hams were evaluated by digital image analysis and Soxhlet extraction. RGB images were captured by a flatbed scanner, converted into grey-scale images, and segmented based on intensity histograms as well as on a multi-stage algorithm aimed at edge enhancement. The latter was performed applying the Canny algorithm, which consists of image noise reduction, calculation of the intensity gradient for each image, spurious response removal, actual thresholding on corrected images, and confirmation of strong edge boundaries. The approach allowed for the automatic calculation of total, intermuscular and intramuscular fat fractions as percentages of the total slice area. Linear regression models were run to estimate the relationships between the image analysis results and the chemical data, thus allowing for the prediction of the total, intermuscular and intramuscular fat content from the dry-cured ham images. The goodness of fit of the obtained models was confirmed in terms of coefficient of determination (R²), hypothesis testing and pattern of residuals. Good regression models were found, with R² values of 0.73, 0.82, and 0.73 for the total fat, the sum of intermuscular and intramuscular fat, and the intermuscular fraction, respectively. In conclusion, the edge enhancement visual procedure led to a good fat segmentation, making the visual approach for the quantification of the different fat fractions in dry-cured ham slices simple, accurate and precise. The presented image analysis approach steers towards the development of instruments that can overcome destructive, tedious and time-consuming chemical determinations. As future perspectives, the results of the proposed image analysis methodology will be compared with those of sensory tests in order to develop a fast grading method of dry-cured hams based on fat distribution. Therefore, the system will be able not only to predict the actual fat content but will also reflect the visual appearance of samples as perceived by consumers.
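
A minimal OpenCV sketch of the edge-enhanced segmentation step: Canny edges are used to reinforce region boundaries before thresholding, and the fat fraction is reported as a percentage of the slice area. The file name, thresholds, and the bright-fat assumption are illustrative placeholders, not the paper's exact pipeline.

```python
# Canny-assisted fat segmentation on a grey-scale slice image (illustrative pipeline).
import cv2
import numpy as np

img = cv2.imread("ham_slice.png", cv2.IMREAD_GRAYSCALE)        # hypothetical scan
blur = cv2.GaussianBlur(img, (5, 5), 0)                         # noise reduction
edges = cv2.Canny(blur, 50, 150)                                # edge enhancement
edges = cv2.dilate(edges, np.ones((3, 3), np.uint8))            # strengthen boundaries

# Assume fat appears bright: Otsu threshold, with edges cutting the region borders.
_, fat_mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
fat_mask = cv2.bitwise_and(fat_mask, cv2.bitwise_not(edges))

total_fat_pct = 100.0 * np.count_nonzero(fat_mask) / img.size
print(f"total fat fraction = {total_fat_pct:.1f}% of slice area")
```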

Keywords: dry-cured ham, edge detection algorithm, fat content, image analysis

Procedia PDF Downloads 154
12543 A Use Case-Oriented Performance Measurement Framework for AI and Big Data Solutions in the Banking Sector

Authors: Yassine Bouzouita, Oumaima Belghith, Cyrine Zitoun, Charles Bonneau

Abstract:

A performance measurement framework (PMF) is an essential tool in any organization to assess the performance of its processes. It guides businesses to stay on track with their objectives and benchmark themselves against the market. With the growing trend of digital transformation of business processes, led by innovations in artificial intelligence (AI) and Big Data applications, developing a mature system capable of capturing the impact of digital solutions across different industries has become a necessity. Based on the conducted research, no such system has been developed in academia or industry. In this context, this paper covers a variety of methodologies for performance measurement, overviews the major AI and big data applications in the banking sector, and provides an exhaustive list of relevant metrics. Consequently, this paper is of interest to both researchers and practitioners. From an academic perspective, it offers a comparative analysis of the reviewed performance measurement frameworks. From an industry perspective, it offers exhaustive research, from market leaders, on the major applications of AI and Big Data technologies across the different departments of an organization. Moreover, it suggests a standardized classification model with a well-defined structure of intelligent digital solutions. This classification is mapped to a centralized library that contains an indexed collection of potential metrics for each application, arranged in a manner that facilitates the rapid search and retrieval of relevant metrics. The proposed framework is meant to guide professionals in identifying the most appropriate AI and big data applications to adopt and to help them meet their business objectives through understanding the potential impact of such solutions on the entire organization.

Keywords: AI and Big Data applications, impact assessment, metrics, performance measurement

Procedia PDF Downloads 174
12542 GynApp: A Mobile Application for the Organization and Control of Gynecological Studies

Authors: Betzabet García-Mendoza, Rocío Abascal-Mena

Abstract:

Breast and cervical cancer are among the leading causes of death of women in Mexico. The mortality rate for these diseases is alarming; even though there have been many campaigns to make people aware of the importance of conducting gynecological studies for timely prevention and detection, these have not been enough. This paper presents a mobile application for organizing and controlling gynecological studies in order to help and encourage women to take care of their bodies and health. The process of analyzing and designing the mobile application is presented, along with all the steps carried out by following a user-centered design methodology.

Keywords: breast cancer, cervical cancer, gynecological mobile application, paper prototyping, storyboard, women health

Procedia PDF Downloads 278
12541 A Machine Learning Based Method to Detect System Failure in Resource Constrained Environment

Authors: Payel Datta, Abhishek Das, Abhishek Roychoudhury, Dhiman Chattopadhyay, Tanushyam Chattopadhyay

Abstract:

Machine learning (ML) and deep learning (DL) are predominantly used in image/video processing, natural language processing (NLP), and audio and speech recognition, but not as much in system performance evaluation. In this paper, the authors describe the architecture of an abstraction layer constructed using ML/DL to detect system failure. The proposed system detects system failure by evaluating the performance metrics of an IoT service deployment in a constrained infrastructure environment. The system has been tested on a manually annotated data set containing different system metrics, such as number of threads, throughput, average response time, CPU usage, memory usage, and network input/output, captured in different hardware environments such as edge (Atom-based gateway) and cloud (AWS EC2). The main challenge of developing such a system is that the classification accuracy should be 100%, as an error in the system degrades the service performance and consequently affects the reliability and high availability that are mandatory for an IoT system. The proposed ML/DL classifiers achieve 100% accuracy on a data set of nearly 4,000 samples captured within the organization.
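
A minimal sketch (on synthetic data, not the annotated organisational data set) of a classifier that flags failure from the system metrics listed above:

```python
# Random forest failure classifier on synthetic system metrics.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000
X = np.column_stack([
    rng.integers(10, 200, n),        # number of threads
    rng.uniform(50, 500, n),         # throughput (req/s)
    rng.uniform(10, 800, n),         # average response time (ms)
    rng.uniform(5, 100, n),          # CPU usage (%)
    rng.uniform(10, 100, n),         # memory usage (%)
    rng.uniform(0.1, 50, n),         # network I/O (MB/s)
])
y = ((X[:, 2] > 600) | (X[:, 3] > 90)).astype(int)   # toy failure rule for illustration

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```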

Keywords: machine learning, system performance, performance metrics, IoT, edge

Procedia PDF Downloads 171
12540 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies - it will kill an estimated 7 million people every year, costing world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e. season, weekday/weekend), future weather forecasts, as well as past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
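
A minimal sketch of the ensemble step: the three model families are trained on the same features and their predicted probabilities are averaged. The features and data below are synthetic stand-ins for the Los Angeles datasets.

```python
# Average the predicted probabilities of logistic regression, random forest and a neural net.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.integers(0, 4, n),           # season
    rng.integers(0, 2, n),           # weekend flag
    rng.uniform(5, 40, n),           # forecast temperature (°C)
    rng.uniform(0, 80, n),           # yesterday's PM2.5 (µg/m³)
])
y = (0.5 * X[:, 3] + 5 * X[:, 0] + rng.normal(0, 5, n) > 35).astype(int)  # "unhealthy" day

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)
models = [LogisticRegression(max_iter=1000),
          RandomForestClassifier(n_estimators=200, random_state=42),
          MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=42)]
probs = np.mean([m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1] for m in models], axis=0)
print("ensemble accuracy:", np.mean((probs > 0.5).astype(int) == y_te))
```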

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 95