Search results for: third party monitoring software
6558 Consolidating Service Engineering Ontologies: Building Service Ontology from SOA Modeling Language (SoaML)
Authors: Purnomo Yustianto, Robin Doss, Suhardi, Novianto Budi Kurniawan
Abstract:
As a term for characterizing the process of devising a service system, 'service engineering' is still regarded as an 'open' research challenge due to unspecified details and conflicting perspectives. This paper presents consolidated service engineering ontologies that collect, specify, and define the relationships between components pertinent to the context of service engineering. The ontologies are built through literature surveys of collected conceptual works, collating various concepts into an integrated ontology. Two ontologies are produced: a general service ontology and a software service ontology. The software-service ontology is drawn from the informatics domain, while the generalized ontology of a service system is built from both business management and information system perspectives. The produced ontologies are verified by exercising conceptual operationalizations of the ontologies in adopting several service orientation features and service system patterns. The proposed ontologies are demonstrated to be sufficient to serve as a basis for a service engineering framework.
Keywords: engineering, ontology, service, SoaML
Procedia PDF Downloads 189
6557 An Intensional Conceptualization Model for Ontology-Based Semantic Integration
Authors: Fateh Adhnouss, Husam El-Asfour, Kenneth McIsaac, AbdulMutalib Wahaishi, Idris El-Feghia
Abstract:
Conceptualization is an essential component of semantic ontology-based approaches. Several approaches rely on an extensional structure or an extensional reduction structure in order to construct conceptualizations. In this paper, several limitations of these approaches are highlighted relating to their applicability to the construction of conceptualizations in dynamic and open environments. These limitations arise from a number of strong assumptions that do not hold in such environments. An intensional structure is argued to be a natural and adequate modeling approach. This paper presents a conceptualization structure based on the theory of properties, relations, and propositions (PRP) to model ontologies suitable for open environments. The model extends First-Order Logic (FOL) notation and defines a formal representation that enables interoperability between software systems and supports semantic integration for software systems in open, dynamic environments.
Keywords: conceptualization, ontology, extensional structure, intensional structure
Procedia PDF Downloads 115
6556 Empirical Investigation for the Correlation between Object-Oriented Class Lack of Cohesion and Coupling
Authors: Jehad Al Dallal
Abstract:
The design of the internal relationships among object-oriented class members (i.e., attributes and methods) and the external relationships among classes affects the overall quality of the object-oriented software. The degree of relatedness among class members is referred to as class cohesion, and the degree to which a class is related to other classes is called class coupling. Well-designed classes are expected to exhibit high cohesion and low coupling values. In this paper, using classes of three open-source Java systems, we empirically investigate the relation between class cohesion and coupling. In the empirical study, five lack-of-cohesion metrics and eight coupling metrics are considered. The empirical study results show that the class cohesion and coupling internal quality attributes are inversely correlated. The strength of the correlation highly depends on the cohesion and coupling measurement approaches.
Keywords: class cohesion measure, class coupling measure, object-oriented class, software quality
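As a rough illustration of the kind of analysis described above (not the metric suite or the Java systems used by the authors), the sketch below computes a rank correlation between hypothetical per-class lack-of-cohesion and coupling values; the class names and metric values are placeholders.

```python
# Illustrative sketch: correlating per-class lack-of-cohesion with coupling values.
# The metric values below are made-up placeholders, not data from the studied systems.
from scipy.stats import spearmanr

# Hypothetical measurements per class: (LCOM-style lack of cohesion, CBO-style coupling)
classes = {
    "OrderService":   (0.82, 14),
    "InvoiceBuilder": (0.65, 11),
    "UserRepository": (0.40, 7),
    "MathUtils":      (0.15, 2),
    "ReportExporter": (0.55, 9),
}

lack_of_cohesion = [v[0] for v in classes.values()]
coupling = [v[1] for v in classes.values()]

rho, p_value = spearmanr(lack_of_cohesion, coupling)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
# A positive correlation between *lack of cohesion* and coupling corresponds to the
# inverse correlation between cohesion and coupling reported in the abstract.
```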
Procedia PDF Downloads 236
6555 Translation and Adaptation of Computer Assisted ASPIRA Smoking Prevention Program in Romania
Authors: Z. Abram, V. Nadasan, J. Balint, J. L. Ferencz
Abstract:
Introduction: Online smoking prevention programs have become popular in recent years. In order to extend the use of such programs, existing applications can be adapted and translated into the native languages of the target groups. This is the first time such software has been implemented in Romania. Our goal was to provide a computer-aided intervention with attractive content targeting high school students who are familiar with information and communication technology. Material and methods: ASPIRA is the Romanian/Hungarian adapted version of a smoking prevention program created in the USA. Prior to applying the questionnaire and the ASPIRA online program, which contains five modules that include tests, videos, and interactive games, the program was tested in IT laboratories on a group of schoolchildren and students. The pilot study questionnaires were completed considering the opinions of the young people and the functionality of the software. Results: Over 90% of participants reported a good or very good impression of the ASPIRA program. Only a small minority found that the program included some parts that were too long or reported any technical problems regarding the functionality of the software. 76% of the participants had little or very little difficulty in understanding the messages presented by the English-speaking characters. Only 7.5% of the participants thought that the program included content that was not appropriate for the local culture. Conclusions: The vast majority of students reported favorable impressions of the ASPIRA online program. High school students and boys were more critical. Language and cultural barriers did not significantly reduce the effectiveness of the tested program.
Keywords: smoking prevention, ASPIRA online program, youth opinions, language/cultural barriers
Procedia PDF Downloads 260
6554 Evaluation of Deformation for Deep Excavations in the Greater Vancouver Area Through Case Studies
Authors: Boris Kolev, Matt Kokan, Mohammad Deriszadeh, Farshid Bateni
Abstract:
Due to the increasing demand for real estate and the need for efficient land utilization in Greater Vancouver, developers have been increasingly considering the construction of high-rise structures with multiple below-grade parking levels. The temporary excavations required to allow for the construction of underground levels have recently reached up to 40 meters in depth. One of the challenges with deep excavations is the prediction of wall displacements and ground settlements due to their effect on the integrity of city utilities, infrastructure, and adjacent buildings. A large database of survey monitoring data has been collected for deep excavations in various soil conditions and shoring systems. The majority of the data collected is for tie-back anchor and shotcrete lagging systems. The data were categorized and analyzed, and the results were evaluated to find a relationship between the most dominant parameters controlling the displacement, such as depth of excavation, soil properties, and tie-back anchor loading and arrangement. For a select number of deep excavations, finite element modeling was used for analysis. The lateral displacements from the simulation results were compared to the recorded survey monitoring data. The study concludes with a discussion and comparison of the available empirical and numerical modeling methodologies for evaluating lateral displacements in deep excavations.
Keywords: deep excavations, lateral displacements, numerical modeling, shoring walls, tieback anchors
Procedia PDF Downloads 181
6553 Initial Observations of the Utilization of Zoom Software for Synchronous English as a Foreign Language Oral Communication Classes at a Japanese University
Authors: Paul Nadasdy
Abstract:
In 2020, oral communication classes at many universities in Japan switched to online and hybrid lessons because of the coronavirus pandemic. Teachers had to adapt their practices immediately and deal with the challenges of the online environment. Even for experienced teachers, this presented a problem, as many had not conducted online classes before. Simultaneously, for many students this type of learning was completely alien, and they had to adapt to the challenges of communicating in English online. This study collected data from 418 first-year students in the first semester of English communication classes at a technical university in Tokyo, Japan. Zoom software was used throughout the learning period. Though there were many challenges in the setting up and implementation of Zoom classes at the university, the results indicated that the students enjoyed the format and made the most of the circumstances. This demonstrated the robustness of the course as taught in regular lessons and the adaptability of teachers and students to challenges within a very short timeframe.
Keywords: zoom, hybrid lessons, communicative english, online teaching
Procedia PDF Downloads 84
6552 Structural Morphing on High Performance Composite Hydrofoil to Postpone Cavitation
Authors: Fatiha Mohammed Arab, Benoit Augier, Francois Deniset, Pascal Casari, Jacques Andre Astolfi
Abstract:
For top high-performance foiling yachts, cavitation is often a limiting factor for take-off and top speed. This work investigates solutions to delay the onset of cavitation by means of structural morphing. The structural morphing is based on compliant leading and trailing edges, with an effect similar to flaps. It is shown here that the commonly accepted effect of flaps on the control of lift and drag forces can also be used to postpone the inception of cavitation. A numerical and experimental study is conducted in order to assess the effect of the geometric parameters of the hydrofoil on its hydrodynamic performance and on cavitation inception. The effect of a 70% trailing edge and a 30% leading edge on a NACA 0012 is investigated using the Xfoil software at a constant Reynolds number of 10^6. The simulations were carried out for a range of flap deflections and various angles of attack. The results showed that the lift coefficient increases with increasing flap deflection and with increasing angle of attack, and that the cavitation bucket is enlarged. To check the Xfoil results, a 2D flow analysis over a NACA 0012 with leading and trailing edge flaps was also performed using the Fluent software. The results of the two methods are in good agreement. To validate the numerical approach, a passive adaptive composite model was built and tested in the hydrodynamic tunnel at the Research Institute of the French Naval Academy. The model shows the ability to reproduce the effect of flaps through leading and trailing edge structural morphing under hydrodynamic loading.
Keywords: cavitation, flaps, hydrofoil, panel method, xfoil
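For a quick analytical cross-check of the flap effect on lift (this is the classical thin-airfoil estimate, not the Xfoil/Fluent modelling actually used in the study), the lift increment produced by deflecting a plain trailing-edge flap of chord fraction E by a small angle δ can be written as:

```latex
% Classical thin-airfoil estimate of the lift increment due to a trailing-edge
% flap of chord fraction E deflected by a small angle \delta (in radians).
\Delta C_l = 2\left[(\pi - \theta_h) + \sin\theta_h\right]\delta,
\qquad \cos\theta_h = 2E - 1
```

For E = 0.3, this gives roughly dC_l/dδ ≈ 4.2 per radian, which is why even modest compliant-edge deflections shift the lift curve, and hence the cavitation bucket, noticeably.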
Procedia PDF Downloads 176
6551 Land Use Change Detection Using Satellite Images for Najran City, Kingdom of Saudi Arabia (KSA)
Authors: Ismail Elkhrachy
Abstract:
Determination of land use change is an important component of regional planning for applications ranging from urban fringe change detection to monitoring change detection of land use. These data are very useful for natural resources management. On the other hand, the technologies and methods of change detection have evolved dramatically during the past 20 years, and it is well recognized that change detection based on multi-temporal remotely sensed data has become an effective method for researching the dynamic change of land use. The objective of this paper is to assess, evaluate and monitor land use change surrounding the area of Najran city, Kingdom of Saudi Arabia (KSA), using a Landsat image (June 23, 2009) and an ETM+ image (June 21, 2014). The post-classification change detection technique was applied. The two-date subset images of Najran city were compared on a pixel-by-pixel basis using the post-classification comparison method, the from-to change matrix was produced, and the land use change information was obtained. Three classes were obtained, urban, bare land and agricultural land, from an unsupervised classification method using Erdas Imagine and ArcGIS software. An accuracy assessment of the classification was performed before calculating change detection for the study area. The obtained accuracy is between 61% and 87% for all the classes. The change detection analysis shows that the urban area increased by 73.2%, the agricultural area decreased by 10.5%, and the barren area was reduced by 7% between 2009 and 2014. The quantitative study indicated that for the urban class, 58.2 km² remained unchanged, 70.3 km² was gained, and 16 km² was lost. For the bare land class, 586.4 km² remained unchanged, 53.2 km² was gained, and 101.5 km² was lost, while for the agricultural class, 20.2 km² remained unchanged, 31.2 km² was gained, and 37.2 km² was lost.
Keywords: land use, remote sensing, change detection, satellite images, image classification
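The post-classification comparison step described above can be reproduced, in outline, with a few lines of array code. The sketch below (class labels and array contents are tiny illustrative placeholders, not the Najran imagery) builds the from-to change matrix from two co-registered classified images:

```python
# Illustrative sketch of a post-classification "from-to" change matrix.
# The two classified rasters below are tiny placeholders, not the Najran data.
import numpy as np

classes = ["urban", "bare land", "agriculture"]  # class codes 0, 1, 2

classified_2009 = np.array([[0, 1, 1],
                            [2, 1, 0],
                            [1, 2, 2]])
classified_2014 = np.array([[0, 0, 1],
                            [2, 1, 0],
                            [0, 2, 1]])

n = len(classes)
change_matrix = np.zeros((n, n), dtype=int)
for before, after in zip(classified_2009.ravel(), classified_2014.ravel()):
    change_matrix[before, after] += 1  # rows: 2009 class, columns: 2014 class

print("from-to change matrix (pixel counts):")
print(change_matrix)

# Diagonal entries are "unchanged" pixels; off-diagonal row sums are area lost by a
# class and off-diagonal column sums are area gained. Multiplied by the pixel area,
# these give km^2 figures of the kind reported in the abstract.
unchanged = np.diag(change_matrix)
lost = change_matrix.sum(axis=1) - unchanged
gained = change_matrix.sum(axis=0) - unchanged
for i, name in enumerate(classes):
    print(f"{name}: unchanged={unchanged[i]}, gained={gained[i]}, lost={lost[i]}")
```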
Procedia PDF Downloads 523
6550 The Estimation Method of Inter-Story Drift for Buildings Based on Evolutionary Learning
Authors: Kyu Jin Kim, Byung Kwan Oh, Hyo Seon Park
Abstract:
Seismic response-based structural health monitoring systems have been used to reduce seismic damage. The inter-story drift ratio, which is the major index for seismic capacity assessment, is employed for estimating the seismic damage of buildings. Meanwhile, seismic response analysis to estimate the structural responses of a building demands a significantly high computational cost, which is compounded by the increasing number of high-rise and large buildings. To estimate the inter-story drift ratio of buildings under an earthquake efficiently, this paper suggests an estimation method of inter-story drift for buildings using an artificial neural network (ANN). In the method, a radial basis function neural network (RBFNN) is integrated with an optimization algorithm that tunes its variables through evolutionary learning, referred to as an evolutionary radial basis function neural network (ERBFNN). The estimation method estimates the inter-story drift without seismic response analysis when new earthquakes are applied to buildings. The effectiveness of the estimation method is verified through a simulation using a multi-degree-of-freedom system.
Keywords: structural health monitoring, inter-story drift ratio, artificial neural network, radial basis function neural network, genetic algorithm
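As a minimal sketch of the RBFNN regression at the core of such a method (the evolutionary tuning of centers and widths and the actual building response data are omitted; every value below is a synthetic placeholder), an inter-story drift estimator could look like this:

```python
# Minimal RBF-network regression sketch: ground-motion features -> inter-story drift ratio.
# Centers and widths are fixed here; in the ERBFNN of the abstract they would be tuned
# by an evolutionary (genetic) algorithm. All data below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.1, 1.0, size=(200, 3))   # placeholder features, e.g. PGA, PGV, duration
y = 0.004 * X[:, 0] + 0.002 * X[:, 1] ** 2 + 0.0005 * rng.normal(size=200)  # synthetic drift

def rbf_features(X, centers, width):
    # Gaussian radial basis functions evaluated for each sample against each center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

centers = X[rng.choice(len(X), size=20, replace=False)]   # fixed centers (would be evolved)
width = 0.3                                               # fixed width (would be evolved)

Phi = rbf_features(X, centers, width)
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)         # linear output layer

X_new = rng.uniform(0.1, 1.0, size=(5, 3))
drift_pred = rbf_features(X_new, centers, width) @ weights
print(drift_pred)   # estimated inter-story drift ratios without a response analysis
```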
Procedia PDF Downloads 327
6549 Compact Optical Sensors for Harsh Environments
Authors: Branislav Timotijevic, Yves Petremand, Markus Luetzelschwab, Dara Bayat, Laurent Aebi
Abstract:
Miniaturized optical sensors with remote readout are required for monitoring in harsh electromagnetic environments. As an example, in turbo and hydro generators, excessively high vibrations of the end-windings can lead to dramatic damage, imposing very high additional service costs. A significant change in the generator temperature can also be an indicator of system failure. Continuous monitoring of vibrations, temperature, humidity, and gases is therefore mandatory. The high electromagnetic fields in the generators impose the use of non-conductive devices in order to prevent electromagnetic interference and to electrically isolate the sensing element from the electronic readout. Metal-free sensors are good candidates for such systems since they are immune to very strong electromagnetic fields and are non-conductive. We have realized miniature optical accelerometer and temperature sensors for remote sensing of harsh environments using the common, inexpensive silicon Micro Electro-Mechanical System (MEMS) platform. Both devices show a highly linear response. The accelerometer has a deviation within 1% of the linear fit when tested in the range 0 – 40 g. The temperature sensor can provide a measurement accuracy better than 1 °C in the range 20 – 150 °C. The design of other types of sensors for environments with high electromagnetic interference is also discussed.
Keywords: optical MEMS, temperature sensor, accelerometer, remote sensing, harsh environment
Procedia PDF Downloads 367
6548 A Novel Algorithm for Parsing IFC Models
Authors: Raninder Kaur Dhillon, Mayur Jethwa, Hardeep Singh Rai
Abstract:
Information technology has made pivotal progress across disparate disciplines, one of which is the AEC (Architecture, Engineering and Construction) industry. CAD is a form of computer-aided building modeling that architects, engineers and contractors use to create and view two- and three-dimensional models. The AEC industry also uses building information modeling (BIM), a newer computerized modeling system that can create four-dimensional models; this software can greatly increase productivity in the AEC industry. BIM models generate open-source IFC (Industry Foundation Classes) files, which aim at interoperability for exchanging information throughout the project lifecycle among various disciplines. The methods developed in previous studies require either an IFC schema or an MVD together with software applications, such as an IFC model server or a Building Information Modeling (BIM) authoring tool, to extract a partial or complete IFC instance model. This paper proposes an efficient algorithm for extracting a partial or total model from an Industry Foundation Classes (IFC) instance model without an IFC schema or a complete IFC model view definition (MVD).
Procedia PDF Downloads 300
6547 Reliability Analysis of Steel Columns under Buckling Load in Second-Order Theory
Authors: Hamed Abshari, M. Reza Emami Azadi, Madjid Sadegh Azar
Abstract:
For studying the overall instability of members of steel structures, there are several methods in which overall buckling and geometrical imperfection effects are considered in the analysis. In the first section, these methods are compared and the ability of software to apply them is studied. Buckling loads determined from theoretical methods and from software are compared for 2D one-bay, one- and two-story steel frames. To consider actual conditions, the buckling loads of three steel frames with various dimensions are calculated and compared. Also, the uncertainties that exist in the loading and modeling of structures, such as geometrical imperfection, yield stress, and modulus of elasticity, and their effect on the buckling load of 2D framed steel structures have been studied. By applying these uncertainties in each reliability analysis procedure (first-order, second-order, and simulation methods of reliability), a reliability index is determined from each procedure. These values are studied and compared.
Keywords: buckling, second-order theory, reliability index, steel columns
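As a hedged illustration of how a reliability index can be obtained by the simulation route mentioned above (the distributions, section properties, and limit state below are simplified placeholders, not the frames analysed in the paper):

```python
# Monte Carlo sketch of a reliability index for a column buckling limit state
# g = Pcr - P, with Pcr depending on an uncertain modulus E and an imperfection factor.
# All distributions and numbers are illustrative placeholders only.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 1_000_000

E = rng.normal(210e9, 10e9, n)            # modulus of elasticity [Pa]
imperfection = rng.normal(1.0, 0.05, n)   # knock-down factor for geometric imperfection
P = rng.normal(2.0e6, 0.3e6, n)           # applied axial load [N]

I, L = 2.0e-5, 4.0                        # second moment of area [m^4], effective length [m]
Pcr = imperfection * (np.pi ** 2) * E * I / L ** 2   # Euler buckling load

g = Pcr - P                               # limit state: failure when g < 0
pf = np.mean(g < 0)                       # simulated probability of failure
beta_sim = -norm.ppf(pf)                  # reliability index from simulation
beta_fo = g.mean() / g.std()              # simple first-order (mean/std) estimate for comparison
print(f"Pf = {pf:.3e}, beta(simulation) = {beta_sim:.2f}, beta(first-order) = {beta_fo:.2f}")
```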
Procedia PDF Downloads 492
6546 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading
Authors: Robert Caulk
Abstract:
A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training data set and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g. TA-lib, pandas-ta). The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, Sklearn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI.
Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration
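A toy sketch of the indicator-seeded feature engineering and swappable regression back-end described above is shown below. It is not the FreqAI API itself; the dataframe columns, indicator choices, target definition, and the LightGBM regressor are illustrative stand-ins for the user-defined feature set and the interchangeable model library.

```python
# Toy sketch of indicator-based feature engineering plus a swappable regressor,
# in the spirit of the framework described above (not the actual FreqAI interface).
# Column names and the target definition are illustrative assumptions.
import numpy as np
import pandas as pd
import pandas_ta as ta
from lightgbm import LGBMRegressor

# Placeholder OHLCV-style frame; in practice this would come from the data-fetching layer.
rng = np.random.default_rng(2)
close = 100 + np.cumsum(rng.normal(0, 1, 1000))
df = pd.DataFrame({"close": close})

# Seed the feature set with common indicators (per-user feature engineering).
df["rsi"] = ta.rsi(df["close"], length=14)
df["ema_fast"] = ta.ema(df["close"], length=12)
df["ema_slow"] = ta.ema(df["close"], length=26)
df["target"] = df["close"].pct_change(12).shift(-12)   # short-term future return (assumption)

df = df.dropna()
features = ["rsi", "ema_fast", "ema_slow"]
split = int(len(df) * 0.8)
train, test = df.iloc[:split], df.iloc[split:]

# Any regression back-end could be plugged in here (CatBoost, sklearn, etc.).
model = LGBMRegressor(n_estimators=200, learning_rate=0.05)
model.fit(train[features], train["target"])
pred = model.predict(test[features])
print("prediction std vs. realized std:", pred.std(), test["target"].std())
```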
Procedia PDF Downloads 89
6545 Visual Intelligence: Perception, Image and Manipulation in Visual Communication
Authors: Poojitha Vemula
Abstract:
This study examines how we use image manipulation to communicate through an audience's perceptions and how visual intelligence is conceived. With the use of many software tools and high-end skills, designers have developed a third eye to combine two different visuals and create the desired image using Photoshop and other software. The purpose of visual intelligence is to convey a message to the targeted audience. For instance, the images of models are retouched on their skin to make them more convincing and draw attention from the audience. There are many ways of manipulating an image, such as double exposure, retouching with photographic inks or paint, airbrushing, piecing photos together, or enhancing the brightness and contrast. To understand visual intelligence, a questionnaire survey as well as research was conducted on how image manipulation is used by both the audience and the designers. This depends on the message that needs to be conveyed by the brands. For instance, Fair & Lovely, a brightening cream for ladies, uses a lot of retouching and effects to show the dramatic change the cream produces on dark or dusky faces. Thus the designer's role is to use their third eye to incorporate the message into visuals. The research and questionnaire survey conclude on the perceptions and manipulations used in visual communication. All of this serves to make communication between the designer and the audience effortless, using the skills of the designer and the features provided by the software. The objective of visual intelligence is to convey the message of the brands that advertise their products or services by using visuals created through software. Conveying a message through visual intelligence requires the audience's perception and understanding of the visuals created by the artists or designers. Visual intelligence determines how we use our technical skills to retouch and manipulate an image for a better understanding and to convey the message to the targeted audience. This also bridges the communication between the brand and the audience.
Keywords: graphic design, visual communication, convey messages, photoshop, image manipulation
Procedia PDF Downloads 218
6544 An Authentication Protocol for Quantum Enabled Mobile Devices
Authors: Natarajan Venkatachalam, Subrahmanya V. R. K. Rao, Vijay Karthikeyan Dhandapani, Swaminathan Saravanavel
Abstract:
Quantum communication technology is an evolving design that connects multiple quantum-enabled devices to the internet for secret communication or sensitive information exchange. In the future, the number of these compact quantum-enabled devices will increase immensely, making them an integral part of present communication systems. Therefore, the safety and security of such devices is also a major concern. To ensure that customer-sensitive information will not be eavesdropped on or deciphered, we need strong authentication and encryption mechanisms. In this paper, we propose a mutual authentication scheme between these smart quantum devices and a server based on the secure exchange of information through a quantum channel, which gives better solutions for symmetric key exchange issues. An important part of this work is to propose a secure mutual authentication protocol over the quantum channel. We show that our approach offers a robust authentication protocol and, further, that our solution is lightweight, scalable, and cost-effective with optimized computational processing overheads.
Keywords: quantum cryptography, quantum key distribution, wireless quantum communication, authentication protocol, quantum enabled device, trusted third party
Procedia PDF Downloads 174
6543 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion
Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro
Abstract:
In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual effort, and difficulties in meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes all the above-mentioned issues and helps organizations improve efficiency and deliver faster without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while reducing the need for repetitive work and manual effort. Implementing scalable CI/CD for development using cloud services like ECS (Elastic Container Service), AWS Fargate, ECR (to store Docker images with all dependencies), Serverless Computing (serverless virtual machines), Cloud Log (for monitoring errors and logs), Security Groups (for inside/outside access to the application), Docker Containerization (Docker-based images and container techniques), Jenkins (CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit) can efficiently handle the demands of diverse development environments and is capable of accommodating dynamic workloads, increasing efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application based on the branches, testing the application using a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure, as it scales based on need. Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application's scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Creating scalable automation testing using cloud services (ECR, ECS Fargate, Docker, EFS, Serverless Computing) helps organizations run more than 500 test cases in parallel, aiding in the detection of race conditions and performance issues and reducing execution time. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands, allowing teams to scale resources up or down as needed. It optimizes costs by paying only for the resources as they are used and increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.
Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment
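As a hedged sketch of the scalable automation-testing idea above (the cluster, task definition, subnet, security group, and container names are placeholders, not the authors' setup), test shards can be fanned out as Fargate tasks through the standard boto3 ECS API:

```python
# Sketch: fan test shards out as ECS Fargate tasks so suites run in parallel.
# Cluster, task-definition, subnet, security-group, and container names are placeholders.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

TEST_SHARDS = [f"shard-{i}" for i in range(50)]   # e.g. 50 shards covering 500+ test cases

def launch_test_shards(shards):
    # One Fargate task per shard keeps the example simple; each container reads its
    # shard name from the environment and runs only that slice of the test suite.
    for shard in shards:
        ecs.run_task(
            cluster="qa-automation-cluster",              # placeholder
            taskDefinition="automation-test-runner:3",    # placeholder
            launchType="FARGATE",
            count=1,
            networkConfiguration={
                "awsvpcConfiguration": {
                    "subnets": ["subnet-0123456789abcdef0"],        # placeholder
                    "securityGroups": ["sg-0123456789abcdef0"],     # placeholder
                    "assignPublicIp": "ENABLED",
                }
            },
            overrides={
                "containerOverrides": [{
                    "name": "test-runner",                # placeholder container name
                    "environment": [{"name": "TEST_SHARD", "value": shard}],
                }]
            },
        )

launch_test_shards(TEST_SHARDS)
print(f"Launched {len(TEST_SHARDS)} Fargate test tasks")
```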
Procedia PDF Downloads 43
6542 Quantifying the Methods of Monitoring Timers in Electric Water Heater for Grid Balancing on Demand-Side Management: A Systematic Mapping Review
Authors: Yamamah Abdulrazaq, Lahieb A. Abrahim, Samuel E. Davies, Iain Shewring
Abstract:
An electric water heater (EWH) is a high-power electrical appliance used in residential, commercial, and industrial settings, and the ability to control these appliances properly will result in cost savings and the prevention of blackouts on the national grid. This article discusses the usage of timers in EWH control strategies for demand-side management (DSM). To the authors' knowledge, no systematic mapping review focusing on the utilisation of EWH control strategies in DSM has yet been conducted. Consequently, the purpose of this research is to identify and examine the main papers exploring EWH procedures in DSM by quantifying and categorising information with regard to publication year and source, kind of methods, and source of data for monitoring control techniques. In order to answer the research questions, a total of 31 publications published between 1999 and 2023 were selected based on specific inclusion and exclusion criteria. The data indicate that direct load control (DLC) has been somewhat more prevalent than indirect load control (ILC). Additionally, mixed methods are much less common than the other techniques, and the proportion of real-time data (RTD) to non-real-time data (NRTD) is about equal.
Keywords: demand side management, direct load control, electric water heater, indirect load control, non real-time data, real-time data
Procedia PDF Downloads 82
6541 Privacy-Preserving Location Sharing System with Client/Server Architecture in Mobile Online Social Network
Authors: Xi Xiao, Chunhui Chen, Xinyu Liu, Guangwu Hu, Yong Jiang
Abstract:
Location sharing is a fundamental service in mobile Online Social Networks (mOSNs), and it has raised significant privacy concerns in recent years. Currently, most location-based service applications adopt a client/server architecture. In this paper, a location sharing system named CSLocShare is presented to provide flexible privacy-preserving location sharing with a client/server architecture in mOSNs. CSLocShare enables location sharing between both trusted social friends and untrusted strangers without a third-party server. In CSLocShare, the Location-Storing Social Network Server (LSSNS) provides location-based services but does not know the users' real locations. A thorough analysis indicates that the users' location privacy is protected, while storage and communication costs are saved. CSLocShare is more suitable and effective in practice.
Keywords: mobile online social networks, client/server architecture, location sharing, privacy-preserving
Procedia PDF Downloads 330
6540 Optimization of a Cone Loudspeaker Parameter of Design Parameters by Analysis of a Narrow Acoustic Sound Pathway
Authors: Yue Hu, Xilu Zhao, Takao Yamaguchi, Manabu Sasajima, Yoshio Koike, Akira Hara
Abstract:
This study attempted to optimize the design parameters of a cone loudspeaker unit as an example of highly flexible product design. We developed an acoustic analysis software program that considers the impact of damping caused by air viscosity. In sound reproduction, it is difficult to design each parameter of the loudspeaker individually. To overcome this limitation of the design problem in practice, this paper proposes a new acoustic analysis algorithm to optimize the design parameters of the loudspeaker. The material characteristics of the cone paper and the loudspeaker edge were the design parameters, and the vibration displacement of the cone paper was the objective function. The results of the analysis were compared with the predicted values and showed high accuracy. These results suggest that, though parameter design is difficult by experience and intuition alone, it can be performed comparatively easily using optimization with the developed acoustic analysis software.
Keywords: air viscosity, loudspeaker, cone paper, edge, optimization
Procedia PDF Downloads 401
6539 A Comparative Study of the Challenges of E-Learning in Nigerian Universities
Authors: J. N. Anene, A. A. Bello, C. C. Anene
Abstract:
The paper carried out a comparative study of the challenges of e-learning in Nigerian universities. The purpose of the study was to determine whether there was a significant difference in the challenges faced by students in e-learning in Nigerian universities. A total of two hundred and twenty-eight students from nine universities constituted the sample for the study. A simple random sampling technique was employed in selecting thirty-two students from each university in the six geo-political zones of Nigeria. A questionnaire based on 'yes or no' responses and column charts constituted the instruments employed in the study. Percentages were used to analyse the 'yes or no' responses, while column charts were used to compare the responses of the students. The findings of the study revealed that the majority of students in all the universities under study claimed that their universities lacked appropriate software, that good quality educational content online was lacking, that sustainability of e-learning was not prioritized, that they had no access to appropriate content for ICT-enhanced learning and training, and that they had access to affordable and reliable computers. For lecturers, computer certification should be first on the list of promotion requirements. The findings also revealed that students from seven out of nine universities confirmed that their universities lacked appropriate software, whereas the other two claimed that they had appropriate software. Also, out of nine universities, two disagreed that good quality educational content online was lacking, whereas seven agreed that it was. The findings further revealed that most of the respondents in almost all the universities under study agreed that sustainability of e-learning was not prioritized. The study recommended, among other things, that the Nigerian Government should make a concerted effort to enable all lecturers and students to become computer literate. This should be done within a set time frame, and at the end of the computer course, certificates should be issued; no student should graduate in his or her field of study without passing the computer course.
Keywords: e-learning, developing countries, computer literacy, ICT
Procedia PDF Downloads 336
6538 Road Condition Monitoring Using Built-in Vehicle Technology Data, Drones, and Deep Learning
Authors: Judith Mwakalonge, Geophrey Mbatta, Saidi Siuhi, Gurcan Comert, Cuthbert Ruseruka
Abstract:
Transportation agencies worldwide continuously monitor the condition of their roads to minimize road maintenance costs and maintain public safety and ride quality. Existing methods for carrying out road condition surveys involve manual observation of roads using standard survey forms completed by qualified road condition surveyors or engineers, either on foot or by vehicle. Automated road condition survey vehicles exist; however, they are very expensive since they require special vehicles equipped with sensors for data collection together with data processing and computing devices. The manual methods are expensive, time-consuming, infrequent, and can hardly provide real-time information on road conditions. This study contributes to this arena by utilizing built-in vehicle technologies, drones, and deep learning to automate road condition surveys while using low-cost technology. A single model is trained to capture flexible pavement distresses (potholes, rutting, cracking, and raveling), thereby providing a more cost-effective and efficient road condition monitoring approach that can also provide real-time road conditions. Additionally, data fusion is employed to enhance the road condition assessment with data from vehicles and drones.
Keywords: road conditions, built-in vehicle technology, deep learning, drones
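As an illustrative sketch only (not the authors' model, dataset, or training setup), a single detection model covering the four distress types could be set up by fine-tuning a standard pretrained object detector; the class list mirrors the abstract, while everything else is an assumption.

```python
# Sketch: one detection model for four pavement distress classes, fine-tuned from a
# standard pretrained detector. Dataset wiring and the training loop are omitted;
# only the class list comes from the abstract, the rest is an illustrative assumption.
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

CLASSES = ["background", "pothole", "rutting", "cracking", "raveling"]

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=len(CLASSES))

# Training would then iterate over annotated images from vehicle and drone sources:
#   losses = model(images, targets)   # dict of classification/box-regression losses
#   sum(losses.values()).backward()
print(model.roi_heads.box_predictor)
```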
Procedia PDF Downloads 124
6537 Numerical Simulation of the Bond Behavior Between Concrete and Steel Reinforcing Bars in Specialty Concrete
Authors: Camille A. Issa, Omar Masri
Abstract:
In this study, the commercial finite element software Abaqus was used to develop a three-dimensional nonlinear finite element model capable of simulating the pull-out test of reinforcing bars from underwater concrete. The results of thirty-two pull-out tests with different parameters were implemented in the software to study the effects of the concrete cover, the bar size, the use of stirrups, and the compressive strength of the concrete. The interaction properties used in the model provided accurate results in comparison with the experimental bond-slip results; thus, the model successfully simulated the pull-out test. The results of the finite element model are used to better understand and visualize the distribution of stresses in each component of the model and to study the effect of the various parameters used in this study, including the role of the stirrups in preventing the stress from reaching the sides of the specimens.
Keywords: pull-out test, bond strength, underwater concrete, nonlinear finite element analysis, abaqus
Procedia PDF Downloads 442
6536 Occupational Safety in Construction Projects
Authors: Heba Elbibas, Esra Gnijeewa, Zedan Hatush
Abstract:
This paper presents research on occupational safety in construction projects, in which the importance of safety management in projects was studied, including the preparation of a safety plan and program for each project and the identification of the responsibilities of each party to the contract. The research consists of two parts: (1) field visits to three construction projects, including building projects, road projects, and tower installation, in which the safety level of these projects was evaluated through a checklist covering the most important safety elements and their application in the projects; and (2) preparation of a questionnaire, which included supervisors and engineers and aimed to determine the level of awareness and commitment of different project categories to safety standards. The results showed the following: i) There is a moderate occupational safety policy. ii) The preparation and storage of maintenance reports are not fully complied with. iii) There is a moderate level of training on occupational safety for project workers. iv) The company does not permanently impose penalties on safety violators. v) There is a moderate policy for equipment and machinery safety. vi) Self-injuries occur due to fatigue, lack of attention, deliberate error, and emotional factors, at a rate of 82.4%.
Keywords: management, safety, occupational safety, classification
Procedia PDF Downloads 106
6535 Optimal Cropping Pattern in an Irrigation Project: A Hybrid Model of Artificial Neural Network and Modified Simplex Algorithm
Authors: Safayat Ali Shaikh
Abstract:
Software has been developed for determining the optimal cropping pattern in an irrigation project, considering the land constraint, the water availability constraint, and the pick-up flow constraint, using a modified simplex algorithm. Artificial neural network (ANN) models have been developed to predict rainfall, and an AR(1) model was used to generate 1000 years of rainfall data to train the ANN. Simulation has been done with the expected rainfall data. Eight crops and three soil classes have been considered for the optimization model. The area under each crop and each soil class has been quantified using the modified simplex algorithm to obtain the optimum net return. The efficacy of the software has been tested using data from a large irrigation project in India.
Keywords: artificial neural network, large irrigation project, modified simplex algorithm, optimal cropping pattern
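A stripped-down version of the allocation problem described above can be written as a linear program. The crop returns, water duties, and resource limits below are placeholders, and scipy's linprog stands in for the modified simplex routine implemented in the software:

```python
# Linear-programming sketch of the cropping-pattern allocation (placeholder data).
# scipy's linprog is used here in place of the paper's modified simplex implementation.
import numpy as np
from scipy.optimize import linprog

crops = ["rice", "wheat", "maize", "pulses"]
net_return = np.array([900.0, 700.0, 600.0, 500.0])   # return per hectare (placeholder)
water_use = np.array([12.0, 5.0, 6.0, 4.0])           # water demand per hectare (placeholder)

land_available = 1000.0        # hectares (placeholder)
water_available = 7000.0       # water units (placeholder)

# Maximize sum(return_i * area_i)  ->  minimize its negative.
c = -net_return
A_ub = np.vstack([np.ones(len(crops)), water_use])    # land and water constraints
b_ub = np.array([land_available, water_available])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * len(crops), method="highs")
for crop, area in zip(crops, res.x):
    print(f"{crop}: {area:.1f} ha")
print(f"optimum net return: {-res.fun:.0f}")
```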
Procedia PDF Downloads 203
6534 Application of Federated Learning in the Health Care Sector for Malware Detection and Mitigation Using Software-Defined Networking Approach
Authors: A. Dinelka Panagoda, Bathiya Bandara, Chamod Wijetunga, Chathura Malinda, Lakmal Rupasinghe, Chethana Liyanapathirana
Abstract:
This research builds on the concepts of federated learning and software-defined networking (SDN) to introduce an efficient malware detection technique and provide a mitigation mechanism, giving rise to a resilient and automated healthcare-sector network system with the added feature of extended privacy preservation. Due to the daily emergence of new malware attacks on hospital Integrated Clinical Environments (ICEs), the healthcare industry can never be certain of its continuity. The blind spots created by the array of indispensable opportunities that new medical device inventions and their connected coordination offer daily, a factor that should be a driving focus, are not yet entirely understood by most healthcare operators and patients. This solution involves four clients in the form of hospital networks, with different geographical participation, to build up the federated learning experimental architecture and reach the most reasonable accuracy rate with privacy preservation. While logistic regression with cross-entropy loss performs the detection, SDN comes in handy in the second half of the research to extend the initial development phases of the system with malware mitigation based on policy implementation. The overall evaluation sums up with a system that proves its accuracy with added privacy. There is no longer a need to continue with traditional centralized systems that offer almost everything except privacy.
Keywords: software-defined network, federated learning, privacy, integrated clinical environment, decentralized learning, malware detection, malware mitigation
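A minimal sketch of federated averaging of a cross-entropy-trained logistic regression model across four hospital clients is given below. The feature dimension, number of rounds, and synthetic data are assumptions, and the SDN mitigation layer is out of scope here.

```python
# Minimal FedAvg sketch: four hospital clients train a logistic-regression malware
# detector locally (binary cross-entropy loss) and a server averages their weights.
# Data are synthetic placeholders; the SDN-based mitigation step is not shown.
import numpy as np

rng = np.random.default_rng(3)
N_CLIENTS, N_FEATURES, ROUNDS, LR, LOCAL_STEPS = 4, 10, 20, 0.1, 25

def make_client_data(n=200):
    X = rng.normal(size=(n, N_FEATURES))
    true_w = np.linspace(-1, 1, N_FEATURES)
    y = (1 / (1 + np.exp(-X @ true_w)) > 0.5).astype(float)   # synthetic malware labels
    return X, y

clients = [make_client_data() for _ in range(N_CLIENTS)]

def local_update(w, X, y):
    # A few steps of gradient descent on the binary cross-entropy loss.
    for _ in range(LOCAL_STEPS):
        p = 1 / (1 + np.exp(-X @ w))
        grad = X.T @ (p - y) / len(y)
        w = w - LR * grad
    return w

global_w = np.zeros(N_FEATURES)
for _ in range(ROUNDS):
    local_weights = [local_update(global_w.copy(), X, y) for X, y in clients]
    global_w = np.mean(local_weights, axis=0)   # federated averaging; raw data stays on each client

X_test, y_test = make_client_data(500)
acc = np.mean(((1 / (1 + np.exp(-X_test @ global_w))) > 0.5) == y_test)
print(f"global model accuracy: {acc:.2f}")
```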
Procedia PDF Downloads 187
6533 A Relational Data Base for Radiation Therapy
Authors: Raffaele Danilo Esposito, Domingo Planes Meseguer, Maria Del Pilar Dorado Rodriguez
Abstract:
As far as we know, there is still no commercial solution available that allows managing, openly and configurably according to user needs, the huge amount of data generated in a modern radiation oncology department. Currently available information management systems are mainly focused on record-and-verify and clinical data, and only to a small extent on physical data, which results in a partial and limited use of the actually available information. In the present work we describe the implementation at our department of a centralized information management system based on a web server. Our system manages both the information generated during patient planning and treatment and information of general interest for the whole department (e.g., treatment protocols, quality assurance protocols, etc.). Our objective is to be able to analyze all the available data in a simple and efficient way and thus to obtain quantitative evaluations of our treatments. This would allow us to improve our workflow and protocols. To this end, we have implemented a relational database that allows us to use all the available information in a practical and efficient way. As always, we use only license-free software.
Keywords: information management system, radiation oncology, medical physics, free software
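A tiny sketch of the kind of relational layout described above is given below, using SQLite through Python's standard library so that everything stays license-free. The table and column names are illustrative assumptions, not the department's actual schema.

```python
# Sketch of a minimal relational schema for planning/treatment data plus department-wide
# data, using only license-free tools (Python's built-in sqlite3). Table and column
# names are illustrative assumptions, not the actual departmental schema.
import sqlite3

conn = sqlite3.connect("radiotherapy_demo.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS patient (
    patient_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS plan (
    plan_id            INTEGER PRIMARY KEY,
    patient_id         INTEGER NOT NULL REFERENCES patient(patient_id),
    protocol_name      TEXT,
    prescribed_dose_gy REAL,
    n_fractions        INTEGER
);
CREATE TABLE IF NOT EXISTS treatment_session (
    session_id        INTEGER PRIMARY KEY,
    plan_id           INTEGER NOT NULL REFERENCES plan(plan_id),
    session_date      TEXT,
    delivered_dose_gy REAL
);
CREATE TABLE IF NOT EXISTS qa_protocol (
    qa_id       INTEGER PRIMARY KEY,
    description TEXT,
    frequency   TEXT
);
""")

# Example of the kind of quantitative query a centralized system enables:
query = """
SELECT p.protocol_name, COUNT(s.session_id) AS sessions, SUM(s.delivered_dose_gy) AS total_dose
FROM plan p LEFT JOIN treatment_session s ON s.plan_id = p.plan_id
GROUP BY p.protocol_name;
"""
print(conn.execute(query).fetchall())
conn.close()
```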
Procedia PDF Downloads 241
6532 Modelling and Simulation of Bioethanol Production from Food Waste Using CHEMCAD Software
Authors: Kgomotso Matobole, Noluzuko Monakali, Hilary Rutto, Tumisang Seodigeng
Abstract:
On a global scale, an alarming amount of food waste is generated across the food supply chain. Worldwide urbanization, as well as global economic growth, have contributed to the amount of food waste the environment is receiving. Food waste normally ends up on illegal dumping sites when not properly disposed of, or is disposed of in landfills. This results in environmental pollution due to inadequate waste management practices. Food waste is rich in organic matter and highly biodegradable; hence, it can be utilized for the production of bioethanol, a type of biofuel. In so doing, alternative energy is created and the volume of food waste is reduced in the process, so food waste comes to be seen as a precious commodity for energy generation instead of a pollutant. The main aim of the project was to simulate a biorefinery using the CHEMCAD 7.12 software. The resulting purity of the ethanol from the simulation was 98.9%, with a feed ratio of 1:2 for food waste to water. This was achieved by integrating the necessary unit operations and optimising their operating conditions.
Keywords: fermentation, bioethanol, food waste, hydrolysis, simulation, modelling
Procedia PDF Downloads 376
6531 Knowledge Based Liability for ISPs' Copyright and Trademark Infringement in the EU E-Commerce Directive: Two Steps Behind the Philosophy of Computing Mind
Authors: Mohammad Sadeghi
Abstract:
The subject matter of this article is the efficiency of the current knowledge standard in providing legal integration regarding the criteria and approaches to ISP knowledge standards, in order to shield ISPs' and copyright, trademark, and other parties' rights in the online information society. The EU recognizes knowledge-based liability for intermediaries in the European Directive on Electronic Commerce, but the implication that all parties share responsibility for combating infringement has been sacrificed to the dominant attention given to liability, due to the lack of an appropriate legal mechanism for assigning each party its responsibility. Moreover, there is a legal challenge regarding the applicability of knowledge-based liability to hosting services and information location tool services. The aim of this contribution is to discuss the advantages and disadvantages of the ECD knowledge standard through case law, with a special emphasis on the duty of prevention and the role of constructive knowledge for internet service providers (ISPs), in order to achieve a fair balance between all parties' rights.
Keywords: internet service providers, liability, copyright infringement, hosting, caching, mere conduit service, notice and takedown, E-commerce Directive
Procedia PDF Downloads 524
6530 Detection of Abnormal Process Behavior in Copper Solvent Extraction by Principal Component Analysis
Authors: Kirill Filianin, Satu-Pia Reinikainen, Tuomo Sainio
Abstract:
Frequent measurements of product stream quality create a data overload that becomes more and more difficult to handle. In the current study, plant history data with multiple variables were successfully treated by principal component analysis to detect abnormal process behavior, particularly in copper solvent extraction. The multivariate model is based on the concentration levels of the main process metals recorded by the industrial on-stream x-ray fluorescence analyzer. After mean-centering and normalization of the concentration data set, a two-dimensional multivariate model was constructed using the principal component analysis algorithm. Normal operating conditions were defined through control limits assigned to squared score values on the x-axis and to residual values on the y-axis. 80 percent of the data set was taken as the training set, and the multivariate model was tested with the remaining 20 percent of the data. Model testing showed the successful application of control limits to detect abnormal behavior of the copper solvent extraction process as early warnings. Compared to the conventional technique of analyzing one variable at a time, the proposed model allows a process failure to be detected on-line using information from all process variables simultaneously. Complex industrial equipment combined with advanced mathematical tools may be used for on-line monitoring of both the composition of process streams and final product quality. Defining the normal operating conditions of the process supports reliable decision making in a process control room. Thus, industrial x-ray fluorescence analyzers equipped with an integrated data processing toolbox allow more flexibility in copper plant operation. The additional multivariate process control and monitoring procedures are recommended to be applied separately for the major components and for the impurities. Principal component analysis may be utilized not only for control of the major elements' content in process streams but also for continuous monitoring of the plant feed. The proposed approach has potential in on-line instrumentation, providing a fast, robust, and cheap application with automation capabilities.
Keywords: abnormal process behavior, failure detection, principal component analysis, solvent extraction
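A compact sketch of the monitoring scheme described above (squared scores on one axis, residuals on the other, limits derived from an 80/20 split) is given below. The synthetic concentration data and the simple percentile-based limits are simplifying assumptions, not the plant data or the exact limit formulas used in the study.

```python
# Sketch of PCA-based abnormality detection: mean-center/scale the concentration data,
# fit a 2-component model on ~80% of it, and flag samples whose squared-score (T^2-like)
# or residual (Q) statistics exceed limits set on the training data.
# The "metal concentration" data below are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
normal = rng.multivariate_normal([20, 5, 1.5],
                                 [[4, 1.5, 0.2], [1.5, 1, 0.1], [0.2, 0.1, 0.05]], 500)
faulty = normal[-20:] + np.array([8, -2, 1])         # drifted samples standing in for an upset
data = np.vstack([normal[:-20], faulty])

train, test = data[:400], data[400:]                  # ~80/20 split, as in the study

scaler = StandardScaler().fit(train)
pca = PCA(n_components=2).fit(scaler.transform(train))

def statistics(X):
    Z = scaler.transform(X)
    scores = pca.transform(Z)
    t2 = np.sum(scores ** 2 / pca.explained_variance_, axis=1)   # squared-score statistic
    residual = Z - pca.inverse_transform(scores)
    q = np.sum(residual ** 2, axis=1)                            # residual (Q) statistic
    return t2, q

t2_train, q_train = statistics(train)
t2_lim, q_lim = np.percentile(t2_train, 99), np.percentile(q_train, 99)  # empirical limits

t2_test, q_test = statistics(test)
alarms = (t2_test > t2_lim) | (q_test > q_lim)
print(f"flagged {alarms.sum()} of {len(test)} test samples as abnormal")
```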
Procedia PDF Downloads 309
6529 Tunnel Convergence Monitoring by Distributed Fiber Optics Embedded into Concrete
Authors: R. Farhoud, G. Hermand, S. Delepine-lesoille
Abstract:
The future underground facility for French radioactive waste disposal, named Cigeo, is designed to store intermediate-level and high-level long-lived French radioactive waste. Intermediate-level waste cells are tunnel-like, about 400 m in length and 65 m² in cross-section, equipped with several concrete layers, which can be grouted in situ or composed of pre-grouted tunnel elements. The operating space inside the cells, needed for placing or removing waste containers, should be monitored for several decades without any maintenance. To provide the required information, a design was developed and tested in situ in Andra's underground laboratory (URL), 500 m below the surface. Based on distributed optical fiber sensors (OFS), using Brillouin backscattering for strain interrogation and Raman backscattering for temperature interrogation, the design consists of two loops of OFS at two different radii around the monitored section (orthoradial strains) and along the tunnel (longitudinal strains). The strains measured by the distributed OFS cables were compared to classical vibrating wire extensometers (VWE) and platinum probes (Pt). The OFS installation was composed of two cables sensitive to both strain and temperature and one cable sensitive only to temperature. All cables were connected, between the sensitive part and the instruments, to hybrid cables to reduce cost. The connections were made using two techniques: splicing the fibers in situ after installation, or preparing each fiber with a connector and simply plugging them together in situ. Another challenge was installing the OFS cables without interruption along a tunnel made of several segments. The first success consists of the survival rate of the sensors after installation and the quality of the measurements: 100% of the OFS cables intended for long-term monitoring survived installation. A few new configurations were tested with relative success. The measurements obtained were very promising. Indeed, after three years of data, no difference was observed between the OFS cables and connection methods, and the strains fit well with the VWE and Pt probes placed at the same locations. The data from the Brillouin instrument, which is sensitive to both strain and temperature, were compensated with data provided by the Raman instrument, which is sensitive only to temperature and uses a separate fiber. These results provide confidence in the next steps of the qualification process, which consist of testing several data treatment approaches for direct analyses.
Keywords: monitoring, fiber optic, sensor, data treatment
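The strain-temperature compensation mentioned above follows the usual form for Brillouin-based sensing; the calibration coefficients are fiber- and instrument-specific, so the expression below is a generic sketch rather than the values used at the URL:

```latex
% Generic Brillouin strain/temperature compensation: the measured Brillouin frequency
% shift responds to both strain and temperature, so the Raman temperature reading is
% used to isolate the mechanical strain. C_epsilon and C_T are calibration coefficients.
\Delta\nu_B = C_\varepsilon\,\varepsilon + C_T\,\Delta T
\quad\Longrightarrow\quad
\varepsilon = \frac{\Delta\nu_B - C_T\,\Delta T_{\mathrm{Raman}}}{C_\varepsilon}
```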
Procedia PDF Downloads 128