Search results for: raw complex data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28338

25608 Synthesis and Characterization of Mixed Ligand Complexes of Bipyridyl and Glycine with Different Counter Anions as Functional Antioxidant Enzyme Mimics

Authors: Mohamed M. Ibrahim, Gaber A. M. Mersal, Salih Al-Juaid, Samir A. El-Shazly

Abstract:

A series of mixed ligand complexes, viz. [Cu(BPy)(Gly)X]Y {X = Cl (1), Y = 0; X = 0, Y = ClO4- (2); X = H2O, Y = NO3- (3); X = H2O, Y = CH3COO- (4)} and [Cu(BPy)(Gly)(H2O)]2(SO4) (5), have been synthesized. Their structures and properties were characterized by elemental analysis, thermal analysis, IR, UV–vis, and ESR spectroscopy, as well as by electrochemical measurements including cyclic voltammetry, electrical molar conductivity, and magnetic moment measurements. Complexes 1 and 2 form slightly distorted square-pyramidal coordination geometries of CuN3OCl and CuN3O2, respectively, in which the N,O-donor glycine and the N,N-donor bipyridyl bind at the basal plane with a chloride ion or water as the axial ligand. Complex 3 shows a square-planar CuN3O coordination geometry, which exhibits chemically significant hydrogen-bonding interactions in addition to coordination-polymer formation. The superoxide dismutase and catalase-like activities of all complexes were tested; the complexes were found to be promising candidates as durable electron-transfer catalysts, approaching the efficiency of the enzymes they mimic (displaying either catalase or tyrosinase activity) and thus suited to complete reactive oxygen species (ROS) detoxification, with respect both to superoxide radicals and to the related peroxides. The DNA-binding interaction with supercoiled pGEM-T plasmid DNA was investigated using spectral (absorption and emission) titration and electrochemical techniques. The results revealed that complexes 1 and 2 bind DNA through the groove-binding mode. The calculated intrinsic binding constants (Kb) of 1 and 2 were 4.71 × 10⁵ and 2.429 × 10⁵ M−1, respectively. A gel electrophoresis study reveals that both complexes cleave supercoiled pGEM-T plasmid DNA to nicked and linear forms in the absence of any additives. Moreover, upon interaction of both complexes with DNA, the quasi-reversible CuII/CuI redox couple slightly improves its reversibility, with a considerable decrease in current intensity. All the experimental results indicate that the bipyridyl mixed copper(II) complex 1 intercalates more effectively into the DNA base pairs.

Keywords: enzyme mimics, mixed ligand complexes, X-ray structures, antioxidant, DNA-binding, DNA cleavage

Procedia PDF Downloads 541
25607 Berry Phase and Quantum Skyrmions: A Loop Tour in Physics

Authors: Sinuhé Perea Puente

Abstract:

In several physics systems, the whole can be obtained as an exact copy of each of its parts, which facilitates the study of a complex system by looking carefully at its elements separately. Reductionism offers simplified models which make problems easier, but "there's plenty of room... at the mesoscopic scale". Here we present a tour through two of its representatives: the Berry phase and skyrmions, studying some of their basic definitions and properties, and two cases in which both arise together, finishing by constraining the scale of our mesoscopic system in the quest for quantum skyrmions and discovering which properties are conserved and which may be destroyed.

Keywords: condensed matter, quantum physics, skyrmions, topological defects

Procedia PDF Downloads 132
25606 A Human Activity Recognition System Based on Sensory Data Related to Object Usage

Authors: M. Abdullah Al-Wadud

Abstract:

Sensor-based activity recognition systems usually account for which sensors have been activated while an activity is performed. The system then combines the conditional probabilities of those sensors to represent different activities and makes its decision on that basis. However, information about the sensors which are not activated may also be of great help in deciding which activity has been performed. This paper proposes an approach where sensory data related to both usage and non-usage of objects are utilized to classify activities. Experimental results also show the promising performance of the proposed method.
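
The decision rule the abstract describes can be sketched as a Naïve Bayes score in which a non-activated sensor contributes the complement probability. A minimal sketch with invented activities, sensors, and probabilities (all hypothetical, not the paper's data):

```python
import math

# P(sensor activated | activity), estimated from training data (invented numbers)
p_active = {
    "make_tea":   {"kettle": 0.9, "cup": 0.8, "fridge": 0.2},
    "make_toast": {"kettle": 0.1, "cup": 0.3, "fridge": 0.4},
}
prior = {"make_tea": 0.5, "make_toast": 0.5}

def classify(observed):
    """observed maps each sensor to True (activated) or False (not activated)."""
    best, best_score = None, -math.inf
    for activity, probs in p_active.items():
        score = math.log(prior[activity])
        for sensor, activated in observed.items():
            p = probs[sensor]
            # Non-usage contributes (1 - p): the core of the proposed approach.
            score += math.log(p if activated else 1.0 - p)
        if score > best_score:
            best, best_score = activity, score
    return best

print(classify({"kettle": True, "cup": True, "fridge": False}))  # -> make_tea
```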

Keywords: Naïve Bayesian-based classification, activity recognition, sensor data, object-usage model

Procedia PDF Downloads 318
25605 Application of Post-Stack and Pre-Stack Seismic Inversion for Prediction of Hydrocarbon Reservoirs in a Persian Gulf Gas Field

Authors: Nastaran Moosavi, Mohammad Mokhtari

Abstract:

Seismic inversion is a technique which has been in use for years; its main goal is to estimate and model the physical characteristics of rocks and fluids. Generally, it combines seismic and well-log data. Seismic inversion can be carried out through different methods; we have conducted and compared post-stack and pre-stack seismic inversion methods on real data from one of the fields in the Persian Gulf. Pre-stack seismic inversion can transform seismic data into rock-physics properties such as P-impedance, S-impedance, and density, while post-stack seismic inversion can estimate only P-impedance. These parameters can then be used in reservoir identification. Based on the results of inverting the seismic data, a gas reservoir was detected in one of the hydrocarbon fields in the south of Iran (Persian Gulf). By comparing post-stack and pre-stack seismic inversion, it can be concluded that pre-stack seismic inversion provides more reliable and detailed information for the identification and prediction of hydrocarbon reservoirs.
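
The post-stack branch rests on the standard reflectivity-impedance relation r_i = (Z_{i+1} - Z_i)/(Z_{i+1} + Z_i), which can be inverted recursively. A minimal sketch of that step only, with toy numbers (a real workflow also needs wavelet removal and a low-frequency model from well logs):

```python
import numpy as np

def impedance_from_reflectivity(r, z0):
    """Recover acoustic impedance from a reflectivity series and a starting value."""
    z = np.empty(len(r) + 1)
    z[0] = z0
    for i, ri in enumerate(r):
        # invert r_i = (z[i+1] - z[i]) / (z[i+1] + z[i])
        z[i + 1] = z[i] * (1.0 + ri) / (1.0 - ri)
    return z

r = np.array([0.05, -0.02, 0.10])                 # toy reflectivity series
print(impedance_from_reflectivity(r, z0=4500.0))  # P-impedance trace, toy units
```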

Keywords: density, P-impedance, S-impedance, post-stack seismic inversion, pre-stack seismic inversion

Procedia PDF Downloads 317
25604 A Data-Driven Monitoring Technique Using Combined Anomaly Detectors

Authors: Fouzi Harrou, Ying Sun, Sofiane Khadraoui

Abstract:

Anomaly detection based on Principal Component Analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables. Monitoring metrics such as Hotelling's T2 and the Q statistic are usually used in PCA-based monitoring to elucidate pattern variations in the principal and residual subspaces, respectively. However, these metrics are ill-suited to detect small faults. In this paper, Exponentially Weighted Moving Average (EWMA) schemes based on the Q and T2 statistics, T2-EWMA and Q-EWMA, were developed for detecting faults in the process mean. The performance of the proposed methods was compared with that of the conventional PCA-based fault detection method using synthetic data. The results clearly show the benefit and effectiveness of the proposed methods over the conventional PCA method, especially for detecting small faults in highly correlated multivariate data.
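
The scheme can be illustrated as follows on synthetic data with illustrative parameters (control limits themselves are omitted): PCA is fitted on normal-operation data, T2 and Q are computed for new samples, and an EWMA filter accentuates small, persistent shifts.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 6))          # normal operating data (synthetic)
pca = PCA(n_components=3).fit(X_train)
lam = pca.explained_variance_                # eigenvalues of retained PCs

def t2_q(x):
    t = pca.transform(x.reshape(1, -1))[0]
    t2 = np.sum(t**2 / lam)                  # Hotelling's T2 (principal subspace)
    resid = x - pca.inverse_transform(t.reshape(1, -1))[0]
    q = np.sum(resid**2)                     # Q statistic / SPE (residual subspace)
    return t2, q

def ewma(values, weight=0.2):
    z, out = 0.0, []
    for v in values:
        z = weight * v + (1 - weight) * z    # exponentially weighted average
        out.append(z)
    return out

X_new = rng.normal(size=(100, 6))
X_new[50:] += 0.5                            # small mean shift to be detected
q_ewma = ewma([t2_q(x)[1] for x in X_new])
print(round(q_ewma[40], 2), round(q_ewma[90], 2))  # the shift inflates the EWMA of Q
```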

Keywords: data-driven method, process control, anomaly detection, dimensionality reduction

Procedia PDF Downloads 294
25603 An Investigation of E-Government by Using GIS and Establishing E-Government in Developing Countries Case Study: Iraq

Authors: Ahmed M. Jamel

Abstract:

Electronic government initiatives and public participation in them are among the indicators of today's development criteria for countries. After two consecutive wars, Iraq's current position in, for example, the UN e-government ranking is quite concerning, and it has not improved in recent years either. In preparing this work, we were motivated by the fact that handling geographic data on public facilities and resources is needed in most e-government projects. Geographical information systems (GIS) provide the most common tools not only to manage spatial data but also to integrate such data with the non-spatial attributes of features. With this background, this paper proposes that establishing a working GIS in the health sector of Iraq would improve e-government applications. As the case study, hospital locations in Erbil are investigated.

Keywords: e-government, GIS, Iraq, Erbil

Procedia PDF Downloads 382
25602 Evaluation of Classification Algorithms for Diagnosis of Asthma in Iranian Patients

Authors: Taha SamadSoltani, Peyman Rezaei Hachesu, Marjan GhaziSaeedi, Maryam Zolnoori

Abstract:

Introduction: Data mining is defined as the process of finding patterns and relationships in database records in order to build predictive models. The application of data mining has extended into vast sectors such as healthcare services. Medical data mining aims to solve real-world problems in the diagnosis and treatment of diseases, applying various techniques and algorithms which differ in accuracy and precision. The purpose of this study was to apply knowledge discovery and data mining techniques to the diagnosis of asthma based on patient symptoms and history. Method: Data mining includes several steps and decisions to be made by the user. It starts with creating an understanding of the scope and application of previous knowledge in the area and identifying the knowledge-discovery process from the stakeholders' point of view, and it finishes with acting on the discovered knowledge: applying it, integrating it with other systems, and documenting and reporting it. In this study, a stepwise methodology was followed to achieve a logical outcome. Results: The sensitivity, specificity, and accuracy of the KNN, SVM, Naïve Bayes, NN, classification tree, and CN2 algorithms, together with related similar studies, were evaluated, and ROC curves were plotted to show the performance of the system. Conclusion: The results show that asthma can be diagnosed accurately, approximately ninety percent of the time, based on demographic and clinical data. The study also showed that methods based on pattern discovery and data mining have a higher sensitivity than expert and knowledge-based systems. On the other hand, medical guidelines and evidence-based medicine should remain the basis of diagnostic methods; it is therefore recommended that machine learning algorithms be used in combination with knowledge-based algorithms.
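
A minimal sketch of the kind of comparison reported, on synthetic data rather than the study's asthma records: several classifiers are trained and their sensitivity and specificity are read off the confusion matrix.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import confusion_matrix

X, y = make_classification(n_samples=600, n_features=12, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

for name, clf in [("KNN", KNeighborsClassifier()),
                  ("SVM", SVC()),
                  ("NaiveBayes", GaussianNB())]:
    y_hat = clf.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_hat).ravel()
    sens = tp / (tp + fn)                    # sensitivity (recall)
    spec = tn / (tn + fp)                    # specificity
    print(f"{name}: sensitivity={sens:.2f} specificity={spec:.2f}")
```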

Keywords: asthma, data mining, classification, machine learning

Procedia PDF Downloads 443
25601 Decision Support System in Air Pollution Using Data Mining

Authors: E. Fathallahi Aghdam, V. Hosseini

Abstract:

Environmental pollution is not limited to a specific region or country; that is why sustainable development, as a necessary process for improvement, pays attention to issues such as the destruction of natural resources, the degradation of biological systems, global pollution, and climate change in the world, especially in developing countries. According to the World Health Organization, Tehran (the capital of Iran), a developing city, is one of the most polluted cities in the world in terms of air pollution. In this study, three pollutants, namely particulate matter smaller than 10 microns, nitrogen oxides, and sulfur dioxide, were evaluated in Tehran using data mining techniques through the CRISP approach. Data from 21 air pollution measuring stations in different areas of Tehran were collected from 1999 to 2013, and the commercial software Clementine was selected for the analysis. Using the software, Tehran was divided into distinct clusters with respect to the mentioned pollutants. As a data mining technique, clustering is usually used as a prologue to other analyses; therefore, the similarity of clusters was evaluated in this study by analyzing local conditions, traffic behavior, and industrial activities. The results of this research can support decision-making systems, help managers improve performance and decision making, and assist in urban studies.
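
The clustering step can be sketched as follows, with invented station averages standing in for the Tehran measurements (the study itself used Clementine rather than Python):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# rows = stations, columns = mean PM10, NOx, SO2 (invented values)
stations = np.array([
    [95, 60, 18], [88, 55, 15], [40, 22, 6],
    [42, 25, 7],  [70, 48, 12], [35, 20, 5],
])
X = StandardScaler().fit_transform(stations)   # put pollutants on a common scale
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # stations with similar pollution profiles share a cluster label
```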

Keywords: data mining, clustering, air pollution, crisp approach

Procedia PDF Downloads 424
25600 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm

Authors: Anuradha Chug, Sunali Gandhi

Abstract:

Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Because insufficient resources are available to re-execute all test cases in a time-constrained environment, efforts are ongoing to generate test data automatically, without human effort. Many search-based techniques have been proposed to generate efficient, effective, and optimized test data so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses that exist in the software or product. Inspired by the natural food-searching behavior of bats, the current study employed a meta-heuristic, search-based bat algorithm to optimize test data on the basis of certain parameters without compromising their effectiveness. Mathematical functions are also applied to effectively filter out redundant test data. As many as 50 Java programs were used to check the effectiveness of the proposed test data generation, and it was found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when the results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing (HC), and Ant Colony Optimization (ACO). The output of this study would be useful to testers, as they can achieve 100% path coverage with a minimum number of test cases.
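
The core loop of the bat algorithm can be sketched as below, minimizing a toy objective that stands in for a test-data fitness such as branch distance; the loudness and pulse-rate updates are simplified and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):                  # toy stand-in for a branch-distance fitness
    return np.sum((x - 3.0) ** 2)

n, dim, f_min, f_max = 15, 2, 0.0, 2.0
x = rng.uniform(-10, 10, (n, dim))       # bat positions = candidate test inputs
v = np.zeros((n, dim))
best = x[np.argmin([fitness(b) for b in x])].copy()

for _ in range(200):
    for i in range(n):
        f = f_min + (f_max - f_min) * rng.random()     # random frequency
        v[i] += (x[i] - best) * f                      # velocity pulled toward best
        cand = x[i] + v[i]
        if rng.random() > 0.5:                         # simplified pulse-rate branch
            cand = best + 0.01 * rng.normal(size=dim)  # local walk around the best bat
        if fitness(cand) < fitness(x[i]):              # greedy acceptance
            x[i] = cand
            if fitness(cand) < fitness(best):
                best = cand.copy()

print(best)   # converges near the optimum input (3, 3)
```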

Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm

Procedia PDF Downloads 372
25599 Case Study: Throughput Analysis over PLC Infrastructure as Last Mile Residential Solution in Colombia

Authors: Edward P. Guillen, A. Karina Martinez Barliza

Abstract:

Powerline Communications (PLC), as a last mile solution for providing communication services, has the advantage of transmitting over channels already used for electrical distribution. However, these channels were not designed for that purpose, which is why telecommunication companies in Colombia want to know how PLC would compare with cable modem or DSL in cost and network performance. This paper analyzes PLC throughput for residential-complex scenarios using PLC network scenarios, and some statistical results are shown.

Keywords: home network, power line communication, throughput analysis, power factor, cost, last mile solution

Procedia PDF Downloads 264
25598 A Modular Solution for Large-Scale Critical Industrial Scheduling Problems with Coupling of Other Optimization Problems

Authors: Ajit Rai, Hamza Deroui, Blandine Vacher, Khwansiri Ninpan, Arthur Aumont, Francesco Vitillo, Robert Plana

Abstract:

Large-scale critical industrial scheduling problems are based on Resource-Constrained Project Scheduling Problems (RCPSP) that necessitate integration with other optimization problems (e.g., vehicle routing, supply chain, or unique industrial ones), thus requiring practical solutions (i.e., modular and computationally efficient, with feasible solutions). To the best of our knowledge, the current industrial state of the art does not address this holistic problem. We propose an original modular solution that answers the issues exhibited by the delivery of complex projects. With three interlinked entities (project, task, resources), each having its own constraints, it uses a greedy heuristic with a dynamic cost function for each task and a situational assessment at each time step. It handles large-scale data and can be easily integrated with other optimization problems, already existing industrial tools, and unique constraints as required by the use case. The solution has been tested and validated by domain experts on three use cases: outage management in Nuclear Power Plants (NPPs), planning of a future NPP maintenance operation, and application in the defense industry to supply chain and factory relocation. In the first use case, the solution, in addition to the resources' availability and the tasks' logical relationships, also integrates several project-specific constraints for outage management, such as handling resource incompatibilities, updating task priorities, pausing tasks in specific circumstances, and adjusting dynamic units of resources. With more than 20,000 tasks and multiple constraints, the solution provides a feasible schedule within 10-15 minutes on a standard computer. This time-effective simulation suits the nature of the problem and the requirement of several scenario runs (30-40 simulations) before finalizing the schedules. The second use case is a factory relocation project where production lines must be moved to a new site while ensuring the continuity of their production. This generates the challenge of merging job shop scheduling and the RCPSP with location constraints. Our solution allows the automation of the production tasks while considering the expected production rate. The simulation algorithm manages the use and movement of resources and products to respect a given relocation scenario. The last use case establishes a future maintenance operation in an NPP. The project contains complex and hard constraints, such as Finish-Start precedence relationships (i.e., successor tasks have to start immediately after their predecessors while respecting all constraints), shareable coactivity for managing workspaces, and requirements on the specific state of "cyclic" resources (they can take several possible states, with only one active at a time) to perform tasks (which can require unique combinations of several cyclic resources). Our solution satisfies the requirement of minimizing the state changes of cyclic resources coupled with makespan minimization. It solves an instance with 80 cyclic resources and 50 incompatibilities between levels in less than a minute. In conclusion, we propose a fast and feasible modular approach to various industrial scheduling problems, validated by domain experts and compatible with existing industrial tools. This approach can be further enhanced by the use of machine learning techniques on historically repeated tasks to gain further insights for delay risk mitigation measures.
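
The greedy, time-stepped idea can be sketched in miniature as follows; the task set, capacity, and priority rule are invented, and the real solution's dynamic cost function and situational assessment are far richer.

```python
tasks = {            # id: (duration, resource demand, predecessors)
    "A": (3, 2, []), "B": (2, 1, ["A"]), "C": (4, 2, ["A"]), "D": (2, 3, ["B", "C"]),
}
CAPACITY = 3
t, done, running, start = 0, set(), {}, {}

while len(done) < len(tasks):
    done |= {k for k, end in running.items() if end <= t}     # retire finished tasks
    running = {k: e for k, e in running.items() if e > t}
    used = sum(tasks[k][1] for k in running)
    eligible = [k for k in tasks
                if k not in done and k not in running
                and all(p in done for p in tasks[k][2])]      # predecessors done
    # stand-in "dynamic cost": prefer longer tasks first (a common priority rule)
    for k in sorted(eligible, key=lambda k: -tasks[k][0]):
        if used + tasks[k][1] <= CAPACITY:                    # resource check
            running[k], start[k] = t + tasks[k][0], t
            used += tasks[k][1]
    t += 1

print(start)   # e.g. {'A': 0, 'C': 3, 'B': 3, 'D': 7}
```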

Keywords: deterministic scheduling, optimization coupling, modular scheduling, RCPSP

Procedia PDF Downloads 191
25597 Subjective Mapping Methodologies: Mapping Local Perceptions with Geographic Information Systems

Authors: A. Llopis Alvarez, D. Muller-Eie

Abstract:

Participatory GIS (geographic information systems) is designed for community mapping exercises in order to produce spatial representations of local knowledge. Ideally, participatory GIS caters to public participation through the use of spatial data in order to increase community-led policy- and decision-making. Having defined a spatial object, such as a neighborhood, subjective mapping involves attaining a description of the spatial, physical, social, and psychological characteristics of that spatial object. This paper highlights an emerging appreciation of the subjective component, particularly in spatial analyses. The beliefs, feelings, and behaviors associated with an urban area reflect its sense of place for an individual or a group. It is therefore important to understand what types of beliefs, emotions, and behavioral patterns are relevant to particular residents, groups, and urban scales. In this sense, residents' emotional attachment to their urban areas motivates civic engagement and facilitates awareness of the areas' strengths and problems. Similarly, subjective perceptions act in complex ways to influence the formation and maintenance of social identity and quality of life. This paper reports on findings from a case study of the immigrant population in Norwegian cities, their residential conditions, and their relationship to quality of urban life. Cognitive mapping methodologies are used in this study to understand local perceptions of urban qualities. Measures to alleviate disadvantages and improve quality of urban life are thus more likely to be effective when they are informed by an understanding of a place as constructed by those who live in it, that is, by their subjective perceptions of it.

Keywords: mapping methodologies, participatory GIS, perceptual maps, public participation, spatial analysis, subjective perceptions

Procedia PDF Downloads 139
25596 Modified InVEST for Whatsapp Messages Forensic Triage and Search through Visualization

Authors: Agria Rhamdhan

Abstract:

WhatsApp, as the most popular mobile messaging app, has been used as evidence in many criminal cases. Because mobile messaging generates large amounts of data, forensic investigation faces the challenge of large data volumes. The hardest part of finding important evidence is that current practice relies on tools and techniques that require manual analysis of all messages, so analyzing large sets of mobile messaging data takes a great deal of time and effort. Our work offers a methodology based on forensic triage that reduces large data sets to manageable ones, making detailed reviews easier, and then presents the results through interactive visualization, showing important terms, entities, and relationships through intelligent ranking using Term Frequency-Inverse Document Frequency (TF-IDF) and Latent Dirichlet Allocation (LDA) models. By implementing this methodology, investigators can improve investigation processing time and the accuracy of results.
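
The TF-IDF ranking step can be sketched as follows, with invented messages; the LDA topic-modeling stage would be layered on in the same fashion.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

messages = [
    "meet me at the warehouse at midnight",
    "bring the package to the warehouse",
    "happy birthday! see you at dinner",
]
vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(messages)               # rows = messages, columns = terms
terms = vec.get_feature_names_out()
scores = X.sum(axis=0).A1                     # aggregate term importance
top = sorted(zip(terms, scores), key=lambda p: -p[1])[:5]
print(top)    # 'warehouse' ranks high: frequent here yet discriminative
```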

Keywords: forensics, triage, visualization, WhatsApp

Procedia PDF Downloads 165
25595 Low Cost Webcam Camera and GNSS Integration for Updating Home Data Using AI Principles

Authors: Mohkammad Nur Cahyadi, Hepi Hapsari Handayani, Agus Budi Raharjo, Ronny Mardianto, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan

Abstract:

PDAM (the local water company) determines customer charges by considering the customer's building or house. Charge determination significantly affects PDAM income and customer costs, because PDAM applies a subsidy policy for customers classified as small households. Periodic updates are needed so that pricing stays in line with the target, and a thorough customer survey in Surabaya is needed to update customer building data. However, the surveys carried out so far have deployed officers to visit each PDAM customer one by one; surveys with this method require a great deal of effort and cost. For this reason, this research offers a technology called mobile mapping, a mapping method that is more efficient in terms of time and cost. The device is also quite simple to use: it is installed on a car so that it can record the surrounding buildings while the car is moving. Mobile mapping technology generally uses lidar sensors combined with GNSS, but that technology is costly. To overcome this problem, this research develops a low-cost mobile mapping technology using webcam camera sensors added to GNSS and IMU sensors. The cameras used have a 3 MP specification with a resolution of 720p and a diagonal field of view of 78°. The principle of this invention is to integrate four webcam camera sensors with GNSS and an IMU to acquire photo data tagged with location data (latitude, longitude) and attitude data (roll, pitch, yaw). The device is also equipped with a tripod and a vacuum mount to attach it to the car's roof so that it does not fall off while driving. The output data from this technology are analyzed with artificial intelligence to reduce similar data (cosine similarity) and then classify building types. Data reduction eliminates near-duplicate images while keeping the images that display a complete house, so that they can be processed for the later classification of buildings. The AI method used is transfer learning, utilizing the pre-trained VGG-16 model. The similarity analysis showed that the data reduction reached 50%. Georeferencing is then done using the Google Maps API to obtain address information matching the coordinates in the data. After that, a geographic join links the survey data with the customer data already held by PDAM Surya Sembada Surabaya.
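
The 50% reduction step can be sketched as a cosine-similarity filter over image feature vectors; in the study the features would come from VGG-16, while random stand-ins are used here.

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
frames = [rng.normal(size=512) for _ in range(5)]      # stand-in CNN features
frames[1] = frames[0] + 0.01 * rng.normal(size=512)    # a near-duplicate frame

kept = [frames[0]]
for f in frames[1:]:
    if cosine(f, kept[-1]) < 0.95:   # keep only sufficiently novel frames
        kept.append(f)

print(f"kept {len(kept)} of {len(frames)} frames")     # the duplicate is dropped
```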

Keywords: mobile mapping, GNSS, IMU, similarity, classification

Procedia PDF Downloads 77
25594 An Investigation into the Views of Distant Science Education Students Regarding Teaching Laboratory Work Online

Authors: Abraham Motlhabane

Abstract:

This research analysed the written views of science education students regarding the teaching of laboratory work in the online mode. The research adopted a qualitative methodology aimed at investigating small, distinct groups, normally regarded as a single-site study. Qualitative research was used to describe and analyse the phenomena from the students' perspective: the research began with assumptions about the world view, using theoretical lenses of research problems that inquire into the meanings held by individual students. The research was conducted with three groups of students, studying for the Postgraduate Certificate in Education, the Bachelor of Education, and the honours Bachelor of Education, respectively. In each of the study programmes, the science education module is compulsory. Five science education students from each study programme were purposively selected, so 15 students participated in the research. To analyse the data, the data were first printed and hard copies were used in the analysis. The data were read several times, and key concepts and ideas were highlighted; themes and patterns were identified to describe the data, and coding was used as a process of organising and sorting the data. The findings of the study are very diverse: some students are in favour of online laboratory work, whereas other students argue that science can only be learnt through hands-on experimentation.

Keywords: online learning, laboratory work, views, perceptions

Procedia PDF Downloads 137
25593 Research of Seepage Field and Slope Stability Considering Heterogeneous Characteristics of Waste Piles: A Less Costly Way to Reduce High Leachate Levels and Avoid Accidents

Authors: Serges Mendomo Meye, Li Guowei, Shen Zhenzhong, Gan Lei, Xu Liqun

Abstract:

Due to the high-heap, large-volume characteristics of landfills, the complex layering of waste, and the high leachate water level, environmental pollution and slope instability are easily produced. It is therefore of great significance to research the heterogeneous seepage field and stability of landfills. This paper focuses on the heterogeneous characteristics of landfill piles and analyzes the seepage field and slope stability of a landfill using statistical and numerical analysis methods. The calculated results are compared with field measurements and literature data to verify the reliability of the model, which may provide a basis for the design and the safe, eco-friendly operation of landfills. The main innovations are as follows: (1) The saturated-unsaturated seepage equation of heterogeneous soil is derived theoretically. The heterogeneous landfill is regarded as composed of infinitely many layers of homogeneous waste, and a method for establishing the heterogeneous seepage model is proposed. The formation law of the stagnant water level in heterogeneous landfills is then studied. It is found that the maximum stagnant water level is higher when the heterogeneous seepage characteristics are considered, which harms the stability of the landfill. (2) Considering the heterogeneous weight and strength characteristics of waste, a method of establishing a heterogeneous stability model is proposed and extended to a three-dimensional stability study. It is found that the distribution of heterogeneous characteristics has a great influence on the stability of the landfill slope. During the operation and management of the landfill, the stability of the reservoir bank should be considered alongside the landfill's capacity.
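
For reference, saturated-unsaturated seepage derivations of this kind typically start from a form of the Richards equation; a standard mixed form is given below (the paper's exact heterogeneous formulation may differ):

```latex
% A standard mixed form of the saturated-unsaturated seepage (Richards) equation;
% heterogeneity enters through spatially varying K and theta.
\frac{\partial \theta(h)}{\partial t}
  = \nabla \cdot \left[ K(h,\mathbf{x})\, \nabla (h + z) \right]
% h: pressure head; \theta: volumetric water content;
% K: hydraulic conductivity, varying with position x in a heterogeneous fill;
% z: elevation head.
```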

Keywords: heterogeneous characteristics, leachate levels, saturated-unsaturated seepage, seepage field, slope stability

Procedia PDF Downloads 241
25592 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

Modern experiments in high energy physics impose great demands on the reliability, efficiency, and data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to be able to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments and provides a network-transparent inter-process communication layer. Using the high-performance, modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes in the iFDAQ during the 2016 run. From the software point of view, it might be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the debugging possibilities, online monitoring of communication among processes via the DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in detail. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.

Keywords: data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP

Procedia PDF Downloads 315
25591 English as a Medium of Instruction in Algerian Higher Business Degree Programmes

Authors: Sidi Ahmed Berrabah

Abstract:

English as a Medium of Instruction (EMI) is expanding rapidly around the world. A growing volume of research has been dedicated to investigating its introduction, with findings that describe a complex picture and suggest that the practicality and effectiveness of EMI are still subjects of debate. However, considerably less attention has been given to understanding EMI in contexts where its introduction has been discussed but not yet put into practice. One such context is Algeria, where discourse about a potential introduction of EMI has been going on for some time, and where the first courses to introduce EMI are likely to be Business degree programmes. This study examines the current discourses and attitudes towards the potential implementation of EMI, and the language practices in Business degree programmes, in three Algerian universities. The research is conducted in three universities in three different regions of Algeria, with the aim of including both 'centre' and 'periphery' Algerian universities. To achieve these aims, a mixed research paradigm is used: questionnaires, semi-structured interviews, and classroom observations gather data from three participant cohorts: university students of Business, lecturers of Business, and lecturers of English for specific purposes. The findings showed that students and lecturers of Business were in favour of introducing English instead of French or Standard Arabic as the medium of instruction, because English is seen as having internationalisation and instrumental benefits, while French is too closely linked to the colonial history of the country. The favourable attitudes towards EMI, however, contrast with daily classroom practices at the departments of Business studies, where students and lecturers make practical choices, using their language repertoires according to their linguistic backgrounds and skills. Classrooms in the three Algerian universities featured fluid translanguaging practices that cannot be reduced to a monolingual EMI policy.

Keywords: EMI, Algerian universities, business degree programmes, translanguaging

Procedia PDF Downloads 207
25590 Mining Scientific Literature to Discover Potential Research Data Sources: An Exploratory Study in the Field of Haemato-Oncology

Authors: A. Anastasiou, K. S. Tingay

Abstract:

Background: Discovering suitable datasets is an important part of health research, particularly for projects working with clinical data from patients organized in cohorts (cohort data). With the proliferation of so many national and international initiatives, however, it is becoming increasingly difficult for research teams to locate the real-world datasets most relevant to their project objectives. We present a method for identifying healthcare institutes in the European Union (EU) which may hold haemato-oncology (HO) data. A key enabler of this research was the bibInsight platform, a scientometric data management and analysis system developed by the authors at Swansea University. Method: A PubMed search was conducted using HO clinical terms taken from previous work. The resulting XML file was processed using the bibInsight platform, linking affiliations to the Global Research Identifier Database (GRID). GRID is an international, standardized list of institutions, including the city and country in which each institution exists, as well as a category for the main business type, e.g., Academic, Healthcare, Government, Company. Countries were limited to the 28 current EU members, and the institute type to 'Healthcare'. An article was considered valid if at least one author was affiliated with an EU-based healthcare institute. Results: The PubMed search produced 21,310 articles, comprising 9,885 distinct affiliations with correspondence in GRID. Of these affiliations, 760 were from EU countries, and 390 of these were healthcare institutes. One affiliation was excluded as a veterinary hospital. Two EU countries did not have any publications in our analysis dataset. The results were analysed by country and by individual healthcare institute. Networks both within the EU and internationally show institutional collaborations, which may suggest a willingness to share data for research purposes. Geographical mapping can ensure that data has broad population coverage. Collaborations with industry or government may exclude healthcare institutes that have embargos or additional costs associated with data access. Conclusions: Data reuse is becoming increasingly important both for ensuring the validity of results and for the economy of available resources. The ability to identify potential, specific data sources from over twenty thousand articles in less than an hour could improve knowledge of, and access to, data sources. As our method has not yet established whether these healthcare institutes actually hold data, or merely publish on the topic, future work will involve text mining of data-specific concordant terms to identify numbers of participants, demographics, study methodologies, and sub-topics of interest.
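
The affiliation-extraction step can be sketched with the standard library as below; the XML snippet is a minimal stand-in for a real PubMed export, and the GRID lookup would follow.

```python
import xml.etree.ElementTree as ET

pubmed_xml = """
<PubmedArticleSet>
  <PubmedArticle>
    <Author><AffiliationInfo>
      <Affiliation>Dept. of Haematology, Example University Hospital, Swansea, UK</Affiliation>
    </AffiliationInfo></Author>
  </PubmedArticle>
</PubmedArticleSet>
"""

root = ET.fromstring(pubmed_xml)
affiliations = [el.text for el in root.iter("Affiliation")]
print(affiliations)   # next step: normalize each string and look it up in GRID
```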

Keywords: data reuse, data discovery, data linkage, journal articles, text mining

Procedia PDF Downloads 111
25589 Numerical Simulation of Large-Scale Landslide-Generated Impulse Waves With a Soil‒Water Coupling Smooth Particle Hydrodynamics Model

Authors: Can Huang, Xiaoliang Wang, Qingquan Liu

Abstract:

Soil‒water coupling is an important process in landslide-generated impulse wave (LGIW) problems, accompanied by large deformation of soil, strong interface coupling, and three-dimensional effects. As a meshless particle method, smooth particle hydrodynamics (SPH) has great advantages in dealing with complex interface and multiphase coupling problems. This study presents an improved soil‒water coupled model to simulate LGIW problems based on the open source code DualSPHysics (v4.0). To address the low efficiency of modeling real large-scale LGIW problems, graphics processing unit (GPU) acceleration technology is implemented in this code. An experimental example, subaerial landslide-generated water waves, is simulated to demonstrate the accuracy of this model. Then the Huangtian LGIW, a real large-scale LGIW problem, is modeled to reproduce the entire disaster chain, including landslide dynamics, fluid‒solid interaction, and surge wave generation. The convergence analysis shows that a particle distance of 5.0 m can provide a converged landslide deposit and surge wave for this example. The numerical simulation results are in good agreement with the limited field survey data. The application example of the Huangtian LGIW provides a typical reference for large-scale LGIW assessments and can yield reliable information on landslide dynamics, interface coupling behavior, and surge wave characteristics.

Keywords: soil‒water coupling, landslide-generated impulse wave, large-scale, SPH

Procedia PDF Downloads 58
25588 Using Data Mining Technique for Scholarship Disbursement

Authors: J. K. Alhassan, S. A. Lawal

Abstract:

This work concerns decision tree-based classification for the disbursement of scholarships. A tree-based data mining classification technique is used in order to determine the generic rules for disbursing the scholarship. Based on the rules defined by the tree, the system is able to determine the class (status) to which an applicant belongs, whether Granted or Not Granted. Applicants that fall into the Granted class successfully acquire the scholarship, while those in the Not Granted class are unsuccessful in the scheme. An algorithm that can classify applicants based on the rules from the tree-based classification was also developed. Tree-based classification was adopted because of its efficiency, effectiveness, and easy-to-comprehend features. The system was tested with data from the National Information Technology Development Agency (NITDA), Abuja, a parastatal of the Federal Ministry of Communication Technology that is mandated to develop and regulate information technology in Nigeria, and was found to work according to the specification. It is therefore recommended for all scholarship disbursement organizations.
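
A minimal sketch of the approach with invented applicant data: a decision tree learns rules mapping applicant attributes to Granted / Not Granted, and the learned rules can be printed for inspection.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# columns: [grade_point, household_income, year_of_study] (invented data)
X = [[4.5, 10, 2], [3.9, 55, 3], [4.8, 12, 1],
     [2.5, 60, 4], [4.2, 15, 2], [3.0, 70, 3]]
y = ["Granted", "Not Granted", "Granted", "Not Granted", "Granted", "Not Granted"]

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=["gpa", "income", "year"]))  # the generic rules
print(tree.predict([[4.0, 20, 1]]))       # classify a new applicant
```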

Keywords: classification, data mining, decision tree, scholarship

Procedia PDF Downloads 364
25587 A Semantic and Concise Structure to Represent Human Actions

Authors: Tobias Strübing, Fatemeh Ziaeetabar

Abstract:

Humans usually manipulate objects with their hands. To represent these actions in a simple and understandable way, we need a semantic framework. For this purpose, the Semantic Event Chain (SEC) method has already been presented; it considers the touching and non-touching relations between manipulated objects in a scene. This method was improved by a computational model, the so-called enriched Semantic Event Chain (eSEC), which incorporates information on static (e.g., top, bottom) and dynamic spatial relations (e.g., moving apart, getting closer) between objects in an action scene. This leads to better action prediction as well as the ability to distinguish between more actions. Each eSEC manipulation descriptor is a huge matrix with thirty rows and a massive set of spatial relations between each pair of manipulated objects. The current eSEC framework has so far been used only in the category of manipulation actions, which ultimately involve two hands. Here, we would like to extend this approach to a whole-body action descriptor and build a conjoint activity representation structure. For this purpose, we carry out a statistical analysis to modify the current eSEC by summarizing it while preserving its features, introducing a new version called Enhanced eSEC (e2SEC). This summarization can be done from two points of view: 1) reducing the number of rows in an eSEC matrix, and 2) shrinking the set of possible semantic spatial relations. To achieve these, we computed the importance of each matrix row statistically, to see whether a particular row can be removed while all manipulations remain distinguishable from each other. On the other hand, we examined which semantic spatial relations can be merged without compromising the distinguishability of the predefined manipulation actions. By performing the above analyses, we obtained the new e2SEC framework, which has 20% fewer rows, 16.7% fewer static spatial relations, and 11.1% fewer dynamic spatial relations. This simplification, while preserving the salient features of a semantic structure for representing actions, has a tremendous impact on the recognition and prediction of complex actions, as well as on interactions between humans and robots. It also creates a comprehensive platform that can be integrated with body-limb descriptors and dramatically increases system performance, especially in complex real-time applications such as human-robot interaction prediction.

Keywords: enriched semantic event chain, semantic action representation, spatial relations, statistical analysis

Procedia PDF Downloads 122
25586 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest

Authors: Bharatendra Rai

Abstract:

Predictive data analysis and modeling involving machine learning techniques becomes challenging in the presence of too many explanatory variables or features. The presence of too many features in machine learning is known not only to slow algorithms down but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider when buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data-partitioning ratios, and their impact on model accuracy is captured using the coefficient of determination (R-squared) and the root mean square error (RMSE).
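
A minimal sketch of the pipeline on synthetic data, assuming the open-source boruta package (BorutaPy), whose API is taken from its documentation: Boruta confirms features, then a random forest is fitted on the confirmed subset and RMSE is computed.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from boruta import BorutaPy   # assumed dependency: pip install boruta

X, y = make_regression(n_samples=300, n_features=20, n_informative=5, random_state=0)
rf = RandomForestRegressor(n_jobs=-1, max_depth=5, random_state=0)
selector = BorutaPy(rf, n_estimators="auto", random_state=0)
selector.fit(X, y)                         # BorutaPy expects numpy arrays

X_sel = X[:, selector.support_]            # keep only confirmed features
model = RandomForestRegressor(random_state=0).fit(X_sel, y)
rmse = np.sqrt(np.mean((model.predict(X_sel) - y) ** 2))
print(selector.support_.sum(), "features confirmed; in-sample RMSE =", round(rmse, 2))
```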

Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error

Procedia PDF Downloads 316
25585 Interfacing and Replication of Electronic Machinery Using MATLAB/SIMULINK

Authors: Abdulatif Abdulsalam, Mohamed Shaban

Abstract:

This paper introduces the interfacing and simulation of electronic machinery based on the MATLAB/SIMULINK simulation package. The simulated components include dc-dc converters, power-factor rectifiers, induction machines, dc machines, synchronous machines, and more complete systems. The power-factor rectifier model includes solid-state device models. The tools provide a clear-cut structure for the simulation of complex dynamic systems involving power electronic machines.

Keywords: power electronics, machine, MATLAB, simulink

Procedia PDF Downloads 348
25584 Understanding Profit Shifting by Multinationals in the Context of Cross-Border M&A: A Methodological Exploration

Authors: Michal Friedrich

Abstract:

Cross-border investment has never been easier than in today’s global economy. Despite recent initiatives tightening the international tax landscape, profit shifting and tax optimization by multinational entities (MNEs) in the context of cross-border M&A remain persistent and complex phenomena that warrant in-depth exploration. By synthesizing the outcomes of existing research, this study aims to first provide a methodological framework for identifying MNEs’ profit-shifting behavior and quantifying its fiscal impacts via various macroeconomic and microeconomic approaches. The study also proposes additional methods and qualitative/quantitative measures for extracting insight into the profit shifting behavior of MNEs in the context of their M&A activities at industry and entity levels. To develop the proposed methods, this study applies the knowledge of international tax laws and known profit shifting conduits (incl. dividends, interest, and royalties) on several model cases/types of cross-border acquisitions and post-acquisition integration activities by MNEs and highlights important factors that encourage or discourage tax optimization. Follow-up research is envisaged to apply the methods outlined in this study on published data on real-world M&A transactions to gain practical country-by-country, industry and entity-level insights. In conclusion, this study seeks to contribute to the ongoing discourse on profit shifting by providing a methodological toolkit for exploring profit shifting tendencies MNEs in connection with their M&A activities and to serve as a backbone for further research. The study is expected to provide valuable insight to policymakers, tax authorities, and tax professionals alike.

Keywords: BEPS, cross-border M&A, international taxation, profit shifting, tax optimization

Procedia PDF Downloads 65
25583 Open Source Algorithms for 3D Geo-Representation of Subsurface Formations Properties in the Oil and Gas Industry

Authors: Gabriel Quintero

Abstract:

This paper presents the results of implementing a series of algorithms intended for representing subsurface formation properties in most 3D geographic software, even Google Earth, by combining 2D charts or 3D plots over a 3D background, allowing everyone to use them no matter the economic size of the company for which they work. Despite the existence of complex and expensive specialized software for modeling subsurface formations from the same input information, this open source development shows higher and easier usability with good results, limiting the rendered properties and polygons to a basic set of charts and tubes.
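
The Google Earth side of the idea can be sketched with the standard library alone: formation properties attached to placemarks in a KML file. Coordinates and values are invented.

```python
import xml.etree.ElementTree as ET

wells = [("Well-1", -71.30, 8.60, "porosity 0.21"),
         ("Well-2", -71.28, 8.62, "porosity 0.18")]

kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
doc = ET.SubElement(kml, "Document")
for name, lon, lat, prop in wells:
    pm = ET.SubElement(doc, "Placemark")
    ET.SubElement(pm, "name").text = name
    ET.SubElement(pm, "description").text = prop       # the property to display
    point = ET.SubElement(pm, "Point")
    ET.SubElement(point, "coordinates").text = f"{lon},{lat},0"

ET.ElementTree(kml).write("formations.kml", xml_declaration=True)  # open in Google Earth
```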

Keywords: chart, earth, formations, subsurface, visualization

Procedia PDF Downloads 437
25582 Enhancing Wayfinding and User Experience in Hospital Environments: A Study of University Medical Centre Ljubljana

Authors: Nastja Utrosa, Matevz Juvancic

Abstract:

Hospital buildings are complex public environments characterized by intricate functional arrangements and architectural layouts. Effective wayfinding is essential for patients, visitors, students, and staff; however, spatial orientation planning is often overlooked until after construction. While these environments meet functional needs, they frequently neglect the psychological aspects of the user experience. This study investigates wayfinding within complex urban healthcare environments, focusing on the influences of spatial design, spatial cognition, and user experience. The inherent complexity of these environments, with extensive spatial dimensions and dispersed buildings, exacerbates the problem, and gradual expansions and additions contribute to disorientation and navigational difficulties for users. Effective route guidance in urban healthcare settings has therefore become increasingly crucial, yet research on the environmental elements that influence wayfinding in such environments remains limited. To address this gap, we conducted a study at the University Medical Centre Ljubljana (UMCL), Slovenia's largest university hospital. Using a questionnaire administered to a diverse sample (n=179), we assessed individuals' perceptions and use of the outdoor hospital spaces. We evaluated the area's usability by analyzing visit frequency, stops, modes of arrival, and parking patterns, and examined the visitors' age distribution. Additionally, we investigated spatial aids and the use of color as an orientation element at three specific locations within the medical center. Our study explored the impact of color on entrance selection and the effectiveness of warm versus cool colors for wayfinding. Our findings highlight the significance of graphic adjustments in shaping perceptions of hospital outdoor spaces. Most participants preferred visually organized entrances, underscoring the importance of effective visual communication. Implementing these adaptations can substantially enhance the user experience, reducing stress and increasing satisfaction in hospital environments.

Keywords: hospital layout design, healthcare facilities, wayfinding, navigational aids, spatial orientation, color, signage

Procedia PDF Downloads 36
25581 Image-Based (RGB) Technique for Estimating Phosphorus Levels of Different Crops

Authors: M. M. Ali, Ahmed Al- Ani, Derek Eamus, Daniel K. Y. Tan

Abstract:

In this glasshouse study, we developed a new image-based, non-destructive technique for detecting the leaf P status of different crops, namely cotton, tomato, and lettuce. Plants were allowed to grow on nutrient media containing different P concentrations, i.e., 0%, 50%, and 100% of the recommended P concentration (P0 = no P; P1 = 2.5 mL 10 L-1 of P; and P2 = 5 mL 10 L-1 of P, as NaH2PO4). After 10 weeks of growth, plants were harvested, and data on leaf P contents were collected using the standard destructive laboratory method; at the same time, leaf images were collected with a handheld crop image sensor. We calculated the leaf area, leaf perimeter, and RGB (red, green, and blue) values of these images. These data were then used in a linear discriminant analysis (LDA) to estimate leaf P contents, which successfully classified the plants on the basis of leaf P contents. The data indicate that P deficiency in crop plants can be predicted using image and morphological data. Our proposed non-destructive imaging method is precise in estimating the P requirements of different crop species.
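
The classification step can be sketched as below, with invented per-leaf features standing in for the measured image data:

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# columns: [mean_R, mean_G, mean_B, leaf_area_cm2] (invented stand-in values)
X = [[120, 180, 90, 34], [118, 178, 92, 33],   # P2 (full P): greener, larger leaves
     [140, 170, 95, 25], [142, 168, 96, 26],   # P1 (half P)
     [160, 150, 100, 18], [158, 152, 99, 17]]  # P0 (no P): paler, smaller leaves
y = ["P2", "P2", "P1", "P1", "P0", "P0"]

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.predict([[150, 160, 97, 21]]))   # classify a new leaf's image features
```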

Keywords: image-based techniques, leaf area, leaf P contents, linear discriminant analysis

Procedia PDF Downloads 374
25580 The Use of WhatsApp Platform in Spreading Fake News among Mass Communication Students of Abdu Gusau Polytechnic, Talata Mafara

Authors: Aliyu Damri

Abstract:

In every educational institution, students of mass communication receive training to report events and issues accurately and objectively in accordance with official controls. However, the complex nature of today's society has made possible the WhatsApp platform, which revolutionizes the means of sharing information, ideas, and experiences. This paper examined how students in the Department of Mass Communication, Abdu Gusau Polytechnic, Talata Mafara, used the WhatsApp platform in spreading fake news. It used in-depth interview techniques and focus group discussions with students, as well as published materials, to gather related and relevant data. The paper also used the procedures involved in analyzing long interview content: observation of a useful utterance, development of an expanded observation, examination of the interconnection of observed comments, collective scrutiny of observations for patterns and themes, and review and analysis of the themes across all interviews for the development of a thesis. The results indicated that inadequate or absent official controls guiding the conduct of online information sharing, inaccuracies and poor source verification, a lack of gatekeeping procedures to ensure ethical and legal provisions, bringing users into the process, sharing all information, the availability of misinformation, disinformation, and rumor, and the problem of conversation strongly encouraged the emergence of fake news. Surprisingly, the idea of information as a commodity has increased, and the transparency of a source has emerged as a new ethic.

Keywords: disinformation, fake news, group, mass communication, misinformation, WhatsApp

Procedia PDF Downloads 134
25579 Setting Control Limits for Inaccurate Measurements

Authors: Ran Etgar

Abstract:

The process of rounding off measurements of continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control using the X̄-chart. The traditional control limits can cause incorrect conclusions if applied carelessly. This study looks into the limitations of the classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter (Ȳ) is presented, resulting in a more precise method of establishing the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, is still user-friendly and accurate and only requires the use of two straightforward tables.
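
The effect of rounding on X̄-chart limits can be illustrated with a small simulation (this Monte Carlo stand-in is not the paper's table-based method): empirical quantiles of the rounded subgroup means replace the symmetric three-sigma limits.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, step = 10.0, 0.05, 5, 0.1      # step = round-off resolution

rounded = np.round(rng.normal(mu, sigma, (100_000, n)) / step) * step
ybar = rounded.mean(axis=1)                  # subgroup means of rounded values

lcl, ucl = np.quantile(ybar, [0.00135, 0.99865])   # matches the 3-sigma tail areas
print(f"empirical limits: [{lcl:.3f}, {ucl:.3f}]")
print(f"classical limits: [{mu - 3*sigma/np.sqrt(n):.3f}, {mu + 3*sigma/np.sqrt(n):.3f}]")
```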

Keywords: quality control, process control, round-off, measurement, rounding error

Procedia PDF Downloads 95