Search results for: scientific data mining
24155 A Platform for Managing Residents' Carbon Trajectories Based on the City Intelligent Model (CIM) 4.0
Authors: Chen Xi, Liu Xuebing, Lao Xuerui, Kuan Sinman, Jiang Yike, Wang Hanwei, Yang Xiaolang, Zhou Junjie, Xie Jinpeng
Abstract:
Climate change is a global problem facing humanity, and this is now the consensus of the mainstream scientific community. In accordance with the carbon peak and carbon neutrality targets and visions set out in the United Nations Framework Convention on Climate Change, the Kyoto Protocol and the Paris Agreement, this project uses the City Intelligent Model (CIM) and Artificial Intelligence Machine Vision (ICR) as its core technologies to accurately quantify low-carbon behaviour into green coins as a means of guiding ecologically sustainable living patterns. Using individual communities as management units and blockchain as a guarantee of fairness across the whole cycle of green currency circulation, the project will form a modern resident carbon track management system based on the principle of enhancing the ecological resilience of communities and the cohesiveness of community residents, ultimately forming an ecologically sustainable smart village that can be self-organised and self-managed.
Keywords: urban planning, urban governance, CIM, artificial intelligence, sustainable development
Procedia PDF Downloads 84
24154 Modified InVEST for WhatsApp Messages Forensic Triage and Search through Visualization
Authors: Agria Rhamdhan
Abstract:
WhatsApp, the most popular mobile messaging app, has been used as evidence in many criminal cases. As mobile messaging generates large amounts of data, forensic investigation faces the challenge of large-data problems. The hardest part of finding important evidence is that current practice relies on tools and techniques that require manual analysis to check all messages, so analyzing large sets of mobile messaging data takes a great deal of time and effort. Our work offers a methodology based on forensic triage to reduce large data sets to manageable ones that are easier to review in detail, and then presents the results through interactive visualization showing important terms, entities and relationships via intelligent ranking using Term Frequency-Inverse Document Frequency (TF-IDF) and the Latent Dirichlet Allocation (LDA) model. By implementing this methodology, investigators can improve investigation processing time and result accuracy.
Keywords: forensics, triage, visualization, WhatsApp
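The TF-IDF ranking step described above can be sketched in a few lines. This is an illustrative re-implementation, not the authors' code; the toy messages and query terms are invented for the example.

```python
import math
from collections import Counter

def tf_idf_rank(messages, query_terms):
    """Rank message indices by summed TF-IDF weight of the query terms.

    Illustrates the triage idea: messages that concentrate rare,
    query-relevant terms float to the top of the review queue.
    """
    docs = [m.lower().split() for m in messages]
    n = len(docs)
    df = Counter()                      # document frequency per term
    for d in docs:
        for t in set(d):
            df[t] += 1
    scores = []
    for i, d in enumerate(docs):
        tf = Counter(d)
        score = sum(
            (tf[t] / len(d)) * math.log(n / df[t])
            for t in query_terms if t in tf
        )
        scores.append((score, i))
    return [i for score, i in sorted(scores, reverse=True)]

# invented toy data for illustration
messages = [
    "meet me at the dock tonight",
    "lunch at noon as usual",
    "bring the package to the dock",
]
ranking = tf_idf_rank(messages, ["dock", "package"])
print(ranking[0])  # → 2, the message matching both rare terms
```

A real triage pipeline would apply the same ranking to parsed chat exports rather than hard-coded strings.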
Procedia PDF Downloads 172
24153 Low Cost Webcam Camera and GNSS Integration for Updating Home Data Using AI Principles
Authors: Mohkammad Nur Cahyadi, Hepi Hapsari Handayani, Agus Budi Raharjo, Ronny Mardianto, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan
Abstract:
PDAM (the local water company) determines customer charges by considering the customer's building or house. Charge determination significantly affects PDAM income and customer costs because PDAM applies a subsidy policy for customers classified as small households. Periodic updates are needed so that pricing stays in line with the target, which requires a thorough customer survey in Surabaya to update customer building data. However, the survey carried out so far has deployed officers to survey each PDAM customer one by one, which requires a great deal of effort and cost. For this reason, this research offers a technology called mobile mapping, a mapping method that is more efficient in terms of time and cost. The device is also quite simple to use: it is installed in a car so that it can record the surrounding buildings while the car is moving. Mobile mapping technology generally uses lidar sensors equipped with GNSS, but this technology is costly. To overcome this problem, this research develops a low-cost mobile mapping technology using webcam camera sensors combined with GNSS and IMU sensors. The camera used is a 3 MP sensor with 720p resolution and a diagonal field of view of 78⁰. The principle is to integrate four webcam camera sensors with GNSS and an IMU to acquire photo data tagged with location (latitude, longitude) and orientation (roll, pitch, yaw). The device is also equipped with a tripod and a vacuum mount to attach it to the car's roof so that it does not fall off while the car is moving. The output data from this technology is analyzed with artificial intelligence to reduce similar data (cosine similarity) and then classify building types. Data reduction eliminates near-duplicate images while retaining the image that displays the complete house, so that it can be processed for subsequent classification of buildings.
The AI method used is transfer learning utilizing a trained model named VGG-16. The similarity analysis showed that data reduction reached 50%. Georeferencing is then done using the Google Maps API to get address information corresponding to the coordinates in the data. After that, a geographic join links the survey data with the customer data already held by PDAM Surya Sembada Surabaya.
Keywords: mobile mapping, GNSS, IMU, similarity, classification
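The cosine-similarity data reduction described above can be sketched as follows. The feature vectors here are hypothetical stand-ins for the image descriptors (e.g. from VGG-16) that would be used in practice, and the 0.95 threshold is an assumed value.

```python
import math

def cosine(u, v):
    """Cosine similarity of two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def deduplicate(features, threshold=0.95):
    """Keep a frame only if it is not too similar to any kept frame."""
    kept = []
    for f in features:
        if all(cosine(f, k) < threshold for k in kept):
            kept.append(f)
    return kept

# invented toy descriptors: two near-identical frames plus one distinct one
frames = [
    [1.0, 0.0, 0.2],
    [0.99, 0.01, 0.21],  # near-duplicate of the first frame
    [0.0, 1.0, 0.3],
]
print(len(deduplicate(frames)))  # → 2, near-duplicates collapse to one
```

With real VGG-16 embeddings, the same greedy pass would discard overlapping shots of the same house.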
Procedia PDF Downloads 84
24152 Social Enterprise Strategies for Financial Sustainability in the Economic Literature
Authors: Adam Bereczk
Abstract:
Due to persistent socioeconomic problems regarding sustainability and labour market equilibrium in Europe, the subjects of the social economy have gained considerable academic attention recently. Meanwhile, social enterprises pursuing double bottom line criteria struggle to find the proper management philosophies and strategies to make their social-purpose businesses financially sustainable. Although the strategic management literature was developed mainly on the basis of large corporations, in the past years the interpretation of strategy concepts has become a frequent topic in scientific discussions on small and medium-sized enterprises as well; the topic of strategic orientations is a good example of this trend. However, less is known about the case of social enterprises, despite the fact that the majority of them are small businesses engaged in real business activities. The main purpose of this work is to give a comprehensive summary of different perspectives on the interpretation of strategic orientations of social enterprises. The novelty of this work is that it shows the previous outcomes and models of scholars from various fields of economic science who tried to intertwine the two spheres in different forms, methodizes the findings and draws attention to the shortcomings.
Keywords: social enterprises, business sustainability, strategic orientations, literature review
Procedia PDF Downloads 279
24151 An Investigation into the Views of Distance Science Education Students Regarding Teaching Laboratory Work Online
Authors: Abraham Motlhabane
Abstract:
This research analysed the written views of science education students regarding the teaching of laboratory work in the online mode. The research adopted a qualitative methodology, aimed at investigating small and distinct groups normally regarded as a single-site study, and used it to describe and analyse the phenomena from the students' perspective. This means the research began with assumptions of a world view that uses theoretical lenses of research problems inquiring into the meaning of individual students. The research was conducted with three groups of students studying for the Postgraduate Certificate in Education, the Bachelor of Education and the Honours Bachelor of Education respectively. In each of the study programmes, the science education module is compulsory. Five science education students from each study programme were purposively selected, so 15 students participated in the research. To analyse the data, the data were first printed and hard copies were used in the analysis. The data were read several times and key concepts and ideas were highlighted; themes and patterns were identified to describe the data, and coding was used as a process of organising and sorting the data. The findings of the study are very diverse: some students are in favour of online laboratory work, whereas others argue that science can only be learnt through hands-on experimentation.
Keywords: online learning, laboratory work, views, perceptions
Procedia PDF Downloads 148
24150 A Novel Dual Band-Pass Filter Based on Coupling of a Composite Right/Left-Handed CPW and CSRRs Using Ferrite Components
Authors: Mohammed Berka, Khaled Merit
Abstract:
Recent works on microwave filters show that the constituent materials of such filters are very important in their design and realization, and several solutions have been proposed to improve filtering quality. In this paper, we propose a new dual band-pass filter based on the coupling of a composite right/left-handed (CRLH) coplanar waveguide with complementary split ring resonators (CSRRs). The CRLH CPW is composed of two resonators, each with an interdigital capacitor (CID) and two short-circuited stubs parallel to the top ground plane. On the lower ground plane, we use defected ground structure (DGS) technology to engrave two CSRRs with different shapes and dimensions. Between the top ground plane and the substrate, we place a ferrite layer to control the electromagnetic coupling between the CRLH CPW and the CSRRs. The overall filter, which has coplanar access, exhibits dual band-pass behavior around the magnetic resonances of the CSRRs. Since there is no scientific or experimental result in the literature for this kind of complicated structure, it was necessary to perform simulations using Ansoft HFSS.
Keywords: complementary split ring resonators, coplanar waveguide, ferrite, filter, stub
Procedia PDF Downloads 404
24149 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
Modern experiments in high energy physics impose great demands on the reliability, the efficiency, and the data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments and provides a network-transparent inter-process communication layer. Using the high-performance, modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes in the iFDAQ during the 2016 run; from the software point of view, it may be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the possibilities of debugging, online monitoring of communication among processes via a DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in detail. Moreover, an efficiency measurement and comparison with the DIM library with respect to the iFDAQ requirements is provided.
Keywords: data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP
Procedia PDF Downloads 318
24148 Chemical, Physical and Microbiological Characteristics of a Texture-Modified Beef-Based 3D Printed Functional Product
Authors: Elvan G. Bulut, Betul Goksun, Tugba G. Gun, Ozge Sakiyan Demirkol, Kamuran Ayhan, Kezban Candogan
Abstract:
Dysphagia, difficulty in swallowing solid foods and thin liquids, is one of the common health threats among the elderly, who require foods with modified texture in their diet. Although there are some commercial food formulations and hydrocolloids to thicken liquid foods for dysphagic individuals, there is still a need to develop new food products with enriched nutritional, textural and sensory characteristics to safely nourish these patients. 3D food printing is an appealing alternative for creating personalized foods for this purpose, with attractive shape and soft, homogeneous texture. To modify texture and prevent phase separation, hydrocolloids are generally used. In our laboratory, an optimized 3D printed beef-based formulation specifically for people with swallowing difficulties was developed in a research project supported by the Scientific and Technological Research Council of Turkey (TÜBİTAK Project # 218O017). The optimized formulation obtained from response surface methodology was 60% beef powder, 5.88% gelatin, and 0.74% kappa-carrageenan (all on a dry basis). This product was enriched with freeze-dried beet, celery, and red capia pepper powders, butter, and whole milk. Proximate composition (moisture, fat, protein, and ash contents), pH value, CIE lightness (L*), redness (a*), yellowness (b*), and color difference (ΔE*) values were determined. Counts of total mesophilic aerobic bacteria (TMAB), lactic acid bacteria (LAB), mold and yeast, and total coliforms were conducted, and detection of coagulase-positive S. aureus, E. coli, and Salmonella spp. was performed. The 3D printed products had 60.11% moisture, 16.51% fat, 13.68% protein, and 1.65% ash, and the pH value was 6.19, whereas the ΔE* value was 3.04. Counts of TMAB, LAB, mold and yeast, and total coliforms before and after 3D printing were 5.23-5.41 log cfu/g, < 1 log cfu/g, < 1 log cfu/g, and 2.39-2.15 log EMS/g, respectively. Coagulase-positive S. aureus, E. coli, and Salmonella spp. were not detected in the products. The data obtained from this study on some important product characteristics of a functional beef-based formulation provide an encouraging basis for future research and should be useful in designing mass production of 3D printed products of similar composition.
Keywords: beef, dysphagia, product characteristics, texture-modified foods, 3D food printing
Procedia PDF Downloads 111
24147 HBTOnto: An Ontology Model for Analyzing Human Behavior Trajectories
Authors: Heba M. Wagih, Hoda M. O. Mokhtar
Abstract:
Social networks have recently played a significant role in both scientific and social communities, and the growing adoption of social network applications has made them a relevant source of information. Due to this popularity, several research trends have emerged to serve the huge volume of users, including Location-Based Social Networks (LBSN), recommendation systems, sentiment analysis applications, and many others. LBSN applications are among the most demanded; they focus not only on analyzing the spatiotemporal positions in a given raw trajectory but also on understanding the semantics behind the dynamics of the moving object. LBSNs are a possible means of predicting human mobility based on users' social ties as well as their spatial preferences, and they rely on the efficient representation of users' trajectories; hence, traditional raw trajectory information is no longer convenient. In our research, we focus on studying human behavior trajectories, the major pillar of location recommendation systems. In this paper, we propose ontology design patterns with their underlying description logics to efficiently annotate human behavior trajectories.
Keywords: human behavior trajectory, location-based social network, ontology, social network
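The kind of semantic annotation an ontology enables can be illustrated with a minimal subject-predicate-object triple store. The predicates and class names below (`annotatedAs`, `PointOfInterest`, etc.) are invented for illustration and are not the HBTOnto vocabulary.

```python
# minimal triple-store sketch: annotating a raw trajectory point with semantics
triples = set()

def add(s, p, o):
    """Record one subject-predicate-object statement."""
    triples.add((s, p, o))

def query(p, o):
    """Return all subjects holding the given predicate-object pair."""
    return {s for s, pp, oo in triples if pp == p and oo == o}

# a raw spatiotemporal point enriched with an illustrative semantic label
add("point1", "hasLocation", "(30.05, 31.23)")
add("point1", "hasTime", "2015-06-01T09:00")
add("point1", "annotatedAs", "Restaurant")
add("Restaurant", "subClassOf", "PointOfInterest")

print(query("annotatedAs", "Restaurant"))  # → {'point1'}
```

A real implementation would use an RDF/OWL toolchain with reasoning over the description logics, but the annotation step reduces to statements of exactly this shape.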
Procedia PDF Downloads 453
24146 Synergy Effect of Energy and Water Saving in China's Energy Sectors: A Multi-Objective Optimization Analysis
Authors: Yi Jin, Xu Tang, Cuiyang Feng
Abstract:
The ‘11th five-year’ and ‘12th five-year’ plans clearly call for strict control of both the total amount and the intensity of energy and water consumption. The synergy effect of energy and water has rarely been considered in the process of energy and water saving in China, so its contribution cannot be maximized. Energy sectors consume large amounts of energy and water when producing massive amounts of energy, which makes them both energy and water intensive; the synergy effect in these sectors is therefore significant. This paper assesses and optimizes the synergy effect in three energy sectors against the background of promoting energy and water saving. Results show that, from the perspective of the critical path, the chemical industry, mining and processing of non-metal ores, and smelting and pressing of metals are coupling points in the process of energy and water flowing to energy sectors, in which the implementation of energy and water saving policies can bring significant synergy effects. Multi-objective optimization shows that increasing efforts on input restructuring can effectively improve synergy effects; relatively large synergetic energy savings and little water saving are obtained after solely reducing the energy and water intensity of coupling sectors. By optimizing the input structure of sectors, especially the coupling sectors, the synergy effect of energy and water saving can be improved in energy sectors under the premise of keeping the economy running stably.
Keywords: critical path, energy sector, multi-objective optimization, synergy effect, water
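One common way to combine two objectives in a multi-objective setting is weighted-sum scalarization. The sketch below uses invented option names and savings figures; it illustrates the scalarization idea only and is not the paper's actual model.

```python
def weighted_sum(options, w_energy, w_water):
    """Pick the option maximising a weighted sum of the two savings
    objectives (implemented as min over the negated score)."""
    return min(options, key=lambda o: -(w_energy * o[1] + w_water * o[2]))

# hypothetical policy options: (name, energy saved, water saved) in arbitrary units
options = [
    ("baseline", 0.0, 0.0),
    ("reduce coupling-sector intensity", 5.0, 0.5),
    ("restructure inputs", 4.0, 3.0),
]

# when both objectives count, input restructuring wins overall
best = weighted_sum(options, w_energy=1.0, w_water=1.0)
print(best[0])  # → restructure inputs
```

Sweeping the weights traces out trade-off points, mirroring the finding that intensity reduction alone yields large energy savings but little water saving, while input restructuring improves both.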
Procedia PDF Downloads 361
24145 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest
Authors: Bharatendra Rai
Abstract:
Predictive data analysis and modeling involving machine learning techniques becomes challenging in the presence of too many explanatory variables or features. Too many features are known not only to slow algorithms down but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios; their impact on model accuracy is captured using the coefficient of determination (r-square) and root mean square error (RMSE).
Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error
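The two accuracy measures used in the study are straightforward to compute. A minimal sketch with invented prediction values (not the study's results):

```python
import math

def r_square(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

def rmse(y_true, y_pred):
    """Root mean square error of the predictions."""
    return math.sqrt(
        sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    )

# hypothetical sale prices (in $1000s) vs. model predictions
y_true = [200, 250, 300, 350]
y_pred = [210, 240, 310, 340]
print(round(rmse(y_true, y_pred), 2))      # → 10.0
print(round(r_square(y_true, y_pred), 3))  # → 0.968
```

Comparing these two numbers across the five partitioning ratios is exactly how the study quantifies the impact of the train/test split.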
Procedia PDF Downloads 325
24144 Technologies for Phosphorus Removal from Wastewater: Review
Authors: Thandie Veronicah Sima, Moatlhodi Wiseman Letshwenyo
Abstract:
Discharge of wastewater is one of the major sources of phosphorus entering streams, lakes and other water bodies, causing undesired environmental problems such as eutrophication. This condition not only puts the ecosystem at risk but also causes severe economic damage. Stringent laws have been developed globally by different bodies to control the level of phosphorus concentrations entering receiving environments. In order to satisfy these constraints, a high degree of tertiary treatment, or at least a significant reduction of phosphorus concentration, is obligatory. This comprehensive review summarizes phosphorus removal technologies, from the most commonly used conventional technologies, such as chemical precipitation through metal addition, membrane filtration, reverse osmosis and enhanced biological phosphorus removal using the activated sludge system, to passive systems such as constructed wetlands and filtration systems. Trends, perspectives and scientific procedures reported by different researchers are presented. This review critically evaluates the advantages and limitations of each technology. Enhancement of passive systems using reactive media, such as industrial wastes, to provide additional uptake through adsorption or precipitation is also discussed.
Keywords: adsorption, chemical precipitation, enhanced biological phosphorus removal, phosphorus removal
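Adsorption onto reactive media is often characterized with the standard Langmuir isotherm, q = q_max·K·C / (1 + K·C). The formula is textbook material; the media parameters below are invented for illustration and do not describe any specific medium from the review.

```python
def langmuir(c, q_max, k):
    """Langmuir isotherm: equilibrium P uptake q (mg P/g) at
    solute concentration c (mg/L), given capacity q_max and
    affinity constant k (L/mg)."""
    return q_max * k * c / (1 + k * c)

# hypothetical reactive-media parameters: q_max = 5.0 mg P/g, k = 0.8 L/mg
for c in (0.5, 2.0, 10.0):
    print(round(langmuir(c, q_max=5.0, k=0.8), 2))
# uptake rises with concentration but saturates towards q_max
```

Fitting q_max and k to batch-test data is how candidate media (e.g. industrial wastes) are typically screened for phosphorus uptake capacity.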
Procedia PDF Downloads 32824143 Image-Based (RBG) Technique for Estimating Phosphorus Levels of Different Crops
Authors: M. M. Ali, Ahmed Al- Ani, Derek Eamus, Daniel K. Y. Tan
Abstract:
In this glasshouse study, we developed a new image-based, non-destructive technique for detecting leaf P status of different crops, namely cotton, tomato and lettuce. Plants were grown on nutrient media containing different P concentrations, i.e. 0%, 50% and 100% of the recommended P concentration (P0 = no P; P1 = 2.5 mL 10 L-1 of P and P2 = 5 mL 10 L-1 of P as NaH2PO4). After 10 weeks of growth, plants were harvested; data on leaf P contents were collected using the standard destructive laboratory method, and at the same time leaf images were collected with a handheld crop image sensor. We calculated leaf area, leaf perimeter and RGB (red, green and blue) values from these images. These data were then used in linear discriminant analysis (LDA) to estimate leaf P contents, which successfully classified the plants on the basis of leaf P content. The data indicate that P deficiency in crop plants can be predicted using image and morphological data. Our proposed non-destructive imaging method is precise in estimating the P requirements of different crop species.
Keywords: image-based techniques, leaf area, leaf P contents, linear discriminant analysis
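As a simplified stand-in for the LDA step (nearest-centroid rather than a true discriminant), leaf RGB readings can be assigned to the closest treatment centroid. All colour values below are invented for illustration; the direction of the trend (paler leaves under P deficiency) is the assumption being sketched.

```python
import math

def centroid(samples):
    """Mean (R, G, B) of a list of colour readings."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(3))

def classify(rgb, centroids):
    """Assign an (R, G, B) leaf reading to the closest class centroid."""
    return min(centroids, key=lambda c: math.dist(rgb, centroids[c]))

# hypothetical mean leaf colours per P treatment (darker green = more P)
training = {
    "P0": [(120, 150, 60), (125, 148, 62)],  # pale, P-deficient
    "P2": [(60, 110, 40), (58, 112, 42)],    # dark green, full P
}
centroids = {label: centroid(samples) for label, samples in training.items()}
print(classify((118, 149, 61), centroids))  # → P0
```

A full LDA additionally weights directions by within-class covariance; the centroid rule above only captures the core idea that colour separates the treatments.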
Procedia PDF Downloads 38324142 Extraction of Saponins and Cyclopeptides from Cow Cockle (Vaccaria hispanica (Mill.) Rauschert) Seeds Grown in Turkey
Authors: Ihsan Burak Cam, Ferhan Balci-Torun, Ayhan Topuz, Esin Ari, Ismail Gokhan Deniz, Ilker Genc
Abstract:
The seeds of Vaccaria hispanica have been used in the food and pharmaceutical industries. They are an important product due to their superior starch granules, triterpenic saponins, and cyclopeptides suitable for drug delivery. V. hispanica grows naturally in different climatic regions and has genotypes that differ in seed content and composition. Sixty-six V. hispanica seed specimens were collected so as to represent the distribution in all regions of Turkey and to determine possible genotypic differences between regions. The seeds collected from each of the 66 locations were grown under greenhouse conditions at Akdeniz University, Antalya. Saponin and cyclopeptide contents of the V. hispanica seeds were determined after harvest. Accelerated solvent extraction (ASE) was applied for the extraction of saponins and cyclopeptides. Cyclopeptide (segetalin A) and saponin contents of V. hispanica seeds were found in the ranges of 0.165-0.654 g/100 g and 0.15-1.14 g/100 g, respectively. The results were found to be promising for the seeds from Turkey in terms of saponin content and quality. Acknowledgment: This study was supported by the Scientific and Research Council of Turkey (TUBITAK) (project no 112 O 136).
Keywords: Vaccaria hispanica, saponin, cyclopeptide, cow cockle seeds
Procedia PDF Downloads 296
24141 Mining the Proteome of Fusobacterium nucleatum for Potential Therapeutics Discovery
Authors: Abdul Musaweer Habib, Habibul Hasan Mazumder, Saiful Islam, Sohel Sikder, Omar Faruk Sikder
Abstract:
The plethora of genome sequence information of bacteria in recent times has ushered in many novel strategies for antibacterial drug discovery and facilitated medical science in taking up the challenge of the increasing resistance of pathogenic bacteria to current antibiotics. In this study, we adopted a subtractive genomics approach to analyze the whole genome sequence of Fusobacterium nucleatum, a human oral pathogen associated with colorectal cancer. Our study revealed 1499 proteins of Fusobacterium nucleatum that have no homolog in the human genome. These proteins were screened further using the Database of Essential Genes (DEG), resulting in the identification of 32 vitally important proteins for the bacterium. Subsequent analysis of the identified pivotal proteins using the KEGG Automatic Annotation Server (KAAS) sorted out 3 key enzymes of F. nucleatum that may be good candidates as potential drug targets, since they are unique to the bacterium and absent in humans. In addition, we modeled the 3-D structures of these three proteins. Finally, ligand binding sites of the key proteins were determined and screened for functional inhibitors that best fitted the ligand sites, in order to discover effective novel therapeutic compounds against Fusobacterium nucleatum.
Keywords: colorectal cancer, drug target, Fusobacterium nucleatum, homology modeling, ligands
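The subtractive screening logic above reduces to set operations: discard proteins with a human homolog, then keep only those that are essential. The protein names below are illustrative placeholders, not the study's actual hit list.

```python
def subtractive_screen(pathogen_proteins, human_homologs, essential):
    """Keep pathogen proteins that have no human homolog AND are essential."""
    non_homologous = {p for p in pathogen_proteins if p not in human_homologs}
    return non_homologous & essential

# invented toy identifiers standing in for BLAST/DEG results
pathogen = {"fadA", "gyrA", "murB", "tuf"}
human_like = {"tuf"}                       # shares a human homolog -> excluded
essential = {"gyrA", "murB", "tuf"}        # DEG-style essentiality hits
targets = subtractive_screen(pathogen, human_like, essential)
print(sorted(targets))  # → ['gyrA', 'murB']
```

In the real pipeline, the two filter sets come from BLAST searches against the human proteome and from DEG lookups; the combination step itself is exactly this intersection.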
Procedia PDF Downloads 390
24140 Numerical Simulation of Fracturing Behaviour of Pre-Cracked Crystalline Rock Using a Cohesive Grain-Based Distinct Element Model
Authors: Mahdi Saadat, Abbas Taheri
Abstract:
Understanding the cracking response of crystalline rocks at the mineralogical scale is of great importance during the design of mining structures. A grain-based distinct element model (GBM) is employed to numerically study the cracking response of Barre granite at micro- and macro-scales. The GBM framework is augmented with a proposed distinct element-based cohesive model to reproduce the micro-cracking response of the inter- and intra-grain contacts. The cohesive GBM framework is implemented in the PFC2D distinct element code. The microstructural properties of Barre granite are imported into PFC2D to generate synthetic specimens, and the microproperties of the model are calibrated against laboratory uniaxial compressive and Brazilian split tensile tests. The calibrated model is then used to simulate the fracturing behaviour of pre-cracked Barre granite with different flaw configurations. The numerical results of the proposed model demonstrate good agreement with the experimental counterparts. The proposed GBM framework thus appears promising for further investigation of the influence of grain microstructure and mineralogical properties on the cracking behaviour of crystalline rocks.
Keywords: discrete element modelling, cohesive grain-based model, crystalline rock, fracturing behaviour
Procedia PDF Downloads 132
24139 Design of Visual Repository, Constraint and Process Modeling Tool Based on Eclipse Plug-Ins
Authors: Rushiraj Heshi, Smriti Bhandari
Abstract:
Master Data Management requires the creation of a central repository, the application of constraints on that repository, and the design of processes to manage data. Designing the repository, its constraints and the business processes is a tedious and time-consuming task for a large enterprise; hence visual repository, constraint and process (workflow) modeling is the most critical step in Master Data Management. In this paper, we realize a visual modeling tool for implementing repositories, constraints and processes as an Eclipse plug-in using GMF/EMF, following the principles of Model Driven Engineering (MDE).
Keywords: EMF, GMF, GEF, repository, constraint, process
Procedia PDF Downloads 499
24138 The Classification Performance in Parametric and Nonparametric Discriminant Analysis for a Class-Unbalanced Data of Diabetes Risk Groups
Authors: Lily Ingsrisawang, Tasanee Nacharoen
Abstract:
Introduction: The problem of unbalanced data sets generally appears in real-world applications. Due to unequal class distribution, many research papers have found that the performance of existing classifiers tends to be biased towards the majority class. The k-nearest neighbors nonparametric discriminant analysis is one method that has been proposed for classifying unbalanced classes with good performance. Hence, the methods of discriminant analysis are of interest to us in investigating misclassification error rates for class-imbalanced data of three diabetes risk groups. Objective: The purpose of this study was to compare the classification performance of parametric and nonparametric discriminant analysis in a three-class classification application on class-imbalanced data of diabetes risk groups. Methods: Data from a health project for 599 staff in a government hospital in Bangkok were obtained for the classification problem. The staff were diagnosed into one of three diabetes risk groups: non-risk (90%), risk (5%), and diabetic (5%). The original data, with the variables diabetes risk group, age, gender, cholesterol, and BMI, were analyzed and bootstrapped up to 50 and 100 samples, 599 observations per sample, for additional estimation of the misclassification error rate. Each data set was explored for departure from multivariate normality and for equality of the covariance matrices of the three risk groups; both the original data and the bootstrap samples show non-normality and unequal covariance matrices. The parametric linear discriminant function, the quadratic discriminant function, and the nonparametric k-nearest neighbors discriminant function were performed over the 50 and 100 bootstrap samples and applied to the original data.
In finding the optimal classification rule, the prior probabilities were set up both for equal proportions (0.33:0.33:0.33) and for unequal proportions with three choices: (0.90:0.05:0.05), (0.80:0.10:0.10) or (0.70:0.15:0.15). Results: The results from the 50 and 100 bootstrap samples indicated that the k-nearest neighbors approach with k = 3 or k = 4 and prior probabilities of {non-risk:risk:diabetic} as {0.90:0.05:0.05} or {0.80:0.10:0.10} gave the smallest misclassification error rate. Conclusion: The k-nearest neighbors approach is suggested for classifying three-class-imbalanced data of diabetes risk groups.
Keywords: error rate, bootstrap, diabetes risk groups, k-nearest neighbors
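The core of the k-nearest neighbors rule is a majority vote among the closest training points. A minimal sketch with an invented, class-imbalanced toy set (the study additionally weighted prior probabilities, which is omitted here):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote of its k nearest neighbours.

    `train` is a list of ((feature, ...), label) pairs.
    """
    neighbours = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# invented class-imbalanced toy set: (age, BMI) -> risk group
train = [
    ((30, 22), "non-risk"), ((35, 24), "non-risk"), ((40, 23), "non-risk"),
    ((50, 29), "non-risk"), ((45, 25), "non-risk"),
    ((55, 31), "risk"), ((57, 32), "risk"),
    ((60, 33), "diabetic"),
]
print(knn_predict(train, (54, 30), k=3))  # → risk
```

Estimating the misclassification rate then amounts to repeating this prediction over held-out bootstrap samples and counting errors per class.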
Procedia PDF Downloads 437
24137 Indigenous Companies in Nigeria's Oil Sector: Stages, Opportunities, and Obstacles regarding Corporate Social Responsibility
Authors: L. U. Dumuje, R. Leite
Abstract:
There is an ongoing debate about corporate social responsibility (CSR) initiatives in the Niger Delta, Nigeria, originating from the gap between the stated objectives of organizations in the Nigerian oil sector and their actual activities that threaten society. CSR in developing countries is becoming popular, but research on CSR practices and discourse among indigenous Nigerian companies is scarce, and this study aims to contribute to that scientific knowledge. Despite government mandates, unofficial flaring releases methane gas into the air around refinery areas, which contributes to global warming; there is a need to understand whether this practice also applies to indigenous oil companies in Nigeria. To get a better understanding of CSR among indigenous oil companies in Nigeria, our study focuses on discourse and rhetoric regarding CSR. The contribution of this paper is twofold: on the one hand, it aims to better understand practitioners' rationales and the fundamentals of CSR in Nigerian oil companies; on the other hand, it intends to identify the stages of CSR initiatives and the advantages and difficulties of CSR implementation in the indigenous Nigerian oil sector. This paper uses qualitative research as its methodological strategy, with the semi-structured interview as the instrument for data collection. Besides 28 interviews, we conducted five focus group discussions with stakeholders. Participants in this study consist of employees, managers and executives of indigenous oil companies in Nigeria; key informants such as government institutions, environmental organizations and community leaders/members are also part of our sample. Despite significant findings in some studies, there are still gaps. To help fill these existing gaps, we have formulated the following research question: ‘What are the stages, opportunities and obstacles of having corporate social responsibility practice in indigenous oil companies in Nigeria?’
This ongoing research sub-questions as follows: What are the CSR discourses and practices among indigenous companies in the Nigerian oil sector; what is the actual status regarding CSR development; what are the main perceptions of opportunities and obstacles with regard to CSR in indigenous Nigerian oil companies; who are the main stakeholders of indigenous Nigerian oil companies and their different meanings and understandings of CSR practices. Regarding the above questions, the following objectives have been determined: first, we conduct a literature review with the aim of understanding and identifying importance of CSR practises in western and developing countries. Second, this current paper identify specific characteristics of the national context in terms of CSR engagement in Nigeria, so we perform empirical research with relevant stakeholder in indigenous Nigerian, as well as key informants, in order to identify development of CSR and different perception of this praised initiative, CSR.Keywords: corporate social responsibility, indigenous, oil organizations, Nigeria, practice
Procedia PDF Downloads 139
24136 BFDD-S: Big Data Framework to Detect and Mitigate DDoS Attack in SDN Network
Authors: Amirreza Fazely Hamedani, Muzzamil Aziz, Philipp Wieder, Ramin Yahyapour
Abstract:
In recent years, software-defined networking has come into the sight of many network designers as a successor to traditional networking. Unlike traditional networks, where the control and data planes reside together within a single device in the network infrastructure such as switches and routers, the two planes are kept separate in software-defined networks (SDNs). All critical decisions about packet routing are made on the network controller, and the data-plane devices forward packets based on these decisions. This type of network is vulnerable to DDoS attacks, which degrade the overall functioning and performance of the network by continuously injecting fake flows into it. This places a substantial burden on the controller side and ultimately leads to the inaccessibility of the controller and the denial of network service to legitimate users. Thus, protecting this novel network architecture against denial-of-service attacks is essential. In the world of cybersecurity, attacks and new threats emerge every day, so it is essential to have tools capable of managing and analyzing all this new information to detect possible attacks in real time. These tools should provide a comprehensive solution to automatically detect, predict and prevent abnormalities in the network. Big data encompasses a wide range of studies, but it mainly refers to the massive amounts of structured and unstructured data that organizations deal with on a regular basis. It concerns not only the volume of data, but also how data-driven information can be used to enhance decision-making processes, security, and the overall efficiency of a business. This paper presents an intelligent big data framework as a solution to handle the illegitimate traffic burden placed on the SDN network by numerous DDoS attacks.
The framework entails an efficient defence and monitoring mechanism against DDoS attacks by employing state-of-the-art machine learning techniques.
Keywords: apache spark, apache kafka, big data, DDoS attack, machine learning, SDN network
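The abstract does not spell out its detection pipeline. As an illustration only, one lightweight statistic often used in SDN DDoS detection is the entropy of destination addresses in a traffic window, which collapses when many fake flows converge on a single victim. The function names, addresses and the threshold below are hypothetical; this is a minimal sketch, not the authors' Spark/Kafka implementation:

```python
import math
from collections import Counter

def shannon_entropy(items):
    """Shannon entropy (bits) of the empirical distribution of items."""
    counts = Counter(items)
    total = len(items)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def looks_like_ddos(window_dst_ips, threshold=1.0):
    """Flag a traffic window whose destination-IP entropy falls below
    the (hypothetical) threshold: many flows converging on few targets."""
    return shannon_entropy(window_dst_ips) < threshold

# Normal window: flows spread across many destinations.
normal = [f"10.0.0.{i}" for i in range(16)]
# Attack window: almost all flows aimed at a single victim.
attack = ["10.0.0.1"] * 15 + ["10.0.0.2"]

print(looks_like_ddos(normal))  # False
print(looks_like_ddos(attack))  # True
```

In a real deployment, such a statistic would be one feature among many fed to the machine learning classifier rather than a standalone rule.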
Procedia PDF Downloads 171
24135 Welding Process Selection for Storage Tank by Integrated Data Envelopment Analysis and Fuzzy Credibility Constrained Programming Approach
Authors: Rahmad Wisnu Wardana, Eakachai Warinsiriruk, Sutep Joy-A-Ka
Abstract:
Selecting the most suitable welding process usually depends on experience or on common application in similar companies. However, this approach generally ignores many criteria that can affect the suitability of a welding process. Therefore, knowledge automation through knowledge-based systems can significantly improve the decision-making process. This research proposes an integrated data envelopment analysis (DEA) and fuzzy credibility constrained programming approach for identifying the best welding process for a stainless steel storage tank in the food and beverage industry. The proposed approach uses the fuzzy concept and a credibility measure to deal with uncertain data from experts' judgment. Furthermore, 12 parameters are used to determine the most appropriate welding process among six competing welding processes.
Keywords: welding process selection, data envelopment analysis, fuzzy credibility constrained programming, storage tank
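The full model couples multi-criteria DEA with fuzzy credibility constraints. As a much-reduced illustration of the DEA idea, the CCR (constant returns to scale) efficiency score for the special case of one input and one output per candidate process is simply each process's output/input ratio normalised by the best ratio in the sample. All names and figures below are hypothetical:

```python
def ccr_efficiency_single(inputs, outputs):
    """CCR DEA efficiency scores for the one-input/one-output special
    case: each decision-making unit's output/input ratio divided by
    the best ratio, so efficient units score exactly 1.0."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical welding processes scored on cost (input, lower is
# better) and an aggregated weld-quality index (output).
cost    = [4.0, 2.0, 5.0]
quality = [8.0, 6.0, 5.0]

print(ccr_efficiency_single(cost, quality))  # process 2 is efficient
```

The paper's actual model handles 12 parameters and fuzzy expert judgments, which requires solving a linear program per process rather than this closed-form ratio.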
Procedia PDF Downloads 169
24134 Impact of Pandemics on Cities and Societies
Authors: Deepak Jugran
Abstract:
Purpose: The purpose of this study is to identify how past pandemics shaped social evolution and cities. Methodology: A historical and comparative analysis of major pandemics in human history: their origin, transmission route, biological response and after-effects. A comprehensive pre- and post-pandemic scenario is drawn from available secondary data, focusing selectively on the major issues and pandemics that have had the deepest and most lasting impact on society. Results: Past pandemics shaped the behavior of human societies and their cities and made them more resilient biologically, intellectually and socially, endorsing the theory of 'survival of the fittest' associated with Charles Darwin. Conclusion: Pandemics have always resulted in great mortality, but they also improved overall individual human immunology and the collective social response; at the same time, they improved the public health systems of cities, health delivery systems, water and sewage distribution systems, institutionalized various welfare reforms, and strengthened the overall collective social response of societies, endorsing the 'AGIL' theory of Prof. Talcott Parsons. Pandemics and infectious diseases are here to stay, and as a human society, we need to strengthen our collective response and preparedness, besides evolving mechanisms for strict controls on the inter-continental movement of people, and especially of animals, which act as carriers for these novel viruses. Pandemics over the years have acted like natural storms, mitigated prevailing social imbalances and laid the foundation for scientific discoveries.
We understand that post-Covid-19, institutionalized city, state and national mechanisms will be strengthened, and the recommendations issued by various expert groups, which were ignored earlier, will now be implemented for reliable anticipation and better preparedness, helping to minimize the impact of pandemics. Our analysis does not intend to present chronological findings of pandemics but rather focuses selectively on major pandemics in history, their causes, how they wiped out entire city populations, and how they influenced societies and their behavior and facilitated social evolution.
Keywords: pandemics, Covid-19, social evolution, cities
Procedia PDF Downloads 115
24133 On the Estimation of Crime Rate in the Southwest of Nigeria: Principal Component Analysis Approach
Authors: Kayode Balogun, Femi Ayoola
Abstract:
Crime is at an alarming rate in this part of the world, and many factors contribute to this antisocial behaviour among both the young and the old. In this work, principal component analysis (PCA) was used as a tool to reduce the dimensionality of the data, while retaining as much of the information as possible, and to identify the variables that were crime-prone in the study region. Data were collected on twenty-eight crime variables from the National Bureau of Statistics (NBS) databank for a period of fifteen years. We use PCA in this study to determine the number of major variables and contributors to crime in Southwest Nigeria. The results of our analysis revealed that eight principal components were retained using the scree plot and loading plot, which implies that an eight-component solution is appropriate for the data. The eight components explained 93.81% of the total variation in the data set. We also found that the most frequently committed crimes in Southwestern Nigeria were assault, grievous harm and wounding, theft/stealing, burglary, house breaking, false pretence, unlawful arms possession and breach of public peace.
Keywords: crime rates, data, Southwest Nigeria, principal component analysis, variables
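The scree-plot decision described in the abstract (keeping the components that together explain most of the variance) can be sketched as follows; the eigenvalues and the 90% target below are hypothetical, not the study's crime data:

```python
def components_to_retain(eigenvalues, target=0.90):
    """Given eigenvalues of a correlation/covariance matrix sorted in
    descending order, return how many principal components are needed
    to explain at least `target` of the total variance, together with
    the cumulative explained-variance ratios."""
    total = sum(eigenvalues)
    cumulative, running = [], 0.0
    for ev in eigenvalues:
        running += ev
        cumulative.append(running / total)
    k = next(i + 1 for i, c in enumerate(cumulative) if c >= target)
    return k, cumulative

# Hypothetical eigenvalues for a handful of crime variables.
eigvals = [4.0, 2.5, 1.5, 1.0, 1.0]
k, cum = components_to_retain(eigvals, target=0.90)
print(k)        # → 4 components retained
print(cum[-1])  # → 1.0 by construction
```

In the study itself, eight of the twenty-eight components crossed the equivalent cut-off, explaining 93.81% of the variation.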
Procedia PDF Downloads 446
24132 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring
Authors: Hyun-Woo Cho
Abstract:
Detection of incipient abnormal events in production processes is important for improving the safety and reliability of manufacturing operations and reducing losses caused by failures. The construction of calibration models for predicting faulty conditions is essential in deciding when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data. The calibration model is used to predict faulty conditions from historical reference data. The approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that the calibration model based on a supervised probabilistic model yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, prediction performance can be improved by excluding non-informative variables from the model building steps.
Keywords: calibration model, monitoring, quality improvement, feature selection
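The abstract does not name its variable selection scheme. A minimal filter-style sketch, assuming a simple correlation cutoff against the fault indicator, might look like this (all variable names, data and the cutoff are hypothetical):

```python
import math

def pearson(xs, ys):
    """Sample Pearson correlation coefficient between two series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def select_variables(measurements, labels, min_abs_corr=0.3):
    """Keep only process variables whose |correlation| with the fault
    indicator exceeds the cutoff: a simple filter-style selection that
    drops non-informative variables before calibration."""
    return [name for name, series in measurements.items()
            if abs(pearson(series, labels)) >= min_abs_corr]

# Hypothetical process data: 'temp' tracks the fault label, 'noise'
# does not, so only 'temp' survives selection.
labels = [0, 0, 1, 1, 0, 1]
data = {"temp":  [1.0, 1.1, 2.0, 2.1, 0.9, 1.9],
        "noise": [0.5, 0.4, 0.5, 0.6, 0.5, 0.4]}
print(select_variables(data, labels))  # ['temp']
```

A supervised probabilistic calibration model, as favoured in the paper, would then be fitted on the retained variables only.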
Procedia PDF Downloads 359
24131 Multilevel Gray Scale Image Encryption through 2D Cellular Automata
Authors: Rupali Bhardwaj
Abstract:
Cryptography is the science of using mathematics to encrypt and decrypt data; the data are converted into some other, unintelligible form, and then the encrypted data are transmitted. The primary purpose of this paper is to provide two levels of security through a two-step process: rather than transmitting the message bits directly, the message is first encrypted using 2D cellular automata and then scrambled with the Arnold cat map transformation. This provides an additional layer of protection and reduces the chance of the transmitted message being detected. A comparative analysis of the effectiveness of the scrambling technique is provided via the scrambling-degree measurement parameters, i.e., Gray Difference Degree (GDD) and correlation coefficient.
Keywords: scrambling, cellular automata, Arnold cat map, game of life, gray difference degree, correlation coefficient
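The Arnold cat map stage can be sketched directly: the map (x, y) → ((x + y) mod N, (x + 2y) mod N) has determinant 1, so it is a bijection on an N×N image that permutes pixels without loss and eventually cycles back to the original. A minimal sketch (not the paper's implementation; the toy 4×4 "image" is hypothetical):

```python
def arnold_cat(image):
    """One iteration of the Arnold cat map on an N x N image stored as
    a list of rows: pixel (x, y) moves to ((x + y) % N, (x + 2y) % N).
    The map is a bijection, so no pixel values are lost."""
    n = len(image)
    out = [[None] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            out[(x + 2 * y) % n][(x + y) % n] = image[y][x]
    return out

def iterate(image, steps):
    for _ in range(steps):
        image = arnold_cat(image)
    return image

# A tiny 4x4 "image" of grey levels 0..15.
img = [[ 0,  1,  2,  3],
       [ 4,  5,  6,  7],
       [ 8,  9, 10, 11],
       [12, 13, 14, 15]]

scrambled = iterate(img, 1)
print(scrambled != img)  # True: pixels have been permuted
```

For a 4×4 grid the map has period 3, so three iterations restore the original image; real grey-scale images use a larger N and correspondingly longer periods.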
Procedia PDF Downloads 380
24130 Tutankhamen’s Shrines (Naoses): Scientific Identification of Wood Species and Technology
Authors: Medhat Abdallah, Ahmed Abdrabou
Abstract:
Tutankhamen's tomb was discovered in November 1922 by Howard Carter; the grave was relatively intact and crammed full of the most beautiful burial items and furniture. The black shrine-shaped boxes on sleds studied here were found in the treasury chamber. This study aims to identify the wood species used in making those shrines and to illustrate the technology of their manufacture. Optical microscopy (OM), 3D software and imaging processes, including visible light, raking light and visible-induced infrared luminescence, were effective in illustrating the wooden joints and techniques of manufacture. The results revealed that cedar of Lebanon (Cedrus libani) and sycamore fig (Ficus sycomorus) had been used for making the shrines' boards and sleds, while tamarisk (Tamarix sp.), Turkey oak (Quercus cerris L.) and sidder (nabk, Ziziphus spina-christi) were used for making dowels. Mortise-and-tenon joints were used to connect the body of each shrine to its sled, while wooden pegs connected the roof and cornice to the shrine body.
Keywords: Tutankhamen, wood species, optical microscope, Cedrus libani, Ficus sycomorus
Procedia PDF Downloads 209
24129 A Comparative Study on Supercritical CO2 and Water as Working Fluids in a Heterogeneous Geothermal Reservoir
Authors: Musa D. Aliyu, Ouahid Harireche, Colin D. Hills
Abstract:
The incapability of supercritical CO2 to transport and dissolve mineral species from the geothermal reservoir to the fracture apertures, together with other important parameters in heat mining, makes it an attractive substance for heat extraction from hot dry rock. In other words, the thermodynamic efficiency of hot dry rock (HDR) reservoirs also increases if supercritical CO2 is circulated at temperatures in excess of 374 °C, without the drawbacks connected with silica dissolution. Studies have shown that circulation of supercritical CO2 in homogeneous geothermal reservoirs is quite encouraging in comparison to that of water. This paper aims at investigating the aforementioned processes in the case of the heterogeneous geothermal reservoir located at the Soultz site (France). The multiphysics finite element package COMSOL, with an interface for coupling the different processes encountered in geothermal reservoir stimulation, is used. A fully coupled numerical model is developed to study the thermal and hydraulic processes in order to predict the long-term operation of the basic reservoir parameters that give optimum energy production. The results reveal that the temperature of the SCCO2 at the production outlet is higher than that of water in long-term stimulation, temperature being an essential ingredient in rating the energy production. It is also observed that the mass flow rate of the SCCO2 is far more favourable compared to that of water.
Keywords: FEM, HDR, heterogeneous reservoir, stimulation, supercritical CO2
Procedia PDF Downloads 387
24128 Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks
Authors: Naveed Ghani, Samreen Javed
Abstract:
In today’s heterogeneous network environment, there is a growing demand for distrusting clients to jointly operate a secure network to prevent malicious attacks, as the defining task of propagating malicious code is to locate new targets to attack. Residual risk always remains, no matter what solutions are implemented or whatever security methodology or standards are adopted. Security is a first and crucial concern in the field of computer science; the main aim here is the gathering of information over a secure network. No one need wonder what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. With the help of our survey, we learn about the importance of whitelisting, antimalware programs, security patches, log files, honeypots, and more, as used in banks for financial data protection. But there is also a need to implement IPv6 tunneling with cryptographic data transformation, in line with the requirements of new technology, to protect organizations from new malware attacks that craft their own messages and send them to the target. In this paper, the writer puts forward the idea of implementing IPv6 tunneling sessions for private data transmission from financial organizations whose secrecy needs to be safeguarded.
Keywords: network worms, malware infection propagating malicious code, virus, security, VPN
Procedia PDF Downloads 358
24127 Interactive IoT-Blockchain System for Big Data Processing
Authors: Abdallah Al-ZoubI, Mamoun Dmour
Abstract:
The spectrum of IoT devices is becoming widely diversified, entering almost all possible fields and finding applications in industry, health, finance, logistics and education, to name a few. Active IoT endpoint sensors and devices exceeded the 12 billion mark in 2021 and are expected to reach 27 billion in 2025, with over $34 billion in total market value. This sheer rise in the number and use of IoT devices brings with it considerable concerns regarding data storage, analysis, manipulation and protection. IoT blockchain-based systems have recently been proposed as a decentralized solution for large-scale data storage and protection. COVID-19 has actually accelerated the desire to utilize IoT devices, as it impacted both demand and supply and significantly affected several regions due to logistic reasons such as supply chain interruptions, shortage of shipping containers and port congestion. An IoT-blockchain system is proposed to handle, in an interactive manner, the big data generated by a distributed network of sensors and controllers. The system is designed on the Ethereum platform, using smart contracts programmed in Solidity to execute and manage data generated by IoT sensors and devices such as the Raspberry Pi 4 running Raspbian, with add-on hardware security modules. The proposed system runs a number of applications hosted by a local machine used to validate transactions. It then sends data to the rest of the network through the InterPlanetary File System (IPFS) and Ethereum Swarm, forming a closed IoT ecosystem run by blockchain in which a number of distributed IoT devices can communicate and interact, thus forming a closed, controlled environment. A prototype has been deployed with three IoT handling units distributed over a wide geographical space in order to examine its feasibility, performance and costs.
Initial results indicated that big IoT data retrieval and storage are feasible and interactivity is possible, provided that certain conditions of cost, speed and throughput are met.
Keywords: IoT devices, blockchain, Ethereum, big data
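The prototype's Ethereum/IPFS stack is not reproduced here. To illustrate the integrity property that a blockchain brings to IoT data, a minimal hash-chained ledger of sensor readings can be sketched in a few lines (hypothetical sensor names; plain SHA-256 chaining instead of a real consensus-backed chain):

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_reading(chain, reading):
    """Append an IoT sensor reading as a new block whose hash commits
    to the previous block, mimicking blockchain tamper-evidence."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev": prev, "reading": reading}
    block["hash"] = block_hash({"prev": prev, "reading": reading})
    chain.append(block)
    return chain

def chain_is_valid(chain):
    """Recompute every hash; any tampered reading breaks the chain."""
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or b["hash"] != block_hash(
                {"prev": b["prev"], "reading": b["reading"]}):
            return False
        prev = b["hash"]
    return True

chain = []
append_reading(chain, {"sensor": "temp-01", "value": 21.4})
append_reading(chain, {"sensor": "temp-01", "value": 21.9})
print(chain_is_valid(chain))          # True
chain[0]["reading"]["value"] = 99.9   # tamper with history
print(chain_is_valid(chain))          # False
```

On Ethereum, the equivalent commitment lives in a smart contract and the bulk sensor payloads sit off-chain in IPFS, with only their hashes recorded on-chain.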
Procedia PDF Downloads 151
24126 Keynote Talk: The Role of Internet of Things in the Smart Cities Power System
Authors: Abdul-Rahman Al-Ali
Abstract:
As the number of mobile devices grows exponentially, it is estimated that about 50 billion devices will be connected to the Internet by the year 2020; by the end of this decade, an average of eight connected devices per person worldwide is expected. The 50 billion devices are not only mobile phones and data-browsing gadgets, but also machine-to-machine and man-to-machine devices. With such growing numbers of devices, the Internet of Things (IoT) has recently become one of the emerging technologies. Within smart grid technologies, smart home appliances, Intelligent Electronic Devices (IEDs) and Distributed Energy Resources (DERs) are major IoT objects that can be addressed using IPv6. These objects are called the smart grid Internet of Things (SG-IoT). The SG-IoT generates big data that requires high-speed computing infrastructure, widespread computer networks, big data storage, software, and platform services. A utility company's control and data centers cannot handle such a large number of devices, high-speed processing, and massive data storage. Building a large data center infrastructure takes a long time and requires widespread communication networks and huge capital investment. Maintaining and upgrading control and data center infrastructure and communication networks, as well as updating and renewing software licenses, collectively requires additional cost. This can be overcome by utilizing emerging computing paradigms such as cloud computing, which can serve as a smart grid enabler replacing the utilities' legacy data centers. The talk will highlight the role of IoT and cloud computing services and their development models within smart grid technologies.
Keywords: intelligent electronic devices (IED), distributed energy resources (DER), internet, smart home appliances
Procedia PDF Downloads 325