Search results for: incidental information processing
9763 Digital Preservation Policies in the Institutional Repositories of Brazilian Federal Universities
Authors: Laerte Pereira da Silva Júnior, Maria Manuel Borges
Abstract:
Institutional Repositories (IR) are complex constructs that depend on political, cultural and technological aspects. Because IRs are a mirror of the organization's intellectual production, their main function is to make that production available worldwide, while also providing for its long-term preservation. To this end, there is a need to clearly define digital preservation policies supported by political decisions. There are several guidelines on the definition of digital preservation policies focusing on different themes, from preservation planning to rights and restriction management, sustainability planning, etc., but this work aims to verify the implementation of digital preservation policies in the Institutional Repositories of the Federal Universities of Brazil. The methodology used was to check the information available on the websites of the selected IRs against two fields of OpenDOAR, policies and OpenDOAR ID, to verify the existence of digital preservation policies. For this purpose, a sample of 21 of the 25 IRs registered at the Directory of Open Access Repositories (OpenDOAR) was used, which is about one third of the total number of Brazilian universities. The 4 IRs for which no information was presented by the OpenDOAR team were disregarded. The main conclusion is that most of the IRs of these universities have no clearly stated policies or no policies at all, and that there is a need to include these concerns at the top level of IR management. The number of initiatives in digital preservation policies around the world stresses the need for awareness of their importance in Brazil and requires measures to raise this awareness.
Keywords: Brazil, digital preservation policies, institutional repositories, openDOAR
Procedia PDF Downloads 535
9762 A Comparative Assessment of Information Value, Fuzzy Expert System Models for Landslide Susceptibility Mapping of Dharamshala and Surrounding, Himachal Pradesh, India
Authors: Kumari Sweta, Ajanta Goswami, Abhilasha Dixit
Abstract:
Landslide is a geomorphic process that plays an essential role in hill-slope and long-term landscape evolution. However, its abrupt nature and the associated catastrophic forces can have undesirable socio-economic impacts, such as substantial economic losses, fatalities, and ecosystem, geomorphological and infrastructure disturbances. The estimated fatality rate is approximately 1 person per 100 sq. km, and the average economic loss is more than 550 crores per year in the Himalayan belt due to landslides. This study presents a comparative performance of a statistical bivariate method and a machine learning technique for landslide susceptibility mapping in and around Dharamshala, Himachal Pradesh. The final landslide susceptibility maps (LSMs), produced with better accuracy, could be used for land-use planning to prevent future losses. Dharamshala, part of the North-western Himalaya, is one of the fastest-growing tourism hubs, with a total population of 30,764 according to the 2011 census, and is among the hundred Indian cities to be developed as smart cities under the PM's Smart Cities Mission. A total of 209 landslide locations were identified using high-resolution Linear Imaging Self-Scanning (LISS-IV) data. The thematic maps of parameters influencing landslide occurrence were generated using remote sensing and other ancillary data in the GIS environment. The landslide causative parameters used in the study are slope angle, slope aspect, elevation, curvature, topographic wetness index, relative relief, distance from lineaments, land use land cover, and geology. LSMs were prepared using the information value (Info Val) and Fuzzy Expert System (FES) models. Info Val is a statistical bivariate method in which information values were calculated as the ratio of the landslide pixels per factor class (Si/Ni) to the total landslide pixels per parameter (S/N).
Using these information values, all parameters were reclassified and then summed in GIS to obtain the landslide susceptibility index (LSI) map. The FES method is a machine learning technique based on a 'mean and neighbour' strategy for constructing the fuzzifier (input) and defuzzifier (output) membership function (MF) structure, with the frequency ratio (FR) method used for formulating the if-then rules. Two types of membership structures were utilized: Bell-Gaussian (BG) and Trapezoidal-Triangular (TT). The LSI for BG and TT was obtained by applying the membership functions and if-then rules in MATLAB. The final LSMs were spatially and statistically validated. The validation results showed that in terms of accuracy, Info Val (83.4%) is better than BG (83.0%) and TT (82.6%), whereas in terms of spatial distribution, BG is best. Hence, considering both statistical and spatial accuracy, BG is the most accurate one.
Keywords: bivariate statistical techniques, BG and TT membership structure, fuzzy expert system, information value method, machine learning technique
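The information value computation described in this abstract can be sketched in a few lines: for each factor class, the information value is the log of the landslide density in that class relative to the density over the whole study area, and a pixel's susceptibility index is the sum of the values of the classes it falls in. The pixel counts below are hypothetical, not the study's data.

```python
import math

def information_value(landslide_px, class_px, total_landslide_px, total_px):
    """Info Val for one factor class: log of the landslide density in the
    class relative to the density over the whole study area."""
    class_density = landslide_px / class_px
    overall_density = total_landslide_px / total_px
    return math.log(class_density / overall_density)

# Hypothetical pixel counts for three slope-angle classes (illustration only).
classes = [
    {"name": "0-15 deg",  "landslide_px": 10,  "class_px": 4000},
    {"name": "15-30 deg", "landslide_px": 60,  "class_px": 3000},
    {"name": ">30 deg",   "landslide_px": 130, "class_px": 3000},
]
S = sum(c["landslide_px"] for c in classes)   # total landslide pixels
N = sum(c["class_px"] for c in classes)       # total study-area pixels

iv = {c["name"]: information_value(c["landslide_px"], c["class_px"], S, N)
      for c in classes}
# A pixel's LSI is the sum of the IVs of the classes it falls in across all
# factor maps; this single-factor toy shows the per-class step only.
```

Classes with above-average landslide density get positive values, below-average classes negative ones, which is what makes the summed LSI a ranking of susceptibility.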
Procedia PDF Downloads 127
9761 Smooth Second Order Nonsingular Terminal Sliding Mode Control for a 6 DOF Quadrotor UAV
Authors: V. Tabrizi, A. Vali, R. Ghasemi, V. Behnamgol
Abstract:
In this article, a nonlinear model of an underactuated six degrees of freedom (6 DOF) quadrotor UAV is derived on the basis of the Newton-Euler formula. The derivation comprises determining the equations of motion of the quadrotor in three dimensions and approximating the actuation forces through modeling of the aerodynamic coefficients and electric motor dynamics. The robust nonlinear control strategy includes a smooth second order nonsingular terminal sliding mode control, which is applied to stabilize this model. The control method is based on the super-twisting algorithm, which removes chattering and produces a smooth control signal. Also, the nonsingular terminal sliding mode idea is used to introduce a nonlinear sliding variable that guarantees finite time convergence in the sliding phase. Simulation results show that the proposed algorithm is robust against uncertainty and disturbance and guarantees a fast and precise control signal.
Keywords: quadrotor UAV, nonsingular terminal sliding mode, second order sliding mode, electronics, control, signal processing
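The super-twisting idea the abstract relies on can be illustrated on a scalar toy system rather than the full quadrotor: the controller is continuous (no discontinuous term acting directly on the sliding variable), yet it rejects a bounded disturbance. The gains, disturbance, and first-order dynamics below are illustrative choices, not the paper's design.

```python
import math

# Toy sliding-variable dynamics: s_dot = u + d(t), with bounded disturbance d.
# Super-twisting controller (continuous, hence chattering-free):
#   u = -k1 * sqrt(|s|) * sign(s) + v,   v_dot = -k2 * sign(s)
k1, k2 = 1.5, 1.1          # gains chosen above the usual sufficiency bounds
dt, T = 1e-3, 10.0         # Euler step and horizon
s, v = 1.0, 0.0            # initial sliding variable and integral state
for i in range(int(T / dt)):
    t = i * dt
    d = 0.3 * math.sin(2.0 * t)            # disturbance, |d| <= 0.3, |d'| <= 0.6
    sign_s = (s > 0) - (s < 0)
    u = -k1 * math.sqrt(abs(s)) * sign_s + v
    v += -k2 * sign_s * dt                 # integral (twisting) term tracks -d
    s += (u + d) * dt
# After the transient, s is driven to a small neighbourhood of zero even
# though the disturbance never vanishes.
```

The discontinuity sits under the integrator, so the applied control `u` stays continuous, which is the "smooth, chattering-free" property claimed in the abstract.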
Procedia PDF Downloads 441
9760 A Research Using Remote Monitoring Technology for Pump Output Monitoring in Distributed Fuel Stations in Nigeria
Authors: Ofoegbu Ositadinma Edward
Abstract:
This research paper discusses a web-based monitoring system that enables effective monitoring of fuel pump output and sales volume from distributed fuel stations under the domain of a single company/organization. The traditional method of operation by these organizations in Nigeria is non-automated, and accounting for dispensed product is usually approximate and manual, as there is little or no technology implemented to provide information on the state of affairs in the station, both to on-ground staff and to supervisory staff who are not physically present in the station. This results in unaccountable losses in product and revenue as well as slow decision making. Remote monitoring technology, a vast research field with numerous application areas incorporating various data collation techniques and sensor networks, can be applied to reliably provide information on fuel pump status in distributed fuel stations. Thus, the proposed system relies upon a microcontroller, keypad and pump to demonstrate the traditional fuel dispenser. A web-enabled PC with an accompanying graphical user interface (GUI) was designed using Visual Basic and connected to the microcontroller via the serial port to provide the web implementation.
Keywords: fuel pump, microcontroller, GUI, web
Procedia PDF Downloads 434
9759 An As-Is Analysis and Approach for Updating Building Information Models and Laser Scans
Authors: Rene Hellmuth
Abstract:
Factory planning has the task of designing products, plants, processes, organization, areas, and the construction of a factory. The requirements for factory planning and the building of a factory have changed in recent years. Regular restructuring of the factory building is becoming more important in order to maintain the competitiveness of a factory. Restrictions in new areas, shorter life cycles of product and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity & Ambiguity) lead to more frequent restructuring measures within a factory. A building information model (BIM) is the planning basis for rebuilding measures and becomes an indispensable data repository for reacting quickly to changes. Its use as a planning basis for restructuring measures in factories succeeds only if the BIM model has adequate data quality. Under this aspect and the industrial requirement, three data quality factors are particularly important for this paper regarding the BIM model: up-to-dateness, completeness, and correctness. The research question is: how can a BIM model be kept up to date with the required data quality, and which visualization techniques can be applied in a short period of time on the construction site during conversion measures? An as-is analysis is made of how BIM models and digital factory models (including laser scans) are currently being kept up to date. Industrial companies are interviewed, and expert interviews are conducted. Subsequently, the results are evaluated, and a procedure is conceived for how cost-effective and time-saving updating processes can be carried out. The availability of low-cost hardware and the simplicity of the process are important to enable service personnel from facility management to keep digital factory models (BIM models and laser scans) up to date. The approach includes the detection of changes to the building, the recording of the changed area, and the insertion into the overall digital twin.
Finally, an overview of the visualization possibilities suitable for construction sites is compiled. An augmented reality application is created based on an updated BIM model of a factory and installed on a tablet. Conversion scenarios with costs and time expenditure are displayed. A user interface is designed in such a way that all relevant conversion information is available at a glance for the respective conversion scenario. A total of three essential research results are achieved: an as-is analysis of current update processes for BIM models and laser scans, the development of a time-saving and cost-effective update process, and the conception and implementation of an augmented reality solution for BIM models suitable for construction sites.
Keywords: building information modeling, digital factory model, factory planning, restructuring
Procedia PDF Downloads 114
9758 Distance and Coverage: An Assessment of Location-Allocation Models for Fire Stations in Kuwait City, Kuwait
Authors: Saad M. Algharib
Abstract:
The major concern of planners when placing fire stations is finding optimal locations such that the fire companies can reach fire locations within a reasonable response time or distance. Planners are also concerned with the number of fire stations needed to cover all service areas and the fires, as demands, within a standard response time or distance. One of the tools for such analysis is location-allocation modeling. Location-allocation models enable planners to determine the optimal locations of facilities in an area in order to serve regional demands in the most efficient way. The purpose of this study is to examine the geographic distribution of the existing fire stations in Kuwait City. This study utilized location-allocation models within the Geographic Information System (GIS) environment and a number of statistical functions to assess the current locations of fire stations in Kuwait City. Further, this study investigated how well all service areas are covered and how many additional fire stations are needed and where. Four different location-allocation models were compared to find which models cover more demands than the others, given the same number of fire stations. This study tests many ways of combining variables, instead of using one variable at a time, when applying these models in order to create a new measurement that influences the optimal locations for fire stations. This study also tests how sensitive location-allocation models are to different levels of spatial dependency. The results indicate that there are some districts in Kuwait City that are not covered by the existing fire stations. These uncovered districts are clustered together. This study also identifies where to locate the new fire stations. This study provides users of these models with a new variable that can assist them in selecting the best locations for fire stations.
The results include information about how the location-allocation models behave in response to different levels of spatial dependency of demands. The results show that these models perform better with clustered demands. From the additional analysis carried out in this study, it can be concluded that these models perform differently under different spatial patterns.
Keywords: geographic information science, GIS, location-allocation models, geography
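One common location-allocation variant the abstract's family of models includes is maximal covering: pick p sites so that the demand covered within a response-distance standard is maximized. The greedy sketch below is a standard heuristic for that problem, not the study's own procedure, and the coordinates and demand weights are hypothetical.

```python
# Greedy heuristic for the maximal covering location problem (MCLP):
# repeatedly pick the candidate site that adds the most uncovered demand
# within the coverage radius. Data below are illustrative only.
def covered(site, demand_pts, radius):
    """Indices of demand points within `radius` of `site`."""
    return {i for i, (x, y, w) in enumerate(demand_pts)
            if (x - site[0]) ** 2 + (y - site[1]) ** 2 <= radius ** 2}

def greedy_mclp(candidates, demand_pts, p, radius):
    chosen, already = [], set()
    for _ in range(p):
        best = max(candidates,
                   key=lambda s: sum(demand_pts[i][2]
                                     for i in covered(s, demand_pts, radius)
                                     - already))
        chosen.append(best)
        already |= covered(best, demand_pts, radius)
    return chosen, already

# (x, y, weight) demand points and candidate station sites.
demand = [(0, 0, 5), (1, 0, 3), (5, 5, 4), (6, 5, 2), (10, 0, 1)]
sites = [(0, 0), (5, 5), (10, 0)]
chosen, covered_ids = greedy_mclp(sites, demand, p=2, radius=2.0)
```

The greedy choice is not guaranteed optimal in general, which is one reason GIS packages offer several location-allocation models to compare, as the study does.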
Procedia PDF Downloads 177
9757 A Meta-Analysis of Handwriting and Visual-Motor Integration (VMI): The Moderating Effect of Handwriting Dimensions
Authors: Hong Lu, Xin Chen, Zhengcheng Fan
Abstract:
Prior research has claimed a close association between handwriting and mathematics attainment by way of spatial cognition. However, the exact mechanism behind this relationship remains uninvestigated. Focusing on visual-motor integration (VMI), one critical spatial skill, this meta-analysis aims to estimate the size of the handwriting-VMI relationship and examine the moderating effect of handwriting dimensions on the link. With a random effects model, a medium relation (r = .26, 95% CI [.22, .30]) between handwriting and VMI was summarized across 38 studies with 55 unique samples and 141 effect sizes. Findings suggested handwriting dimensions significantly moderated the handwriting-VMI relationship, with handwriting legibility showing a substantial correlation with VMI, whereas neither handwriting speed nor pressure did. By identifying the essential relationship between handwriting legibility and VMI, this study adds to the literature on the key cognitive processing needs underlying handwriting and spatial cognition, and thus highlights the cognitive mechanism linking handwriting, spatial cognition, and mathematics performance.
Keywords: handwriting, visual-motor integration, legibility, meta-analysis
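The random-effects pooling of correlations the abstract reports is typically done via the Fisher-z transform with a DerSimonian-Laird estimate of the between-study variance. The sketch below follows that standard recipe on hypothetical study correlations and sample sizes, not the meta-analysis's actual data.

```python
import math

# Hypothetical per-study correlations and sample sizes (illustration only).
rs = [0.20, 0.30, 0.25, 0.35, 0.15]
ns = [50, 80, 120, 60, 100]

zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]   # Fisher z
vs = [1.0 / (n - 3) for n in ns]                       # sampling variances
w = [1.0 / v for v in vs]                              # fixed-effect weights

# DerSimonian-Laird between-study variance tau^2
zbar_fixed = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
Q = sum(wi * (zi - zbar_fixed) ** 2 for wi, zi in zip(w, zs))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (len(rs) - 1)) / c)

# Random-effects weights add tau^2 to each study's variance.
w_re = [1.0 / (v + tau2) for v in vs]
z_re = sum(wi * zi for wi, zi in zip(w_re, zs)) / sum(w_re)
r_pooled = math.tanh(z_re)   # back-transform to the correlation scale
```

When study estimates are homogeneous (Q below its degrees of freedom, as in this toy data) tau-squared truncates to zero and the random-effects estimate coincides with the fixed-effect one.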
Procedia PDF Downloads 109
9756 A Comprehensive Review of Electronic Health Records Implementation in Healthcare
Authors: Lateefat Amao, Misagh Faezipour
Abstract:
Implementing electronic health records (EHR) in healthcare is a pivotal transition aimed at digitizing and optimizing patient health information management. The expectations associated with this transition are high, extending to other health information systems (HIS) and health technology as well. This multifaceted process involves careful planning and execution to improve the quality and efficiency of patient care, especially as healthcare technology is a sensitive niche. Key considerations include a thorough needs assessment, judicious vendor selection, robust infrastructure development, and the training and adaptation of healthcare professionals. Comprehensive training programs, data migration from legacy systems and models, interoperability, as well as security and regulatory compliance are imperative for healthcare staff to navigate EHR systems adeptly. The purpose of this work is to offer a comprehensive review of the literature on EHR implementation. It explores the impact of this health technology on health practices, highlights challenges and barriers to its successful use, and offers practical strategies that can affect its success in healthcare. This paper provides a thorough review of studies on the adoption of EHRs, emphasizing the wide range of experiences and results connected with EHR use in the medical field, especially across different types of healthcare organizations.
Keywords: healthcare, electronic health records, EHR implementation, patient care, interoperability
Procedia PDF Downloads 81
9755 A Survey of Sentiment Analysis Based on Deep Learning
Authors: Pingping Lin, Xudong Luo, Yifan Fan
Abstract:
Sentiment analysis is a very active research topic. Every day, Facebook, Twitter, Weibo, and other social media, as well as significant e-commerce websites, generate a massive amount of comments, which can be used to analyse people's opinions or emotions. The existing methods for sentiment analysis are based mainly on sentiment dictionaries, machine learning, and deep learning. The first two kinds of methods rely heavily on sentiment dictionaries or large amounts of labelled data; the third overcomes these two problems. So, in this paper, we focus on the third one. Specifically, we survey various sentiment analysis methods based on convolutional neural networks, recurrent neural networks, long short-term memory, deep neural networks, deep belief networks, and memory networks. We compare their features, advantages, and disadvantages. Also, we point out the main problems of these methods, which may be worthy of careful study in the future. Finally, we also examine the application of deep learning in multimodal sentiment analysis and aspect-level sentiment analysis.
Keywords: document analysis, deep learning, multimodal sentiment analysis, natural language processing
Procedia PDF Downloads 164
9754 Feature Engineering Based Detection of Buffer Overflow Vulnerability in Source Code Using Deep Neural Networks
Authors: Mst Shapna Akter, Hossain Shahriar
Abstract:
One of the most important challenges in the field of software code audit is the presence of vulnerabilities in software source code. Every year, more and more software flaws are found, either internally in proprietary code or revealed publicly. These flaws are highly likely to be exploited and lead to system compromise, data leakage, or denial of service. C and C++ open-source code are now available in order to create a large-scale, machine-learning system for function-level vulnerability identification. We assembled a sizable dataset of millions of open-source functions that point to potential exploits. We developed an efficient and scalable vulnerability detection method based on deep neural network models that learn features extracted from the source code. The source code is first converted into a minimal intermediate representation to remove pointless components and shorten dependencies. Moreover, we keep the semantic and syntactic information using state-of-the-art word embedding algorithms such as GloVe and fastText. The embedded vectors are subsequently fed into deep learning networks such as LSTM, BiLSTM, LSTM-Autoencoder, word2vec, BERT, and GPT-2 to classify the possible vulnerabilities. Furthermore, we proposed a neural network model which can overcome issues associated with traditional neural networks. Evaluation metrics such as F1 score, precision, recall, accuracy, and total execution time have been used to measure the performance. We made a comparative analysis between results derived from features containing a minimal text representation and those containing semantic and syntactic information. We found that all of the deep learning models provide comparatively higher accuracy when we use semantic and syntactic information as the features, but require higher execution time, as the word embedding algorithm adds complexity to the overall system.
Keywords: cyber security, vulnerability detection, neural networks, feature extraction
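The "minimal intermediate representation" step the abstract describes typically strips comments and maps user-defined identifiers to generic tokens, so the model learns from code structure rather than arbitrary names. The toy normalizer below illustrates that idea; the keyword list and token scheme are illustrative assumptions, not the paper's exact preprocessing.

```python
import re

# Toy normalizer: strip C comments and rename user-defined identifiers to
# ID1, ID2, ... while keeping keywords and well-known library calls intact.
KEEP = {"int", "char", "if", "else", "for", "while", "return", "void",
        "sizeof", "strcpy"}

def normalize(c_source):
    # Remove // line comments and /* block comments */.
    src = re.sub(r"//.*?$|/\*.*?\*/", " ", c_source, flags=re.S | re.M)
    mapping = {}

    def repl(m):
        tok = m.group(0)
        if tok in KEEP:
            return tok
        mapping.setdefault(tok, f"ID{len(mapping) + 1}")
        return mapping[tok]

    return re.sub(r"[A-Za-z_]\w*", repl, src)

code = "void copy(char *dst) { char buf[8]; strcpy(buf, dst); /* unsafe */ }"
out = normalize(code)
```

After normalization the dangerous pattern (a fixed-size buffer passed to `strcpy`) is still visible, while project-specific names and comments that would bloat the vocabulary are gone.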
Procedia PDF Downloads 90
9753 Generation of Automated Alarms for Plantwide Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
Earlier detection of incipient abnormal operations in terms of plant-wide process management is quite necessary in order to improve product quality and process safety. Generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, multivariate statistical approaches, and so on. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Unfortunately, the big data includes measurement noises and unwanted variations unrelated to true process behavior. Thus, the elimination of such unnecessary patterns in the data is executed in the data-processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance were improved significantly irrespective of the size and the location of abnormal events.
Keywords: detection, monitoring, process data, noise
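The two-step structure described here (suppress measurement noise, then flag excursions beyond a data-driven limit) can be sketched with a simple moving-average filter and a 3-sigma control limit on simulated data. The window length, limit, and fault shape are illustrative choices, not the paper's methodology, which is a nonlinear empirical model.

```python
import random

# Step 1: noise suppression with a causal moving average.
def moving_average(x, k=5):
    out = []
    for i in range(len(x)):
        window = x[max(0, i - k + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

# Step 2: alarm on excursions beyond limits estimated from normal operation.
def alarms(signal, train_len=50, n_sigma=3.0, k=5):
    smooth = moving_average(signal, k)
    train = smooth[:train_len]
    mu = sum(train) / len(train)
    sd = (sum((v - mu) ** 2 for v in train) / len(train)) ** 0.5
    hi, lo = mu + n_sigma * sd, mu - n_sigma * sd
    return [i for i, v in enumerate(smooth) if v > hi or v < lo]

random.seed(0)
data = [random.gauss(0.0, 0.1) for _ in range(80)]   # normal operation noise
for i in range(60, 80):                              # simulated incipient fault
    data[i] += 0.05 * (i - 59)                       # slow drift
flagged = alarms(data)
```

Smoothing before thresholding is what lets a slow drift cross the limit early instead of being buried in raw measurement noise, which mirrors the abstract's point about removing unnecessary patterns to improve detection speed and accuracy.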
Procedia PDF Downloads 252
9752 Traffic Light Detection Using Image Segmentation
Authors: Vaishnavi Shivde, Shrishti Sinha, Trapti Mishra
Abstract:
Traffic light detection from a moving vehicle is an important technology both for driver safety assistance functions and for autonomous driving in the city. This paper proposes a deep-learning-based traffic light recognition method that consists of a pixel-wise image segmentation technique and a fully convolutional network, i.e., the U-Net architecture. A method for detecting the position and recognizing the state of traffic lights in video sequences is presented and evaluated using the Traffic Light Dataset, which contains masked traffic light image data. The first stage is detection, which is accomplished through image processing (image segmentation) techniques such as image cropping, color transformation, and segmentation of possible traffic lights. The second stage is recognition, which means identifying the color of the traffic light, i.e., knowing its state, which is achieved by using a convolutional neural network (U-Net architecture).
Keywords: traffic light detection, image segmentation, machine learning, classification, convolutional neural networks
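The color-based segmentation stage can be caricatured without any deep learning: classify each pixel by simple color rules to get candidate traffic-light regions, which the learned network then refines. The RGB thresholds below are arbitrary illustrative values (real pipelines usually threshold in HSV and use the U-Net for the learned part).

```python
# Toy pixel-wise colour segmentation for traffic-light candidates.
# Thresholds are illustrative, not tuned values from the paper.
def classify_pixel(r, g, b):
    if r > 200 and g < 100 and b < 100:
        return "red"
    if r > 200 and g > 180 and b < 100:
        return "yellow"
    if r < 100 and g > 180 and b < 120:
        return "green"
    return "background"

def segment(image):
    """image: 2D list of (r, g, b) tuples -> 2D list of per-pixel labels."""
    return [[classify_pixel(*px) for px in row] for row in image]

image = [
    [(250, 30, 40), (10, 10, 10)],     # bright red lamp, dark housing
    [(240, 220, 50), (20, 210, 60)],   # amber lamp, green lamp
]
labels = segment(image)
```

Rule-based masks like this are brittle under varying illumination, which is exactly why the paper hands the recognition stage to a convolutional network trained on masked data.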
Procedia PDF Downloads 174
9751 Android Graphics System: Study of Dual-Software VSync Synchronization Architecture and Optimization
Authors: Prafulla Kumar Choubey, Krishna Kishor Jha, S. B. Vaisakh Punnekkattu Chirayil
Abstract:
In the graphics-display subsystem, frame buffers are shared between the producer, i.e. content rendering, and the consumer, i.e. display. If a common buffer is operated on by both producer and consumer simultaneously, a mismatch in their processing rates can cause a tearing effect in the displayed content. Therefore, Android OS employs a triple-buffered system, taking into account an additional composition stage. The three stages (rendering, composition, and display refresh) operate synchronously on three different buffers, which is achieved by using vsync pulses. This synchronization, however, brings into the pipeline an additional latency of up to 26 ms. The present study details the existing synchronization mechanism of the Android graphics-display pipeline and discusses a new adaptive architecture which reduces the wait time to 5 to 16 ms in all the use-cases. The proposed method uses two adaptive software vsyncs (PLL) for achieving the same result.
Keywords: Android graphics system, vertical synchronization, atrace, adaptive system
Procedia PDF Downloads 315
9750 Inference for Compound Truncated Poisson Lognormal Model with Application to Maximum Precipitation Data
Authors: M. Z. Raqab, Debasis Kundu, M. A. Meraou
Abstract:
In this paper, we have analyzed maximum precipitation data during a particular period of time obtained from different stations in the Global Historical Climatology Network of the USA. One important point to mention is that some stations are shut down on certain days for one reason or another. Hence, the maximum values are recorded by excluding those readings. It is assumed that the number of stations that operate follows a zero-truncated Poisson random variable, and the daily precipitation follows a lognormal random variable. We call this model a compound truncated Poisson lognormal model. The proposed model has three unknown parameters, and it can take a variety of shapes. The maximum likelihood estimators can be obtained quite conveniently using the Expectation-Maximization (EM) algorithm. Approximate maximum likelihood estimators are also derived. The associated confidence intervals can also be obtained from the observed Fisher information matrix. Simulations have been performed to check the performance of the EM algorithm, and it is observed that the EM algorithm works quite well in this case. When we analyze the precipitation data set using the proposed model, it is observed that the proposed model provides a better fit than some of the existing models.
Keywords: compound Poisson lognormal distribution, EM algorithm, maximum likelihood estimation, approximate maximum likelihood estimation, Fisher information, skew distribution
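The model itself is easy to simulate, which is also how its distribution function can be sanity-checked: draw the number of operating stations from a zero-truncated Poisson, draw that many lognormal precipitation values, and record the maximum. The parameter values below are illustrative, not estimates from the precipitation data.

```python
import math
import random

# Illustrative parameters (three unknowns of the model): Poisson rate and
# lognormal mu, sigma.
lam, mu, sigma = 3.0, 1.0, 0.5
random.seed(1)

def zt_poisson(lam):
    """Zero-truncated Poisson via rejection of zero counts; the count is
    the number of exponential inter-arrival gaps fitting in a unit interval."""
    while True:
        m, t = 0, random.expovariate(lam)
        while t < 1.0:
            m += 1
            t += random.expovariate(lam)
        if m > 0:
            return m

def draw_max():
    m = zt_poisson(lam)
    return max(random.lognormvariate(mu, sigma) for _ in range(m))

sample = [draw_max() for _ in range(20000)]

# CDF of the maximum: P(Y <= y) = (exp(lam * F(y)) - 1) / (exp(lam) - 1),
# where F is the lognormal CDF (the zero-truncated Poisson pgf at F(y)).
def lognorm_cdf(y):
    return 0.5 * (1 + math.erf((math.log(y) - mu) / (sigma * math.sqrt(2))))

y0 = 4.0
theory = (math.exp(lam * lognorm_cdf(y0)) - 1) / (math.exp(lam) - 1)
empirical = sum(v <= y0 for v in sample) / len(sample)
```

The closed-form CDF comes from conditioning on the station count, which is the same decomposition the EM algorithm exploits when treating the count as missing data.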
Procedia PDF Downloads 109
9749 Comparison of Heuristic Methods for Solving Traveling Salesman Problem
Authors: Regita P. Permata, Ulfa S. Nuraini
Abstract:
The Traveling Salesman Problem (TSP) is among the most studied problems in combinatorial optimization. In simple language, TSP can be described as the problem of finding a minimum-distance tour through a set of cities, starting and ending in the same city and visiting every other city exactly once. In product distribution, companies often face the problem of determining the minimum distance, which affects time allocation. In this research, we apply TSP heuristic methods, simulating nodes as city coordinates in product distribution. The heuristics used are subtour reversal, nearest neighbor, farthest insertion, cheapest insertion, nearest insertion, and arbitrary insertion. We simulated nodes using Euclidean distances to compare the number of cities and processing time, thereby identifying the best heuristic method. The results show that the best-performing heuristic methods are farthest insertion and nearest insertion. These two methods can be recommended to solve product distribution problems in certain companies.
Keywords: Euclidean, heuristics, simulation, TSP
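Of the six heuristics compared, nearest neighbor is the simplest to sketch: always travel to the closest unvisited city, then return to the start. The Euclidean setup matches the abstract; the four-city coordinates below are a hypothetical example, not the study's simulated nodes.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) city coordinates."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbor_tour(cities, start=0):
    """Greedy construction: repeatedly visit the closest unvisited city."""
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist(cities[last], cities[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour  # the return leg to `start` closes the cycle

def tour_length(cities, tour):
    return sum(dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

cities = [(0, 0), (0, 1), (1, 1), (1, 0)]   # hypothetical unit square
tour = nearest_neighbor_tour(cities)
length = tour_length(cities, tour)
```

Insertion heuristics like farthest insertion differ in that they grow a partial tour by splicing cities into the cheapest position, which is why they tend to produce shorter tours than the purely greedy construction above, consistent with the abstract's findings.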
Procedia PDF Downloads 128
9748 Influence of Online Sports Events on Betting among Nigerian Youth
Authors: Babajide Olufemi Diyaolu
Abstract:
The opportunities provided by advances in technology as regards sports betting are so numerous that, from the comfort of their homes and with the use of a phone, Nigerian youth are found engaging in all kinds of betting. Today it is more difficult to identify a true fan, as quite a number of them became fans as a result of betting on live games. This study investigated the influence of online sports events on betting among Nigerian youth. A descriptive survey research design was used, and the population consists of all Nigerian youth who engage in betting and live within the southwest zone of Nigeria. A simple random sampling technique was used to pick three states from the southwest zone of Nigeria. Two thousand five hundred respondents comprising males and females were sampled from the three states. A structured questionnaire on online sports events' contribution to sports betting (OSECSB) was used. The instrument consists of three sections. Section A seeks information on the demographic data of the respondents. Section B seeks information on online sports events, while Section C is used to extract information on sports betting. The modified instrument, which consists of 14 items, has a reliability coefficient of 0.74. The hypothesis was tested at the 0.05 significance level. The completed questionnaire was collated, coded, and analyzed using descriptive statistics of frequency counts, percentages and pie charts, and inferential statistics of multiple regression. The findings of this study revealed that online sports betting is a significant predictor of the increase in sports betting among Nigerian youth. The media and television, as well as globalization and the internet, coupled with social media and various online platforms, have all contributed to the immense increase in sports betting. The increase in advertisements for betting platforms during live matches, especially football, is becoming more alarming.
In most organized international events, media attention as well as sponsorship rights are now being given to one or two betting platforms. There is a need for all stakeholders to put in place school-based intervention programs to reorient our youth about the consequences of addiction to betting. Such programs must include meta-analyses and emotional control towards sports betting.
Keywords: betting platform, Nigerian fans, Nigerian youth, sports betting
Procedia PDF Downloads 74
9747 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models
Authors: Rossella Arcucci, Luisa D'Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti
Abstract:
This work is the first piece of a rather wide research activity, in collaboration with the Euro-Mediterranean Center on Climate Change, aimed at introducing scalable approaches in Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model, previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of 3 distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped on GPU architectures.
Keywords: data assimilation, GPU architectures, ocean models, parallel algorithm
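The core 3DVar analysis step, and why domain decomposition parallelizes it, can be shown on a toy problem: with diagonal background and observation error covariances and co-located observations, the analysis decouples per grid point, so subdomains can be analysed independently and concatenated. This is a caricature of the DD-DA idea under strong simplifying assumptions, not the authors' formulation (which handles correlated errors and overlap between subdomains); all numbers are illustrative.

```python
# Pointwise 3DVar analysis: the minimiser of
#   J(x) = (x - xb)^2 / (2*var_b) + (y - x)^2 / (2*var_o)
# is the precision-weighted mean of background and observation.
def analyse(xb, y, var_b, var_o):
    return [(b / var_b + o / var_o) / (1 / var_b + 1 / var_o)
            for b, o in zip(xb, y)]

xb = [1.0, 2.0, 3.0, 4.0]     # background (model) state
y = [1.2, 1.8, 3.3, 3.9]      # observations, one per grid point here
var_b, var_o = 1.0, 0.25      # background / observation error variances

full = analyse(xb, y, var_b, var_o)

# Domain decomposition: analyse the two halves independently; with diagonal
# covariances the concatenation equals the full-domain analysis.
left = analyse(xb[:2], y[:2], var_b, var_o)
right = analyse(xb[2:], y[2:], var_b, var_o)
```

The analysis is pulled towards the more precise source (here the observations, whose error variance is four times smaller), and the per-subdomain independence is what the DD-DA model exploits to map the solver onto many GPU threads.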
Procedia PDF Downloads 412
9746 The Impact of Environmental Corporate Social Responsibility (ECSR) and the Perceived Moral Intensity on the Intention of Ethical Investment
Authors: Chiung-Yao Huang, Yu-Cheng Lin, Chiung-Hui Chen
Abstract:
This study seeks to examine perceived environmental corporate social responsibility (ECSR), with a focus on negative environmental questions, in relation to the intention of ethical investment after an environmental failure recovery. An empirical test was employed to test the hypotheses. We manipulated the information on negative ECSR activities of a hypothetical firm in an experimental design with a failure recovery treatment. The company's negative ECSR recovery was depicted either in a positive perspective (depicting a follow-up strong social action) or, in the negative ECSR treatment, in a negative perspective (depicting a follow-up non-social action). In both treatments, information about other key characteristics of the focal company was kept constant. Investors' intentions to invest in the company's stock were evaluated by multi-item scales. Results indicate that positive ECSR recovery information about a firm enhances investors' intentions to invest in the company's stock. In addition, perceived moral intensity has a significant impact on the intention of ethical investment, and perceived moral intensity also serves as a key moderating variable in the relationship between negative ECSR and the intention of ethical investment. Finally, theoretical and managerial implications of the findings are discussed. Practical implications: The results suggest that managers may need to be aware of perceived moral intensity as a key variable in restoring the intention of ethical investment. The results further suggest that perceived moral intensity has a direct influence, and also a moderating influence, on the relationship between ECSR and the intention of ethical investment.
Originality/value: In an attempt to deepen the understanding of how investors’ perceptions of firm environmental CSR are connected with other investor-related outcomes through ECSR recovery, the present research proposes a comprehensive model which encompasses ECSR and other key relationship constructs after an ECSR failure and recovery. Keywords: ethical investment, Environmental Corporate Social Responsibility (ECSR), ECSR recovery, moral intensity
Procedia PDF Downloads 350
9745 Microbiological Properties and Mineral Contents of Honeys from Bordj Bou Arreridj Region (Algeria)
Authors: Diafat Abdelouahab, Ekhalfi A. Hammoudi, Meribai Abdelmalek, Bahloul Ahmed
Abstract:
The present study aimed to characterize 30 honey samples from the Bordj Bou Arreridj region (Algeria) regarding their floral origins, physicochemical parameters, mineral composition and microbial safety. Mean values obtained for the physicochemical parameters were: pH 4.11, 17.17% moisture, 0.0061% ash, 370.57 μS cm⁻¹ electrical conductivity, 21.98 meq/kg free acidity, and 9.703 mg/kg HMF. The mineral content was determined by atomic absorption spectrometry; the mean values obtained were (mg/kg): Fe, 7.5714; Mg, 37.68; Na, 186.63; Zn, 3.86; Pb, 0.4869 × 10⁻³; Cd, 267 × 10⁻³. Aerobic mesophiles, fecal coliforms and sulphite-reducing clostridia were the microbial contaminants studied. Microbiologically, the honey quality was considered good, and all samples were negative with respect to the safety parameters. The results obtained for the physicochemical characteristics of Bordj Bou Arreridj honey indicate a good quality level, adequate processing, good maturity and freshness. Keywords: pollen analysis, physicochemical analysis, mineral content, microbial contaminants
Procedia PDF Downloads 89
9744 Application of the Seismic Reflection Survey to an Active Fault Imaging
Authors: Nomin-Erdene Erdenetsogt, Tseedulam Khuut, Batsaikhan Tserenpil, Bayarsaikhan Enkhee
Abstract:
Within the framework of 60 years of development of astronomical and geophysical science in modern Mongolia, various geophysical methods (electrical tomography, ground-penetrating radar, and high-resolution reflection seismic profiling) were used to image an active fault at depths ranging from a few decimeters to a few tens of meters. The fault ruptured during a magnitude 7.6 earthquake in 1967. After the geophysical investigations, trenches were excavated at the sites to expose the fault surfaces. The complex geophysical survey of the Mogod fault, Bulgan region of central Mongolia, yields interpretable reflection arrivals in the < 5 m to 50 m range, with potential for increased resolution. The reflection profiles were used to help interpret the significance of neotectonic surface deformation at the earthquake-active fault. The interpreted profiles show a range of shallow fault structures and provide subsurface evidence, supported by paleoseismological trenching photographs and electrical surveys. Keywords: Mogod fault, geophysics, seismic processing, seismic reflection survey
Procedia PDF Downloads 129
9743 Examining E-Government Impact Using Public Value Approach: A Case Study in Pakistan
Authors: Shahid Nishat, Keith Thomas
Abstract:
E-government initiatives attract substantial public investments around the world. These investments are premised on the digital transformation of public services, improved efficiency and transparency, and citizen participation in social democratic processes. However, many e-government projects, especially in developing countries, fail to achieve their intended outcomes, and a strong disparity exists between the investments made and the outcomes achieved, often referred to as the e-government paradox. Further, there is a lack of research evaluating the impacts of e-government in terms of the public value it creates, which ultimately drives usage. This study aims to address these gaps by identifying key enablers of e-government success and by proposing a public-value-based framework for examining the impact of e-government services. The study will extend the DeLone and McLean Information System (IS) Success model by integrating Technology Readiness (TR) characteristics to develop an integrated success model. The level of analysis will be mobile government applications, and the framework will be empirically tested using quantitative methods. The research will add to the literature on e-government success and will be beneficial for governments, especially those in developing countries aspiring to improve public services through the use of Information and Communication Technologies (ICT). Keywords: e-Government, IS success model, public value, technology adoption, technology readiness
Procedia PDF Downloads 131
9742 Detecting Model Financial Statement Fraud by Auditor Industry Specialization with Fraud Triangle Analysis
Authors: Reskino Resky
Abstract:
This research aims to create a model for detecting financial statement fraud. It examines the fraud triangle variables and auditor industry specialization in relation to financial statement fraud. The sample comprised companies listed on the Indonesian Stock Exchange that received sanctions and faced cases from the Financial Services Authority in 2011-2013: 30 fraud companies and 30 non-fraud companies. The sample was determined by purposive sampling with judgement sampling, while the data were processed using the Mann-Whitney U test and discriminant analysis. Two of the five variables could be processed with discriminant analysis. The results show that financial targets can detect financial statement fraud, while financial stability cannot. Keywords: fraud triangle analysis, financial targets, financial stability, auditor industry specialization, financial statement fraud
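The two-stage procedure (a rank-based screen followed by a discriminant model) can be sketched as follows. All numbers below are invented to mimic the reported direction of the findings, not the paper's data, and a hand-rolled Fisher linear discriminant stands in for the study's discriminant analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented financial ratios for 30 fraud and 30 non-fraud firms:
# column 0 ("financial targets") separates the groups, column 1
# ("financial stability") does not -- mirroring the abstract's result.
fraud    = np.column_stack([rng.normal(0.12, 0.03, 30), rng.normal(0.05, 0.04, 30)])
nonfraud = np.column_stack([rng.normal(0.06, 0.03, 30), rng.normal(0.05, 0.04, 30)])

def mann_whitney_u(a, b):
    """U statistic of the rank-sum screen (continuous data, no ties)."""
    ranks = np.argsort(np.argsort(np.concatenate([a, b]))) + 1
    return ranks[:len(a)].sum() - len(a) * (len(a) + 1) / 2

def fisher_discriminant(x1, x2):
    """Fisher's linear discriminant direction w = Sw^-1 (m1 - m2)."""
    sw = np.cov(x1, rowvar=False) + np.cov(x2, rowvar=False)
    return np.linalg.solve(sw, x1.mean(axis=0) - x2.mean(axis=0))

# Screen: U far from n*m/2 = 450 signals a group difference.
u_targets = mann_whitney_u(fraud[:, 0], nonfraud[:, 0])

# Discriminant model: project onto w and split at the mean score.
w = fisher_discriminant(fraud, nonfraud)
scores = np.concatenate([fraud, nonfraud]) @ w
cut = scores.mean()
accuracy = ((scores[:30] > cut).sum() + (scores[30:] <= cut).sum()) / 60
```

With a separating variable present, the discriminant scores classify most firms correctly; remove column 0 and the model collapses to chance, which is how a non-discriminating variable like "financial stability" would show up.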
Procedia PDF Downloads 457
9741 Real Time Detection of Application Layer DDoS Attack Using Log Based Collaborative Intrusion Detection System
Authors: Farheen Tabassum, Shoab Ahmed Khan
Abstract:
The brutality of attacks on networks and critical infrastructures has been climbing over recent years and appears set to continue. The Distributed Denial of Service (DDoS) attack is the most prevalent and easiest attack on the availability of a service, owing to the cheap availability of large botnets and the general lack of protection against these attacks. An application layer DDoS attack is a DDoS attack targeted at a web server, application server or database server. These attacks are much more sophisticated and challenging, as they get around most conventional network security devices: the attack traffic often impersonates normal traffic and cannot be recognized through network layer anomalies. Conventional single-host security systems are becoming gradually less effective in the face of such complicated and synchronized multi-front attacks. To protect against such attacks and intrusions, cooperation among all network devices is essential, since a single device might not be capable of sensing malevolent activity on its own. To this end, a collaborative intrusion detection system (CIDS) is proposed, in which multiple network devices share valuable information to identify attacks, so that a decision can be taken after analyzing information collected from different sources. This attack detection technique helps to detect seemingly benign packets that target the availability of critical infrastructure, and the proposed solution methodology enables incident response teams to detect and react to DDoS attacks at the earliest stage, ensuring that the uptime of the service remains unaffected.
Experimental evaluation shows that the proposed collaborative detection approach is much more effective and efficient than previous approaches. Keywords: Distributed Denial-of-Service (DDoS), Collaborative Intrusion Detection System (CIDS), Slowloris, OSSIM (Open Source Security Information Management tool), OSSEC HIDS
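The core CIDS idea — no single sensor is confident, but the shared view crosses a detection threshold — reduces to alert aggregation with a quorum. The sketch below is a minimal illustration with invented device names and IP addresses, not the paper's OSSIM/OSSEC pipeline:

```python
from collections import Counter

# Hypothetical alert logs from three devices; each reports the source
# IPs it finds suspicious. Individually, each alert could be a false
# positive; corroboration across sensors raises confidence.
device_alerts = {
    "edge_fw":   ["10.0.0.9", "10.0.0.7"],
    "web_proxy": ["10.0.0.9", "10.0.0.3"],
    "app_hids":  ["10.0.0.9"],
}

def collaborative_verdict(alerts, quorum=2):
    """Flag a source once at least `quorum` independent sensors report it."""
    votes = Counter(ip for ips in alerts.values() for ip in ips)
    return sorted(ip for ip, n in votes.items() if n >= quorum)

blocked = collaborative_verdict(device_alerts)
# "10.0.0.9" is reported by all three sensors and crosses the quorum;
# the single-sensor alerts do not.
```

A real deployment replaces the dictionary with log streams shipped from each device and tunes the quorum against the false-positive rate of the individual sensors.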
Procedia PDF Downloads 354
9740 A Hybrid System of Hidden Markov Models and Recurrent Neural Networks for Learning Deterministic Finite State Automata
Authors: Pavan K. Rallabandi, Kailash C. Patidar
Abstract:
In this paper, we present an optimization technique, or learning algorithm, using a hybrid architecture that combines two of the most popular sequence recognition models: Recurrent Neural Networks (RNNs) and Hidden Markov Models (HMMs). To improve sequence or pattern recognition/classification performance through a hybrid neural-symbolic approach, a gradient descent learning algorithm is developed using the Real-Time Recurrent Learning algorithm for RNNs to process the knowledge represented in trained HMMs. The hybrid algorithm is implemented with automata theory as a sample test bed, and its performance is demonstrated and evaluated on learning deterministic finite state automata. Keywords: hybrid systems, hidden Markov models, recurrent neural networks, deterministic finite state automata
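The automata test bed itself is easy to picture. A deterministic finite state automaton is just a transition table; the sketch below (an illustrative stand-in, not the paper's hybrid learner) builds the classic even-number-of-1s DFA and uses it to label the kind of string corpus a sequence learner would be trained and evaluated on:

```python
# DFA accepting binary strings with an even number of 1s:
# a start state, a set of accepting states, and a transition function.
EVEN_ONES_DFA = {
    "start": "even",
    "accept": {"even"},
    "delta": {("even", "0"): "even", ("even", "1"): "odd",
              ("odd", "0"): "odd",  ("odd", "1"): "even"},
}

def dfa_accepts(dfa, string):
    """Run the DFA over the string and test membership in accept states."""
    state = dfa["start"]
    for symbol in string:
        state = dfa["delta"][(state, symbol)]
    return state in dfa["accept"]

# Labeled corpus: the supervision signal for a learner (HMM, RNN, or
# a hybrid of the two) asked to recover the automaton from examples.
corpus = {s: dfa_accepts(EVEN_ONES_DFA, s)
          for s in ["", "0", "1", "11", "101", "1101"]}
```

A learner that generalizes correctly from such a corpus has, in effect, internalized the DFA's state machine, which is what the hybrid HMM/RNN system is evaluated on.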
Procedia PDF Downloads 388
9739 Solid Waste Management through Mushroom Cultivation: An Eco Friendly Approach
Authors: Mary Josephine
Abstract:
Waste from one process can be the input of another sector, reducing environmental pollution. Today more and more solid waste is generated, but only a very small amount of it is recycled, so the environmental pressure threatening public health is serious. Methods considered for the treatment of solid waste include biogas tanks and processing into animal feed and fertilizer; however, they have not performed well. An alternative approach is growing mushrooms on waste residues. This is regarded as an environmentally friendly solution with potential economic benefit. Substrate producers do their best to produce quality substrate at low cost. Among other methods, this can be achieved by employing biologically degradable wastes as the resource material component of the substrate. Mushroom growing is a significant tool for the restoration, replenishment and remediation of Earth’s overburdened ecosphere. One rational method of waste utilization involves locally available wastes. The present study aims to determine the yield of mushrooms grown on freely obtained, locally available waste and to conserve our environment by recycling wastes. Keywords: biodegradable, environment, mushroom, remediation
Procedia PDF Downloads 397
9738 Adopting Data Science and Citizen Science to Explore the Development of African Indigenous Agricultural Knowledge Platform
Authors: Steven Sam, Ximena Schmidt, Hugh Dickinson, Jens Jensen
Abstract:
The goal of this study is to explore the potential of data science and citizen science approaches to develop an interactive, digital, open infrastructure that pulls together African indigenous agriculture and food systems data from multiple sources, making it accessible and reusable for policy, research and practice in modern food production efforts. The World Bank has recognised that African Indigenous Knowledge (AIK) is innovative and unique among local, subsistent smallholder farmers, and that it is central to sustainable food production and to enhancing biodiversity and natural resources in many poor, rural societies. AIK refers to tacit knowledge held in different languages, cultures and skills, passed down from generation to generation by word of mouth. AIK is a key driver of food production, preservation and consumption for more than 80% of citizens in Africa, and it can therefore assist modern efforts to reduce food insecurity and hunger. However, the documentation and dissemination of AIK remain a big challenge confronting librarians and other information professionals in Africa, and there is a risk of losing AIK owing to urban migration, modernisation, land grabbing, and the emergence of relatively small-scale commercial farming businesses. There is also a clear disconnect between AIK and scientific knowledge and modern efforts for sustainable food production. The study combines data science and citizen science approaches, through active community participation, to generate and share AIK in a curated digital platform based on FAIR principles, facilitating learning and promoting knowledge that is relevant for policy intervention and sustainable food production. The study adopts key informant interviews along with a participatory photo and video elicitation approach, in which farmers are given digital devices (mobile phones) to record and document their practices involving agriculture, food production, processing, and consumption by traditional means.
Data collected are analysed using the UK Science and Technology Facilities Council’s proven citizen science (Zooniverse) and data science methodology. Outcomes are presented in participatory stakeholder workshops, where the researchers outline plans for creating the platform and developing the knowledge-sharing standard framework and copyright agreement. Overall, the study shows that learning from AIK, by investigating what local communities know and have, can improve understanding of food production and consumption, in particular in times of stress or shocks affecting food systems and communities. Thus, the platform can be useful for local populations, research, and policy-makers, and it could lead to transformative innovation in the food system, creating a fundamental shift in the way the North supports sustainable, modern food production efforts in Africa. Keywords: Africa indigenous agriculture knowledge, citizen science, data science, sustainable food production, traditional food system
Procedia PDF Downloads 82
9737 Tobacco Taxation and the Heterogeneity of Smokers' Responses to Price Increases
Authors: Simone Tedeschi, Francesco Crespi, Paolo Liberati, Massimo Paradiso, Antonio Sciala
Abstract:
This paper aims to contribute to the understanding of smokers’ responses to cigarette price increases, with a focus on heterogeneity both across individuals and across price levels. To do this, a stated-preference quasi-experimental design grounded in a random utility framework is proposed to evaluate the effect on smokers’ utility of the price level and its variation, along with social conditioning and health impact perception. The analysis is based on individual-level data drawn from a unique survey gathering very detailed information on Italian smokers’ habits. In particular, qualitative information on the individual reactions triggered by price changes of different magnitude and composition is exploited. The main findings are the following. The average price elasticity of cigarette consumption is comparable with previous estimates for advanced economies (-0.32). However, decomposing this result across five latent classes of smokers reveals extreme heterogeneity in price responsiveness, implying a potential price elasticity that ranges from 0.05 to almost 1. This heterogeneity is partly explained by observable characteristics such as age, income, gender and education, as well as (current and lagged) smoking intensity. Moreover, price responsiveness is far from independent of the size of the prospected price increase. Finally, by comparing even and uneven price variations, it is shown that uniform across-brand price increases can limit the scope for product substitution and downgrading. The estimated price-response heterogeneity has significant implications for tax policy. First, it provides evidence and a rationale for why the aggregate price elasticity is likely to follow a strictly increasing pattern as a function of the experienced price variation, information that is crucial for forecasting the effect of a given tax-driven price change on tax revenue.
Second, it provides some guidance on how to design excise tax reforms that balance public health and revenue goals. Keywords: smoking behaviour, preference heterogeneity, price responsiveness, cigarette taxation, random utility models
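Why class-level heterogeneity matters for revenue forecasting can be shown with a back-of-the-envelope sketch. The class shares and elasticities below are invented for illustration (only the -0.32 average matches the abstract); the point is that the aggregate response is a share-weighted mix of very different behaviours:

```python
# Hypothetical latent classes: (share of smokers, price elasticity).
# Individually the classes range from near-inelastic to near-unit
# elastic, as the abstract reports; the shares are illustrative.
classes = [
    (0.35, -0.05),
    (0.30, -0.25),
    (0.20, -0.45),
    (0.15, -0.95),
]

def demand_change(price_change):
    """Aggregate fractional change in consumption for a given
    fractional price change, holding class shares fixed."""
    return sum(share * e * price_change for share, e in classes)

avg_elasticity = sum(share * e for share, e in classes)  # about -0.32

# Revenue effect of a 10% tax-driven price rise: price up 10%,
# consumption down by the aggregate demand response.
revenue_change = (1 + 0.10) * (1 + demand_change(0.10)) - 1
```

With an aggregate elasticity well below 1 in absolute value, the price rise still raises revenue; if the price-responsive classes grew (or elasticities rose with the size of the increase, as the paper finds), the same calculation could flip sign, which is exactly why the heterogeneity matters for forecasting.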
Procedia PDF Downloads 162
9736 The Underground Ecosystem of Credit Card Frauds
Authors: Abhinav Singh
Abstract:
Point-of-sale (POS) malware has been stealing the limelight this year. It has been the elemental factor in some of the biggest breaches uncovered in the past couple of years, including: • Target: the retail giant reported close to 40 million credit card records stolen. • Home Depot: the home products retailer reported a breach of close to 50 million credit records. • Kmart: the US retailer recently announced a breach of 800 thousand credit card details. In 2014 alone, there were reports of over 15 major breaches of payment systems around the globe. Memory-scraping malware infecting point-of-sale devices has been the lethal weapon used in these attacks. Such malware is capable of reading payment information from the payment device's memory before it is encrypted, and then sends the stolen details to its parent server. It can record all the critical payment information: the card number, security number, owner, etc., all delivered in raw format. This talk will cover what happens after these details have been sent to the malware authors. The ecosystem of credit card fraud can be broadly classified into three steps: • purchase of raw details and dumps; • converting them to plastic cash/cards; • shop! shop! shop! The focus of this talk will be on these steps and how they form an organized network of cyber-crime. The first step involves the buying and selling of the stolen details. The key points to emphasize are: • how this raw information is sold in the underground market; • the buyer and seller anatomy; • building your shopping cart and preferences; • the importance of reputation and vouches; • customer support and replacements/refunds. These are some of the key points that will be discussed. But the story doesn't end here: at this stage, the buyer only has the raw card information. How will it be converted to plastic cash?
Here the second part of this underground economy comes into the picture, where the raw details are converted into actual cards. Well-organized underground services can convert such details into plastic cards; we will discuss this technique in detail. The final step involves shopping with the stolen cards. Cards generated from the stolen details can easily be used to swipe-and-pay for goods at retail shops, usually expensive items with good resale value. Apart from using the cards in stores, there are underground services that let you deliver online orders to their dummy addresses; once a package is received, it is forwarded to the original buyer, with charges based on the value of the item delivered. Overall, the underground ecosystem of credit card fraud works in a bulletproof way, involving people working in close groups and making heavy profits. This is a brief summary of what I plan to present at the talk. I have done extensive research and collected a good deal of material to present as samples, including: • lists of underground forums; • credit card dumps; • IRC chats among these groups; • personal chats with big card sellers; • an inside view of these forum owners. The talk will conclude by throwing light on how these breaches are tracked during investigation: how credit card breaches are tracked down, and what steps financial institutions can take to build an incident response around them. Keywords: POS malware, credit card frauds, enterprise security, underground ecosystem
Procedia PDF Downloads 439
9735 Conventional and Islamic Perspective in Accounting: Potential for Alternative Reporting Framework
Authors: Shibly Abdullah
Abstract:
This paper provides an overview of the fundamental philosophical and functional differences between conventional and Islamic accounting. The aim of this research is to undertake a detailed analysis focused on specific illustrations drawn from both systems, highlighting how these differences bear on the recording of financial transactions and the preparation of financial reports for a range of stakeholders. Accounting, universally considered a platform for providing a ‘true and fair’ view of corporate entities, can be challenged in the current world view, as the business environment has evolved and transformed significantly. The growth of non-traditional corporate entities such as Islamic financial institutions fundamentally questions the applicability of conventional accounting standards to the preparation of Shariah-compliant financial reporting. Coupled with this, there are significant concerns about the wider applicability of Islamic accounting standards and frameworks for achieving reporting practices that satisfy general information needs. Against this backdrop, the paper raises the fundamental question of how convergence could be achieved between these two systems in order to provide users a transparent and comparable state of financial information, resulting in an alternative framework of financial reporting. Keywords: accounting, conventional accounting, corporate reporting, Islamic accounting
Procedia PDF Downloads 282
9734 Influence of Vesicular Arbuscular Mycorrhiza on Growth of Cucumis myriocarpus Indigenous Leafy Vegetable
Authors: Pontsho E. Tseke, Phatu W. Mashela
Abstract:
Climate-smart agriculture dictates that underutilised indigenous plants, which served as food for local marginalized communities, be assessed for introduction into mainstream agriculture. Most underutilised indigenous plants have survived adverse conditions in the wild, with limited information on how they interact with most abiotic and biotic factors. The Cucumis myriocarpus leafy vegetable has nutritional, pharmacological and industrial applications, with limited information on how it interacts with effective microorganisms. The objective of this study was to determine the effects of vesicular arbuscular mycorrhiza (VAM) on the growth of the C. myriocarpus indigenous leafy vegetable under greenhouse conditions. Four-week-old seedlings of C. myriocarpus were transplanted into 20-cm-diameter plastic pots. Two weeks after transplanting, VAM was applied at 0, 10, 20, 30, 40, 50, 60 and 70 g Biocult-VAM per plant. At 56 days after treatment, plant growth variables of C. myriocarpus exhibited positive quadratic relations with increasing Biocult-VAM levels. Plant variables and increasing concentrations of salinity also exhibited positive quadratic relations, with 95 to 99% associations. In conclusion, Biocult-VAM can be used in sustainable production of C. myriocarpus for functional food security. Keywords: abiotic, biotic, rhizosphere, sustainable agriculture
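A positive quadratic dose-response of the kind reported here is usually fitted with a second-order polynomial, whose vertex gives the optimum dose. The sketch below uses invented dry-mass readings shaped to mimic such a response (the study's actual measurements are not given in the abstract):

```python
import numpy as np

# Hypothetical dry-mass readings (g) at the applied Biocult-VAM doses;
# the values are invented to show a positive quadratic response.
dose = np.array([0, 10, 20, 30, 40, 50, 60, 70])   # g per plant
mass = np.array([4.1, 5.8, 7.0, 7.9, 8.3, 8.2, 7.6, 6.7])

# Fit growth = a*dose^2 + b*dose + c and locate the optimum dose
# at the vertex of the fitted parabola.
a, b, c = np.polyfit(dose, mass, 2)
optimum = -b / (2 * a)

# R^2 of the quadratic model -- the kind of 95-99% "association"
# the abstract reports for its growth variables.
pred = np.polyval([a, b, c], dose)
r2 = 1 - ((mass - pred) ** 2).sum() / ((mass - mass.mean()) ** 2).sum()
```

A negative leading coefficient with a vertex inside the tested dose range is what distinguishes a genuine optimum from a monotone response, which is why the quadratic (rather than linear) model is the natural choice for such trials.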
Procedia PDF Downloads 279