Search results for: minimum data set
24310 A Crowdsourced Homeless Data Collection System and Its Econometric Analysis: Strengthening Inclusive Public Administration Policies
Authors: Praniil Nagaraj
Abstract:
This paper proposes a method to collect homeless data using crowdsourcing and presents an approach to analyze the data, demonstrating its potential to strengthen existing and future policies aimed at promoting socio-economic equilibrium. This paper's contributions can be categorized into three main areas. First, a unique method for collecting homeless data is introduced, utilizing a user-friendly smartphone app (currently available for Android). The app enables the general public to quickly record information about homeless individuals, including the number of people and details about their living conditions. The collected data, including date, time, and location, is anonymized and securely transmitted to the cloud. It is anticipated that an increasing number of users motivated to contribute to society will adopt the app, thus expanding the data collection efforts. Duplicate data is addressed through simple classification methods, and historical data is utilized to fill in missing information. The second contribution of this paper is the description of data analysis techniques applied to the collected data. By combining this new data with existing information, statistical regression analysis is employed to gain insights into various aspects, such as distinguishing between unsheltered and sheltered homeless populations, as well as examining their correlation with factors like unemployment rates, housing affordability, and labor demand. Initial data is collected in San Francisco, while pre-existing information is drawn from three cities: San Francisco, New York City, and Washington D.C., facilitating simulation runs. The third contribution focuses on demonstrating the practical implications of the data processing results. The challenges faced by key stakeholders, including charitable organizations and local city governments, are taken into consideration. Two case studies are presented as examples.
The first case study explores improving the efficiency of food and necessities distribution, as well as medical assistance, driven by charitable organizations. The second case study examines the correlation between micro-geographic budget expenditure by local city governments and homeless information to justify budget allocation and expenditures. The ultimate objective of this endeavor is to enable the continuous enhancement of the quality of life for the underprivileged. It is hoped that through increased crowdsourcing of data from the public, the Generosity Curve and the Need Curve will intersect, leading to a better world for all.
Keywords: crowdsourcing, homelessness, socio-economic policies, statistical analysis
Procedia PDF Downloads 46
24309 Bacterial Contamination of Kitchen Sponges and Cutting Surfaces and Disinfection Procedures
Authors: Hayyan I Al Taweil
Abstract:
Background: Kitchen sponges and cutting surfaces are common reservoirs of bacteria and can play a role in the cross-contamination of foods, fomites, and hands by foodborne pathogens. Aims and Objectives: This study investigated the incidence of bacteria in kitchen sponges and cutting surfaces. Materials and methods: A total of twenty-four kitchen sponges were collected from home kitchens, and the numbers of mesophilic bacteria, coliform bacteria, E. coli, Salmonella, Pseudomonas, and staphylococci in each sponge were determined. Microbiological tests of all sponges for total mesophilic aerobic bacteria, S. aureus, Pseudomonas, Salmonella spp., and E. coli were performed by sampling on days 3, 7, and 14. The sponges were in daily kitchen use and were washed with dishwasher detergent at least twice daily. Results: The total mesophilic aerobic bacteria counts showed a significant increase in log CFU/ml. Over the fourteen days, the number of E. coli was reduced, Salmonella spp. remained stable, and S. aureus increased. Pseudomonas increased and became the dominant microflora in the sponges over the fourteen days.
Keywords: kitchen sponges, microbiological contamination, disinfection, cutting surface, cross-contamination
Procedia PDF Downloads 145
24308 Does the Level of Countries' Corruption Affect Firms' Working Capital Management?
Authors: Ebrahim Mansoori, Datin Joriah Muhammad
Abstract:
Recent studies in finance have focused on the effect of external variables on working capital management. This study investigates the effect of corruption indexes on firms' working capital management. A large data set covering 2005 to 2013 from five ASEAN countries, namely Malaysia, Indonesia, Singapore, Thailand, and the Philippines, was selected to investigate how the level of corruption in these countries affects working capital management. The results of panel data analysis, including fixed-effect estimations, showed that a high level of country corruption encourages managers to shorten the length of the cash conversion cycle (CCC). Meanwhile, managers reduce the level of investment in cash and cash equivalents when the corruption indexes increase. Therefore, an increase in a country's corruption index encourages managers to select conservative working capital strategies by reducing the net liquid balance (NLB).
Keywords: ASEAN, corruption indexes, panel data analysis, working capital management
Procedia PDF Downloads 438
24307 BIM Data and Digital Twin Framework: Preserving the Past and Predicting the Future
Authors: Mazharuddin Syed Ahmed
Abstract:
This research presents a framework used to develop the Ara Polytechnic College of Architecture Studies building “Kahukura”, which is Green Building certified. This framework integrates the development of a smart building digital twin by utilizing Building Information Modelling (BIM) and its BIM maturity levels, including Levels of Development (LOD), the eight dimensions of BIM, Heritage-BIM (H-BIM) and Facility Management BIM (FM BIM). The research also outlines a structured approach to building performance analysis and integration with the circular economy, encapsulated within a five-level digital twin framework. Starting with Level 1, the Descriptive Twin provides a live, editable visual replica of the built asset, allowing for specific data inclusion and extraction. Advancing to Level 2, the Informative Twin integrates operational and sensory data, enhancing data verification and system integration. At Level 3, the Predictive Twin utilizes operational data to generate insights and proactive management suggestions. Progressing to Level 4, the Comprehensive Twin simulates future scenarios, enabling robust “what-if” analyses. Finally, Level 5, the Autonomous Twin, represents the pinnacle of digital twin evolution, capable of learning and autonomously acting on behalf of users.
Keywords: building information modelling, circular economy integration, digital twin, predictive analytics
Procedia PDF Downloads 43
24306 Monitor Vehicle Speed Using Internet of Things Based Wireless Sensor Network System
Authors: Akber Oumer Abdurezak
Abstract:
Road traffic accidents are a major problem in Ethiopia, resulting every year in many deaths, injuries, and loss of property. According to the Federal Transport Authority, one of the main causes of traffic accidents and crashes in Ethiopia is overspeeding. Different technologies have been implemented to monitor the speed of vehicles in order to minimize accidents and crashes. This research aimed at designing a speed monitoring system that monitors the speed and movement of travelling vehicles and reports illegal speeds or overspeeding vehicles to the concerned bodies. The system is implemented through a wireless sensor network. The proposed system can sense and detect the movement of vehicles and process and analyze the data obtained from the sensors and the cloud system. The data are sent to the central controlling server. The system contains accelerometer and gyroscope sensors to sense and collect data from the vehicle, an Arduino to process the data, and a Global System for Mobile Communication (GSM) module to send the data to the concerned body. When the speed of the vehicle exceeds the allowable speed limit, the system sends an "over speeding" message to the database. Both accelerometer and gyroscope sensors are used to collect acceleration data. The acceleration data are then converted to speed, the corresponding speed is checked against the speed limit, and vehicles above the limit are reported to the concerned authorities to help avoid frequent accidents. The proposed system decreases the occurrence of accidents and crashes due to overspeeding and can serve as an eye-opener for the implementation of other intelligent transport system technologies. The system can also be integrated with other technologies such as GPS and Google Maps to obtain better output.
Keywords: accelerometer, IoT, GSM, gyroscope
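The acceleration-to-speed conversion and limit check described in this abstract can be sketched as follows; the sample rate, speed limit, and acceleration profile are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of the speed check: integrate longitudinal acceleration
# samples into a speed estimate and flag samples above a speed limit.

def speed_profile(accelerations, dt, v0=0.0):
    """Integrate acceleration samples (m/s^2, one every dt seconds) into speeds (m/s)."""
    speeds, v = [], v0
    for a in accelerations:
        v += a * dt
        speeds.append(v)
    return speeds

def overspeed_events(speeds, limit_mps):
    """Indices of samples whose speed exceeds the allowed limit."""
    return [i for i, v in enumerate(speeds) if v > limit_mps]

accel = [2.0] * 10 + [0.0] * 5             # accelerate for 10 s, then cruise
speeds = speed_profile(accel, dt=1.0)
print(overspeed_events(speeds, 60 / 3.6))  # samples above 60 km/h
```

In a deployed system the flagged indices would be translated into the "over speeding" message sent over GSM; here they are simply printed.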
Procedia PDF Downloads 75
24305 User Satisfaction Survey Based Facility Performance Evaluation
Authors: Gopikrishnan Seshadhri, V. M. Topkar
Abstract:
Post-occupancy facility management is a facet that has gained tremendous ground in recent times. While the efficiency of expenditure and the utilization of all types of resources are monitored during the construction phase to ensure timely completion with minimum cost and acceptable quality, value for money is realized only when the facility performs satisfactorily post occupation, meeting the aspirations and expectations of its users. This is even more so for public facilities. Due to the paradigm shift in focus to outcome-based performance evaluation, user satisfaction, obtained mainly through questionnaires, has become the single most important criterion in performance evaluation. Since the questionnaires presently used to gauge user satisfaction are subjective, the feedback obtained does not necessarily reflect actual performance. Hence, there is a requirement for a survey instrument that can gauge user satisfaction as objectively as possible and truly reflect the ground reality. A near-correct picture of the actual performance of the built facility from the user's point of view will enable facility managers to address pertinent issues. This paper brings out the need for an effective survey instrument that will elicit more objective user responses. It also lists the steps involved in the formulation of such an instrument.
Keywords: facility performance evaluation, attributes, attribute descriptors, user satisfaction surveys, statistical methods, performance indicators
Procedia PDF Downloads 289
24304 Image Distortion Correction Method of 2-MHz Side Scan Sonar for Underwater Structure Inspection
Authors: Youngseok Kim, Chul Park, Jonghwa Yi, Sangsik Choi
Abstract:
The 2-MHz Side Scan Sonar (SSS) attached to a boat for the inspection of underwater structures is affected by the boat's shaking, which makes it difficult to determine the exact scale of damage to a structure. In this study, a motion sensor was attached inside the 2-MHz SSS to obtain roll, pitch, and yaw data, and an image stabilization tool was developed to correct the sonar image. Experiments confirmed that reliable data can be obtained, with an average error rate of 1.99% between the measured value and the actual distance. The corrected sonar data make it possible to accurately inspect damage in underwater structures.
Keywords: image stabilization, motion sensor, safety inspection, sonar image, underwater structure
Procedia PDF Downloads 280
24303 Futuristic Black Box Design Considerations and Global Networking for Real Time Monitoring of Flight Performance Parameters
Authors: K. Parandhama Gowd
Abstract:
The aim of this research paper is to conceptualize, discuss, analyze and propose alternate design methodologies for a futuristic black box for flight safety. The proposal also includes global networking concepts for real-time surveillance and monitoring of flight performance parameters, including GPS parameters. It is expected that this proposal will serve as a failsafe real-time diagnostic tool for accident investigation and the location of debris in real time. In this paper, an attempt is made to improve existing flight data recording techniques and the design considerations for a futuristic FDR, to overcome the trauma of not being able to locate the black box. Since modern communications and information technologies with large bandwidth are available, coupled with faster computer processing techniques, the failsafe recording technique developed in this paper is feasible. Furthermore, data fusion and data warehousing technologies are available for exploitation.
Keywords: flight data recorder (FDR), black box, diagnostic tool, global networking, cockpit voice and data recorder (CVDR), air traffic control (ATC), air traffic, telemetry, tracking and control centers (ATTTCC)
Procedia PDF Downloads 572
24302 Applying Hybrid Graph Drawing and Clustering Methods on Stock Investment Analysis
Authors: Mouataz Zreika, Maria Estela Varua
Abstract:
Stock investment decisions are often made based on current events in the global economy and the analysis of historical data. Visual representation could help investors gain a deeper understanding of and better insight into stock market trends more efficiently. The trend analysis is based on long-term data collection. The study adopts a hybrid method that combines a clustering algorithm and a force-directed algorithm to overcome the scalability problem when visualizing large data sets. This method exemplifies the potential relationships between stocks, as well as determining the degree of strength and connectivity, providing investors another view of stock relationships for reference. Information derived from the visualization will also help them make informed decisions. The results of the experiments show that the proposed method is able to produce aesthetically pleasing visualizations that provide clearer views of connectivity and edge weights.
Keywords: clustering, force-directed, graph drawing, stock investment analysis
Procedia PDF Downloads 302
24301 Clinical and Laboratory Diagnosis of Malaria in Surat Thani, Southern Thailand
Authors: Manas Kotepui, Chatree Ratcha, Kwuntida Uthaisar
Abstract:
Malaria infection is still considered a major public health problem in Thailand. In this study, retrospective data on patients in Surat Thani Province, Southern Thailand, during 2012-2015 were retrieved and analyzed. These data include demographic data, clinical characteristics, and laboratory diagnoses. Statistical analyses were performed to demonstrate frequencies, proportions, data tendencies, and group comparisons. A total of 395 malaria patients were found. Most patients were male (253 cases, 64.1%). Most patients (262 cases, 66.3%) were admitted between 6 am and 11.59 am. Three hundred and fifty-five patients (97.5%) were positive for P. falciparum. Hemoglobin, hematocrit, and MCHC were significantly different between P. falciparum and P. vivax infections (P value < 0.05). During 2012-2015, the prevalence of malaria was highest in 2013. Neutrophils, lymphocytes, and monocytes were significantly changed in patients with fever ≤ 3 days compared with patients with fever > 3 days. This information will guide the understanding of the pathogenesis and characteristics of malaria infection in Southern Thailand.
Keywords: prevalence, malaria, Surat Thani, Thailand
Procedia PDF Downloads 276
24300 Adaptive Swarm Balancing Algorithms for Rare-Event Prediction in Imbalanced Healthcare Data
Authors: Jinyan Li, Simon Fong, Raymond Wong, Mohammed Sabah, Fiaidhi Jinan
Abstract:
Clinical data analysis and forecasting have made great contributions to disease control, prevention, and detection. However, such data usually suffer from highly imbalanced class distributions. In this paper, we target binary imbalanced datasets, where the positive samples make up only a minority. We investigate two different meta-heuristic algorithms, particle swarm optimization and the bat-inspired algorithm, and combine both of them with the synthetic minority over-sampling technique (SMOTE) for processing the datasets. One approach is to process the full dataset as a whole. The other is to split up the dataset and adaptively process it one segment at a time. The experimental results reveal that while the performance improvements obtained by the former methods do not scale to larger data sizes, the latter, which we call Adaptive Swarm Balancing Algorithms, leads to significant efficiency and effectiveness improvements on large datasets. We also find it more consistent with the practice of typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE, leading to more credible performance of the classifier and shortening the running time compared with the brute-force method.
Keywords: imbalanced dataset, meta-heuristic algorithm, SMOTE, big data
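SMOTE's core interpolation idea can be sketched as follows. This is a minimal illustration, not the paper's swarm-optimized variant, and the sample points and function name are hypothetical.

```python
import random

def smote(minority, n_synthetic, rng=None):
    """Minimal SMOTE sketch: synthesize minority samples by interpolating
    a randomly chosen sample toward its nearest minority-class neighbour."""
    rng = rng or random.Random(0)

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    synthetic = []
    for _ in range(n_synthetic):
        x = rng.choice(minority)
        neighbour = min((p for p in minority if p is not x),
                        key=lambda p: dist2(x, p))
        gap = rng.random()  # random position on the segment x -> neighbour
        synthetic.append(tuple(xi + gap * (ni - xi)
                               for xi, ni in zip(x, neighbour)))
    return synthetic

minority = [(1.0, 1.0), (1.2, 0.9), (0.9, 1.1)]  # hypothetical positive cases
print(smote(minority, 5))
```

Each synthetic point lies on a segment between two existing minority points, so the oversampled class stays within its original region of feature space; the paper's contribution is tuning SMOTE's parameters (e.g. amount of oversampling) with swarm algorithms, which is omitted here.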
Procedia PDF Downloads 441
24299 Convergence and Stability in Federated Learning with Adaptive Differential Privacy Preservation
Authors: Rizwan Rizwan
Abstract:
This paper provides an overview of Federated Learning (FL) and its application in enhancing data security, privacy, and efficiency. FL utilizes three distinct architectures to ensure privacy is never compromised. It involves training individual edge devices and aggregating their models on a server without sharing raw data. This approach not only provides secure models without data sharing but also offers a highly efficient privacy-preserving solution with improved security and data access. We also discuss various frameworks used in FL and its integration with machine learning, deep learning, and data mining. To address the challenges of multi-party collaborative modeling scenarios, we briefly review an FL scheme that combines an adaptive gradient descent strategy with a differential privacy mechanism. The adaptive learning rate algorithm adjusts the gradient descent process to avoid issues such as model overfitting and fluctuations, thereby enhancing modeling efficiency and performance in multi-party computation scenarios. Additionally, to cater to ultra-large-scale distributed secure computing, the research introduces a differential privacy mechanism that defends against various background knowledge attacks.
Keywords: federated learning, differential privacy, gradient descent strategy, convergence, stability, threats
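The train-locally-then-aggregate loop described in this abstract can be sketched as a FedAvg-style round. The one-parameter linear model, learning rate, and optional Gaussian noise (standing in for the differential-privacy mechanism, with noise calibration omitted) are all illustrative assumptions, not the paper's scheme.

```python
import random

def local_update(w, data, lr=0.1, epochs=5):
    """One client's local training: gradient descent on squared error for a
    one-parameter linear model y = w * x (purely illustrative)."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def fed_avg_round(global_w, clients, noise_std=0.0, rng=None):
    """FedAvg-style round: clients train locally on private data; the server
    aggregates a data-size-weighted average of the returned models. The
    optional Gaussian noise sketches a differential-privacy mechanism."""
    rng = rng or random.Random(0)
    updates = [(local_update(global_w, data), len(data)) for data in clients]
    total = sum(n for _, n in updates)
    new_w = sum(w * n for w, n in updates) / total
    return new_w + rng.gauss(0.0, noise_std) if noise_std else new_w

clients = [[(1.0, 2.0), (2.0, 4.0)], [(1.0, 2.1), (3.0, 6.3)]]  # both roughly y = 2x
w = 0.0
for _ in range(10):
    w = fed_avg_round(w, clients)
print(w)  # settles near 2.05 without any client sharing raw data
```

Only model parameters cross the network; the raw (x, y) pairs stay on each client, which is the privacy property the abstract emphasizes.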
Procedia PDF Downloads 30
24298 Data Security in Cloud Storage
Authors: Amir Rashid
Abstract:
Today's world is one of innovation, and cloud computing is becoming an everyday technology, offering remarkable services and features on the go with rapid elasticity. This platform has taken business computing into an innovative dimension where clients interact and operate through service provider web portals. Initially, the trust relationship between client and service provider remained a big question, but with the invention of several cryptographic paradigms, it is becoming common in everyday business. This research work proposes a solution for building a cloud storage service with respect to data security, addressing public cloud infrastructure, where the trust relationship between client and service provider matters a lot. For high client satisfaction regarding data security, this research paper proposes a layer of cryptographic primitives combining several architectures in order to achieve the goal. A survey has been conducted to determine the benefits such an architecture would provide to both clients and service providers, as well as recent developments in cryptography specifically for cloud storage.
Keywords: data security in cloud computing, cloud storage architecture, cryptographic developments, token key
Procedia PDF Downloads 294
24297 Fuzzy Total Factor Productivity by Credibility Theory
Authors: Shivi Agarwal, Trilok Mathur
Abstract:
This paper proposes a method to measure total factor productivity (TFP) change by credibility theory for fuzzy input and output variables. Total factor productivity change has been widely studied with crisp input and output variables; however, in some cases, the input and output data of decision-making units (DMUs) can only be measured with uncertainty. These data can be represented as linguistic variables characterized by fuzzy numbers. The Malmquist productivity index (MPI) is widely used to estimate TFP change by calculating the total factor productivity of a DMU for different time periods using data envelopment analysis (DEA). The fuzzy DEA (FDEA) model is solved using credibility theory, and its results are used to measure the TFP change for fuzzy input and output variables. Finally, numerical examples are presented to illustrate the proposed method. The suggested methodology can be utilized for the performance evaluation of DMUs and can help assess their level of integration. The methodology can also be applied to rank the DMUs, find out which DMUs are lagging behind, and make recommendations as to how they can improve their performance to bring them on par with other DMUs.
Keywords: chance-constrained programming, credibility theory, data envelopment analysis, fuzzy data, Malmquist productivity index
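Once the four distance-function scores are available, the MPI computation mentioned above is a one-line formula. The scores below are illustrative, and solving the DEA linear programs (let alone their fuzzy, credibility-theory version) is omitted from this sketch.

```python
import math

def malmquist_index(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Malmquist productivity index from four DEA distance-function scores.
    d_a_b is the efficiency of period-b data measured against the period-a
    frontier (scores assumed already obtained, e.g. from an FDEA model).
    MPI > 1 indicates total factor productivity growth between the periods."""
    return math.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))

# Illustrative scores, not values from the paper:
print(round(malmquist_index(0.8, 1.0, 0.7, 0.9), 3))  # -> 1.268
```

The geometric mean of the two frontier-relative ratios avoids arbitrarily privileging either period's technology as the benchmark.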
Procedia PDF Downloads 365
24296 What the Future Holds for Social Media Data Analysis
Authors: P. Wlodarczak, J. Soar, M. Ally
Abstract:
The dramatic rise in the use of Social Media (SM) platforms such as Facebook and Twitter provides access to an unprecedented amount of user data. Users may post reviews of products and services they have bought, write about their interests, share ideas, or give their opinions and views on political issues. There is a growing interest among organisations in analysing SM data to detect new trends, obtain user opinions on their products and services, or learn about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Often, indicators of historic SM data are represented as time series and correlated with a variety of real-world phenomena, such as the outcome of elections, the development of financial indicators, box office revenue, and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of analysis methods using opinion mining and machine learning techniques.
Keywords: social media, text mining, knowledge discovery, predictive analysis, machine learning
Procedia PDF Downloads 423
24295 Clustering Locations of Textile and Garment Industries to Compare with the Future Industrial Cluster in Thailand
Authors: Kanogkan Leerojanaprapa
Abstract:
The textile and garment industry used to be a major exporting industry of Thailand. Owing to the loss of the nation's price competitiveness following the ending of the EU's GSP (Generalised Scheme of Preferences) and the 'Nationwide Minimum Wage Policy', under which Thai employers must pay all employees at least 300 baht (about $10) a day, the supply chains of the Thai textile and garment industry are affected and need to be reformed. Whether the Thai textile and garment industries will survive is therefore a concern. It is also a challenge for the government to decide which industries should be promoted as the future industries of Thailand. Recently, the Thai government launched the Cluster-based Special Economic Development Zones Policy for promoting business clusters (effective September 16, 2015). It defines a cluster as the concentration of interconnected businesses and related institutions that operate within the same geographic areas. Textiles and garments form one of the target industrial clusters, and 9 provinces are targeted (Bangkok, Kanchanaburi, Nakhon Pathom, Ratchaburi, Samut Sakhon, Chonburi, Chachoengsao, Prachinburi, and Sa Kaeo). The cluster zones are defined to link a west-east corridor connecting manufacturing sources in Cambodia and Myanmar to Bangkok, which is promoted as a design, sourcing, and trading hub. The Thai government will provide tax and non-tax incentives for targeted industries within the clusters and expects these businesses to relocate to where they can get the most benefit, which will identify the future industrial cluster. This research shows the difference between the current cluster and the future cluster across the target provinces of the textile and garment industry. The current cluster is analysed from secondary data.
The numbers of plants in four categories, Spinning, weaving and finishing of textiles; Manufacture of made-up textile articles, except apparel; Manufacture of knitted and crocheted fabrics; and Manufacture of other textiles, not elsewhere classified, across all 77 provinces are clustered by K-means cluster analysis and hierarchical cluster analysis. In addition, the cluster solution can be confirmed, and the variables that contribute most to the defined clusters can be identified, with an ANOVA test. The results of the analysis assign the 22 provinces in which textile or garment plants are located to 3 clusters. Plants in cluster 1 tend to be large in number; this cluster contains only Bangkok. Plants in cluster 2 tend to be moderate in number; this cluster contains Samut Prakan, Samut Sakhon and Nakhon Pathom. Finally, plants in cluster 3 tend to be small in number; this cluster contains the other 18 provinces. The same methodology can be implemented in other industries for future study.
Keywords: ANOVA, hierarchical cluster analysis, industrial clusters, K-means cluster analysis, textile and garment industry
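The K-means step described in this abstract can be sketched as follows. The plant-count vectors are hypothetical stand-ins for the study's province data, and the deterministic initialization is for illustration only.

```python
def kmeans(points, k, iters=20):
    """Minimal K-means sketch. Initial centroids are the first k points
    (deterministic here; real implementations use better seeding)."""
    centroids = [list(p) for p in points[:k]]
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        for i, p in enumerate(points):
            labels[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2
                                              for a, b in zip(p, centroids[c])))
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [p for i, p in enumerate(points) if labels[i] == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Hypothetical plant counts per province in the four textile categories
provinces = [(120, 90, 80, 60),                       # one dominant hub
             (30, 25, 20, 15), (28, 22, 18, 14),      # moderate provinces
             (2, 1, 0, 1), (1, 0, 1, 0), (3, 2, 1, 1)]  # small provinces
print(kmeans(provinces, 3))  # -> [0, 1, 1, 2, 2, 2]
```

The large/moderate/small grouping mirrors the study's three clusters (Bangkok alone, a few moderate provinces, and the rest); the ANOVA confirmation step is not reproduced here.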
Procedia PDF Downloads 213
24294 Development of Automatic Laser Scanning Measurement Instrument
Authors: Chien-Hung Liu, Yu-Fen Chen
Abstract:
This study used a triangulation laser probe and a three-axis mobile platform for surface measurement and applied them to real-time analytic statistics of different measured data. This structure was used to design a system integration program: using the triangulation laser probe for scattering or reflection non-contact measurement, transferring the captured signals to the computer through RS-232, and using RS-485 to control the three-axis platform for a wide range of measurements. The data captured by the laser probe are formed into a 3D surface. This study constructed an optical measurement application program based on the visual programming language concept. First, the signals are transmitted to the computer through RS-232/RS-485, and then the signals are stored and recorded in a graphic interface in a timely manner. This programming concept analyzes various messages and produces proper presentation graphs and data processing to provide the users with friendly graphic interfaces and data processing state monitoring, and identifies whether the present data are normal. The major functions of the measurement system developed by this study are thickness measurement, statistical process control (SPC), surface smoothness analysis, and analytical calculation of trend lines. A result report can be made and printed promptly. This study measured different heights and surfaces successfully, performed on-line data analysis and processing effectively, and developed a man-machine interface for users to operate.
Keywords: laser probe, non-contact measurement, triangulation measurement principle, statistical process control, LabVIEW
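The triangulation measurement principle named in the keywords reduces to similar triangles: the reflected laser spot's offset on the detector determines the distance to the surface. The geometry values below are illustrative, not the instrument's actual calibration.

```python
def triangulation_distance(focal_mm, baseline_mm, spot_offset_mm):
    """Laser triangulation sketch: by similar triangles, the distance to the
    surface is d = f * b / x, where f is the lens focal length, b the
    laser-to-lens baseline, and x the reflected spot's offset on the detector."""
    return focal_mm * baseline_mm / spot_offset_mm

print(triangulation_distance(20.0, 50.0, 2.0))  # distance in mm
```

The inverse relationship means a nearer surface shifts the spot further across the detector, which is why resolution is best at close range.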
Procedia PDF Downloads 360
24293 An Optimized Association Rule Mining Algorithm
Authors: Archana Singh, Jyoti Agarwal, Ajay Rana
Abstract:
Data mining is an efficient technology to discover patterns in large databases. Association rule mining techniques are used to find correlations between the various item sets in a database, and these correlations are used in decision making and pattern analysis. In recent years, the problem of finding association rules in large datasets has been addressed by many researchers. Various research papers on association rule mining (ARM) are first studied and analyzed to understand the existing algorithms. The Apriori algorithm is the basic ARM algorithm, but it requires many database scans. The DIC algorithm needs fewer database scans but uses a complex lattice data structure. The main focus of this paper is to propose a new optimized algorithm (the Friendly Algorithm) and compare its performance with the existing algorithms. A data set is used to find frequent itemsets and association rules with the help of the existing and proposed algorithms, and it has been observed that the proposed algorithm finds all the frequent itemsets and essential association rules from databases with fewer database scans than the existing algorithms. In the proposed algorithm, optimized data structures are used, namely a graph and an adjacency matrix.
Keywords: association rules, data mining, dynamic item set counting, FP-growth, friendly algorithm, graph
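The support and confidence quantities that all ARM algorithms compute can be sketched with a brute-force miner; the scan-saving optimizations of Apriori, DIC, and the proposed Friendly Algorithm (its graph and adjacency-matrix structures) are deliberately not reproduced here.

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Brute-force frequent-itemset mining: enumerate every candidate itemset
    and keep those whose support (fraction of transactions containing the
    itemset) meets the threshold."""
    items = sorted({i for t in transactions for i in t})
    n = len(transactions)
    frequent = {}
    for size in range(1, len(items) + 1):
        for cand in combinations(items, size):
            support = sum(1 for t in transactions if set(cand) <= t) / n
            if support >= min_support:
                frequent[cand] = support
    return frequent

def confidence(transactions, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent."""
    both = sum(1 for t in transactions if antecedent | consequent <= t)
    ante = sum(1 for t in transactions if antecedent <= t)
    return both / ante

baskets = [{"bread", "milk"}, {"bread", "butter"},
           {"bread", "milk", "butter"}, {"milk"}]
print(frequent_itemsets(baskets, min_support=0.5))
print(confidence(baskets, {"bread"}, {"milk"}))  # 2 of 3 bread baskets contain milk
```

The exponential candidate enumeration is exactly what Apriori's pruning and the paper's graph-based structures aim to avoid; this sketch only defines the quantities being mined.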
Procedia PDF Downloads 421
24292 Polite Request Strategies in Commuter Discourse in Xhosa
Authors: Mawande Dlali
Abstract:
This paper examines the request strategies in commuter discourse involving taxi drivers and passengers in Khayelitsha, as well as the responses to these requests. The present study considers requests in commuter transport as face-threatening acts (FTAs); hence the commuter crew needs to strategically shape their communicative actions to achieve their overall discourse goal of getting passengers to perform actions that are in their own interest with minimum resistance or confrontation. The crew presents itself by using communicative devices that prompt the passengers to evaluate it positively as warm, friendly, and respectful. However, passengers' responses to requests range from compliance to resistance, depending on their interpretation of the speaker's motive and the probable social consequences. Participant observation by the researcher was the main method of collecting examples of requests and responses to them. Unstructured interviews and informal discussions were held with randomly selected taxi drivers and commuters. The findings and explanations presented in this article reveal the predominance of polite requests as speech acts in taxi discourse in Khayelitsha. This research contributes to the contemporary pragmatic study of African languages in urban contexts.
Keywords: face threatening acts, speech acts, request strategies, discourse
Procedia PDF Downloads 164
24291 Failure Statistics Analysis of China’s Spacecraft in Full-Life
Authors: Xin-Yan Ji
Abstract:
Historical spacecraft failure data are very useful for improving spacecraft design and test philosophies and reducing spacecraft flight risk. A study of spacecraft failure data was performed, constituting the most comprehensive statistics on spacecraft in China. 2593 on-orbit failures and 1298 ground failures that occurred on 150 spacecraft launched from 2000 to 2016 were identified and collected, covering navigation satellites, communication satellites, remote sensing, deep space exploration, and manned spaceflight platforms. In this paper, the failures are analyzed to compare different spacecraft subsystems and estimate their impact on the mission; the development of spacecraft in China is then evaluated in terms of design, software, workmanship, management, parts, and materials. Finally, the lessons learned from past years show that electrical and mechanical failures are responsible for the largest share, and the key to reducing in-orbit failures is improved design technology, sufficient redundancy, adequate space environment protection measures, and adequate ground testing.
Keywords: spacecraft anomalies, anomalies mechanism, failure cause, spacecraft testing
Procedia PDF Downloads 117
24290 Advances in Fiber Optic Technology for High-Speed Data Transmission
Authors: Salim Yusif
Abstract:
Fiber optic technology has revolutionized telecommunications and data transmission, providing unmatched speed, bandwidth, and reliability. This paper presents the latest advancements in fiber optic technology, focusing on innovations in fiber materials, transmission techniques, and network architectures that enhance the performance of high-speed data transmission systems. Key advancements include the development of ultra-low-loss optical fibers, multi-core fibers, advanced modulation formats, and the integration of fiber optics into next-generation network architectures such as Software-Defined Networking (SDN) and Network Function Virtualization (NFV). Additionally, recent developments in fiber optic sensors are discussed, extending the utility of optical fibers beyond data transmission. Through comprehensive analysis and experimental validation, this research offers valuable insights into the future directions of fiber optic technology, highlighting its potential to drive innovation across various industries.
Keywords: fiber optics, high-speed data transmission, ultra-low-loss optical fibers, multi-core fibers, modulation formats, coherent detection, software-defined networking, network function virtualization, fiber optic sensors
Procedia PDF Downloads 61
24289 Economic Evaluation of Bowland Shale Gas Wells Development in the UK
Authors: Elijah Acquah-Andoh
Abstract:
The UK has had its fair share of the shale gas revolutionary waves blowing across the global oil and gas industry at present. Although its exploitation is widely agreed to have been delayed, shale gas was looked upon favorably by the UK Parliament when it recognized it as a genuine energy source and granted licenses to industry to search for and extract the resource. Although this represents significant progress, the UK fracking resource must still pass another test before shale gas extraction can be rendered feasible: it must be economically, and sustainably, extractible. Developing unconventional resources is much more expensive and risky, and for shale gas wells, producing in commercial volumes is conditional upon drilling horizontal wells and hydraulic fracturing, techniques which increase Capex. Meanwhile, investment in shale gas development projects is sensitive to gas price and to technical and geological risks. Using a Two-Factor Model, the economics of the Bowland shale wells were analyzed and the operational conditions under which fracking is profitable in the UK were characterized. We find that there is a great degree of flexibility in Opex spending; hence Opex does not pose much of a threat to the fracking industry in the UK. However, we discover that Bowland shale gas wells fail to add value at a gas price of $8/MMBtu. A minimum gas price of $12/MMBtu, at an Opex of no more than $2/Mcf and a Capex of no more than $14.95M, is required to create value within the present petroleum tax regime in the UK fracking industry.
Keywords: capex, economical, investment, profitability, shale gas development, sustainable
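The value thresholds reported above can be illustrated with a simple discounted-cash-flow sketch. The decline rate, well life, discount rate, and initial production figure below are hypothetical assumptions chosen for illustration (they are not from the paper); only the $8 and $12/MMBtu prices, the $2/Mcf Opex, and the $14.95M Capex come from the abstract.

```python
# Hypothetical DCF sketch of a single shale gas well. Decline rate, well
# life, discount rate, and initial production are illustrative assumptions;
# the price/Opex/Capex thresholds are taken from the abstract.

def well_npv(gas_price, opex_per_mcf, capex,
             initial_rate_mcf=1_000_000, decline=0.35,
             years=15, discount_rate=0.10):
    """Net present value ($) of one well; gas_price in $/MMBtu, treated
    here as roughly equal to $/Mcf for pipeline-quality gas."""
    npv = -capex                      # up-front drilling and fracturing cost
    rate = initial_rate_mcf           # first-year production, Mcf
    for t in range(1, years + 1):
        cash_flow = (gas_price - opex_per_mcf) * rate
        npv += cash_flow / (1 + discount_rate) ** t
        rate *= 1 - decline           # simple exponential production decline
    return npv
```

Under these assumed parameters the well destroys value at $8/MMBtu and creates it at $12/MMBtu, matching the abstract's qualitative finding.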
Procedia PDF Downloads 579
24288 Data Science Inquiry to Manage Football Referees’ Careers
Authors: Iñaki Aliende, Tom Webb, Lorenzo Escot
Abstract:
There is concern about the decrease in the number of football referees globally. A study in Spain analyzed the factors affecting a referee's career over the past 30 years through a survey of 758 referees. Results showed the impact of factors such as threats, education, initial vocation, and dependents on a referee's career. To improve the situation, the federation needs to provide better information, support young referees, monitor referees, and raise public awareness of violence toward referees. The study also produced a comprehensive, data-driven model for enhancing officiating policies that can serve other federations seeking to improve referees' careers.
Keywords: data science, football referees, sport management, sport careers, survival analysis
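The survival analysis named in the keywords can be sketched with a plain Kaplan-Meier estimator over right-censored career lengths. The career durations in the usage example are invented for illustration; the paper's survey data are not reproduced here.

```python
def kaplan_meier(durations, events):
    """Kaplan-Meier survival estimate for right-censored career data.
    durations: years officiating; events: 1 if the referee quit (event
    observed), 0 if still active when surveyed (censored).
    Returns (event_times, survival_probabilities)."""
    pairs = sorted(zip(durations, events))
    times, probs = [], []
    s = 1.0
    at_risk = len(pairs)              # referees still "at risk" of quitting
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = 0                         # quits observed exactly at time t
        leaving = 0                   # all subjects leaving the risk set at t
        while i < len(pairs) and pairs[i][0] == t:
            d += pairs[i][1]
            leaving += 1
            i += 1
        if d:                         # survival drops only at event times
            s *= 1 - d / at_risk
            times.append(t)
            probs.append(s)
        at_risk -= leaving
    return times, probs

# Five illustrative careers: quits after 1, 2 and 4 years; two censored
times, probs = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
```

This hand-rolled version keeps the sketch dependency-free; in practice a library estimator would also provide confidence intervals.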
Procedia PDF Downloads 99
24287 Digitization and Economic Growth in Africa: The Role of Financial Sector Development
Authors: Abdul Ganiyu Iddrisu, Bei Chen
Abstract:
Digitization is the process of transforming analog material into digital form, especially for storage and use in a computer. Significant development of information and communication technology (ICT) over the past years has encouraged many researchers to investigate its contribution to promoting economic growth and reducing poverty. Yet the compelling empirical evidence on the effects of digitization on economic growth remains weak, particularly in Africa, because the extant studies that explicitly evaluate the digitization and economic growth nexus are mostly reports and desk reviews. This points to an empirical knowledge gap in the literature. Hypothetically, digitization influences financial sector development, which in turn influences economic growth. Digitization has changed the financial sector and its operating environment: obstacles to accessing finance, for instance physical distance, minimum balance requirements, and low income flows, can be circumvented. Savings have increased, micro-savers have opened bank accounts, and banks are now able to price short-term loans. This has the potential to develop the financial sector. However, empirical evidence on the digitization-financial development nexus is scarce. On the other hand, a number of studies maintain that financial sector development greatly influences the growth of economies. We therefore argue that financial sector development is one of the transmission mechanisms through which digitization affects economic growth. Employing macro country-level data from African countries and using fixed effects, random effects and Hausman-Taylor estimation approaches, this paper contributes to the literature by analysing economic growth in Africa, focusing on the role of digitization and financial sector development. First, we assess how digitization influences financial sector development in Africa. 
From an economic policy perspective, it is important to identify the digitization determinants of financial sector development so that action can be taken to reduce the economic shocks associated with financial sector distortions. This nexus is rarely examined empirically in the literature. Secondly, we examine the effect on economic growth of domestic credit to the private sector and of stock market capitalization as a percentage of GDP, both used to proxy financial sector development. Digitization is represented by the volume of digital/ICT equipment imported, and GDP growth is used to proxy economic growth. Finally, we examine the effect of digitization on economic growth in the light of financial sector development. The following key results were found: first, digitalization propels financial sector development in Africa. Second, financial sector development enhances economic growth. Finally, contrary to our expectation, the results also indicate that digitalization conditioned on financial sector development tends to reduce economic growth in Africa. However, the net effects suggest that digitalization, overall, improves economic growth in Africa. We therefore conclude that digitalization in Africa not only develops the financial sector but also unconditionally contributes to the growth of the continent’s economies.
Keywords: digitalization, financial sector development, Africa, economic growth
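The fixed-effects approach mentioned above can be sketched as a within (demeaning) estimator: subtract each country's time-average from both sides, then run pooled OLS on the demeaned data. The panel below is synthetic, and the variable names are placeholders rather than the paper's actual data.

```python
import numpy as np

def within_estimator(y, X, groups):
    """Fixed-effects (within) estimator: demean y and X by group
    (country), then fit pooled OLS on the demeaned panel."""
    y = np.array(y, dtype=float)      # copies, so callers' arrays survive
    X = np.array(X, dtype=float)
    groups = np.asarray(groups)
    for g in np.unique(groups):
        m = groups == g
        y[m] -= y[m].mean()           # wipe out the country fixed effect
        X[m] -= X[m].mean(axis=0)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic 3-country, 4-period panel: growth = 2.5 * credit + country effect
groups = np.repeat(np.arange(3), 4)
credit = np.arange(12.0).reshape(-1, 1)        # stand-in regressor
country_effect = np.array([5.0, -3.0, 10.0])
growth = 2.5 * credit[:, 0] + country_effect[groups]
beta = within_estimator(growth, credit, groups)
```

Because the country intercepts are swept out by demeaning, the estimator recovers the slope of 2.5 exactly on this noiseless panel.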
Procedia PDF Downloads 140
24286 Towards the Management of Cybersecurity Threats in Organisations
Authors: O. A. Ajigini, E. N. Mwim
Abstract:
Cybersecurity is the protection of computers, programs, networks, and data from attack, damage, unauthorised, unintended access, change, or destruction. Organisations collect, process and store their confidential and sensitive information on computers and transmit this data across networks to other computers. Moreover, the advent of internet technologies has led to various cyberattacks resulting in dangerous consequences for organisations. Therefore, with the increase in the volume and sophistication of cyberattacks, there is a need to develop models and make recommendations for the management of cybersecurity threats in organisations. This paper reports on various threats that cause malicious damage to organisations in cyberspace and provides measures on how these threats can be eliminated or reduced. The paper explores various aspects of protection measures against cybersecurity threats such as handling of sensitive data, network security, protection of information assets and cybersecurity awareness. The paper posits a model and recommendations on how to manage cybersecurity threats in organisations effectively. The model and the recommendations can then be utilised by organisations to manage the threats affecting their cyberspace. The paper provides valuable information to assist organisations in managing their cybersecurity threats and hence protect their computers, programs, networks and data in cyberspace. The paper aims to assist organisations to protect their information assets and data from cyberthreats as part of the contributions toward community engagement.
Keywords: confidential information, cyberattacks, cybersecurity, cyberspace, sensitive information
Procedia PDF Downloads 259
24285 Detectability Analysis of Typical Aerial Targets from Space-Based Platforms
Authors: Yin Zhang, Kai Qiao, Xiyang Zhi, Jinnan Gong, Jianming Hu
Abstract:
In order to achieve effective detection of aerial targets over long distances from space-based platforms, the mechanism of interaction between the radiation characteristics of the aerial targets and the complex scene environment, including sunlight conditions, underlying surfaces, and the atmosphere, is analyzed. A large simulated database of space-based radiance images is constructed, considering several typical aerial targets, target working modes (flight velocity and altitude), illumination and observation angles, background types (cloud, ocean, and urban areas) and sensor spectrums ranging from visible to thermal infrared. The target detectability is characterized by the signal-to-clutter ratio (SCR) extracted from the images. The influence laws of the target detectability are discussed under different detection bands and instantaneous fields of view (IFOV). Furthermore, the optimal center wavelengths and widths of the detection bands are suggested, and the minimum IFOV requirements are proposed. The research can provide theoretical support and scientific guidance for the design of space-based detection systems and on-board information processing algorithms.
Keywords: space-based detection, aerial targets, detectability analysis, scene environment
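The signal-to-clutter ratio used here to characterize detectability is commonly computed as the target/background mean difference divided by the background standard deviation; the paper does not give its exact formula, so the sketch below assumes this common definition, and the tiny "image" is synthetic.

```python
import numpy as np

def signal_to_clutter_ratio(image, target_mask):
    """SCR = |mean(target) - mean(background)| / std(background),
    with the background taken as all pixels outside the target mask."""
    target = image[target_mask]
    background = image[~target_mask]
    return abs(target.mean() - background.mean()) / background.std()

# Synthetic 1-D "image": two bright target pixels on a fluctuating background
image = np.array([5.0, 0.0, 2.0, 0.0, 2.0, 0.0, 2.0, 5.0])
mask = np.zeros(8, dtype=bool)
mask[[0, 7]] = True
scr = signal_to_clutter_ratio(image, mask)   # background mean 1, std 1
```

A higher SCR at a given band and IFOV means the target stands out more from the clutter, which is exactly the quantity the band-selection analysis optimizes.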
Procedia PDF Downloads 144
24284 Environmental Benefits of Corn Cob Ash in Lateritic Soil Cement Stabilization for Road Works in a Sub-Tropical Region
Authors: Ahmed O. Apampa, Yinusa A. Jimoh
Abstract:
The potential economic viability and environmental benefits of using a biomass waste, such as corn cob ash (CCA), as a pozzolan in stabilizing soils for road pavement construction in a sub-tropical region were investigated. Corn cob was obtained from Maya in South West Nigeria and processed into ash with characteristics similar to the Class C Fly Ash pozzolan specified in ASTM C618-12. This was then blended with ordinary Portland cement (OPC) in CCA:OPC ratios of 1:1, 1:2 and 2:1. Each of these blends was then mixed with a lateritic soil of AASHTO classification A-2-6(3) in percentages varying from 0 – 7.5% at 1.5% intervals. The soil-CCA-cement mixtures were thereafter tested for geotechnical index properties, including the BS Proctor Compaction, California Bearing Ratio (CBR) and Unconfined Compression Strength tests. The tests were repeated for a soil-cement mix without any CCA blending. The costs of the binder inputs and the optimal blends of CCA:OPC in the stabilized soil were thereafter analyzed by developing algorithms that relate the experimental strength data (Unconfined Compression Strength, UCS, and California Bearing Ratio, CBR) to the bivariate independent variables of CCA and OPC content, using Matlab R2011b. An optimization problem was then set up minimizing the cost of chemical stabilization of laterite with CCA and OPC, subject to the constraints of minimum strength specifications. The Evolutionary Engine and the Generalized Reduced Gradient option of the MS Excel 2010 Solver were used separately on the cells to obtain the optimal CCA:OPC blend. The optimal blend attaining the required strength of 1800 kN/m² was determined for the 1:2 CCA:OPC blend as a 5.4% mix (OPC content 3.6%), compared with 4.2% for the OPC-only option, and for the 1:1 blend as a 6.2% mix (OPC content 3%). The 2:1 blend did not attain the required strength, though over a 100% gain in UCS value was obtained over the control sample with 0% binder. 
Given that 0.97 tonne of CO2 is released for every tonne of cement used (OEE, 2001), the reduced OPC requirement to attain the same result indicates that, going by the results of this study, the construction industry's net CO2 contribution to the environment could be reduced by 14 – 28.5% if CCA:OPC blends were widely used in soil stabilization. The paper concludes by recommending that Nigeria and other developing countries in the sub-tropics with an abundant stock of biomass waste should look in the direction of intensifying the use of biomass waste as fuel and of the derived ash for the production of pozzolans for road works, thereby reducing overall greenhouse gas emissions in compliance with the objectives of the United Nations Framework Convention on Climate Change.
Keywords: corn cob ash, biomass waste, lateritic soil, unconfined compression strength, CO2 emission
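The constrained minimization described above (an Excel Solver run over a fitted strength surface) can be sketched with a plain grid search. The linear UCS model and the unit costs below are hypothetical stand-ins for the paper's fitted Matlab relations; only the 1800 kN/m² strength target comes from the abstract.

```python
def cheapest_blend(ucs_model, cost_cca, cost_opc,
                   target_ucs=1800.0, step=0.1, max_pct=8.0):
    """Grid-search the (CCA%, OPC%) plane for the cheapest binder blend
    whose predicted UCS meets the minimum strength specification.
    Returns (cost, cca_pct, opc_pct) or None if infeasible."""
    best = None
    n = int(max_pct / step) + 1
    for i in range(n):
        for j in range(n):
            cca, opc = i * step, j * step
            if ucs_model(cca, opc) >= target_ucs:   # strength constraint
                cost = cost_cca * cca + cost_opc * opc
                if best is None or cost < best[0]:
                    best = (cost, cca, opc)
    return best

# Hypothetical fitted surface: UCS (kN/m^2) = 145*CCA% + 430*OPC%, with
# OPC assumed ten times as expensive per percentage point as CCA.
best = cheapest_blend(lambda cca, opc: 145 * cca + 430 * opc,
                      cost_cca=1.0, cost_opc=10.0)
```

With OPC far pricier than CCA, the search pushes CCA to its ceiling and uses the minimum OPC that still satisfies the strength constraint, mirroring the paper's finding that pozzolan blending cuts the cement requirement.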
Procedia PDF Downloads 373
24283 Atmospheric Oxidation of Carbonyls: Insight to Mechanism, Kinetic and Thermodynamic Parameters
Authors: Olumayede Emmanuel Gbenga, Adeniyi Azeez Adebayo
Abstract:
Carbonyls are the first-generation products of the tropospheric degradation reactions of volatile organic compounds (VOCs). This computational study examined the mechanism of removal of carbonyls from the atmosphere via the hydroxyl radical. The kinetics of the reactions were computed from the activation parameters (the activation enthalpy (ΔH‡) and the Gibbs free energy of activation (ΔG‡)). The minimum energy path (MEP) analysis reveals that in all the molecules, the products are more stable in energy than the reactants, which implies that the forward reaction is more thermodynamically favorable. The hydrogen abstraction of the aromatic aldehydes, especially those without methyl substituents, is more kinetically favorable than that of the other aldehydes, in the order aromatic (without methyl, or with meta methyl) > alkene (short chain) > diene > long-chain aldehydes. The activation energy is much lower for the forward reactions than for the backward ones, indicating that the forward reactions are kinetically favored over their backward counterparts. In terms of thermodynamic stability, the aromatic compounds are found to be less favorable in comparison to the aliphatic ones. The study concludes that the chemistry of the carbonyl bond of the aldehyde changes significantly from the reactants to the products.
Keywords: atmospheric carbonyls, oxidation, mechanism, kinetic, thermodynamic
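The link between a computed ΔG‡ and a rate constant is the Eyring equation of transition-state theory, k = (k_B·T/h)·exp(−ΔG‡/RT). The sketch below is generic, and the example barriers in the test are illustrative values, not the paper's computed ones.

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # molar gas constant, J/(mol*K)

def eyring_rate(delta_g_act_kj_mol, temp_k=298.15):
    """Transition-state-theory rate constant (s^-1) from the Gibbs free
    energy of activation (kJ/mol) at the given temperature."""
    return (KB * temp_k / H) * math.exp(
        -delta_g_act_kj_mol * 1000.0 / (R * temp_k))
```

A lower barrier gives an exponentially faster reaction, which is the sense in which the abstract calls the aromatic-aldehyde hydrogen abstraction more kinetically favorable.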
Procedia PDF Downloads 50
24282 Non-Singular Gravitational Collapse of a Homogeneous Scalar Field in Deformed Phase Space
Authors: Amir Hadi Ziaie
Abstract:
In the present work, we revisit the collapse process of a spherically symmetric homogeneous scalar field (in an FRW background) minimally coupled to gravity, when phase-space deformations are taken into account. Such a deformation is mathematically introduced as a particular type of noncommutativity between the canonical momenta of the scale factor and of the scalar field. In the absence of such deformation, the collapse culminates in a spacetime singularity. However, when the phase space is deformed, we find that the singularity is removed by a non-singular bounce, beyond which the collapsing cloud re-expands to infinity. More precisely, for negative values of the deformation parameter, we identify the appearance of a negative pressure, which decelerates the collapse and finally prevents the singularity from forming. While in the un-deformed case the horizon curve monotonically decreases to finally cover the singularity, in the deformed case the horizon reaches a minimum value that depends on the deformation parameter and on the initial configuration of the collapse. Such a setting predicts a threshold mass for black hole formation in stellar collapse and manifests the role of non-commutative geometry in physics, especially in stellar collapse and supernova explosions.
Keywords: gravitational collapse, non-commutative geometry, spacetime singularity, black hole physics
Procedia PDF Downloads 344
24281 Programming without Code: An Approach and Environment to Conditions-On-Data Programming
Authors: Philippe Larvet
Abstract:
This paper presents the concept of an object-based programming language in which tests (if... then... else) and control structures (while, repeat, for...) disappear and are replaced by conditions on data. Following the object paradigm, data are still embedded inside objects, as variable-value couples, but object methods are expressed in the form of logical propositions (‘conditions on data’, or CODs). For instance: variable1 = value1 AND variable2 > value2 => variable3 = value3. Implementing this approach, a central inference engine runs over the objects, examining them one after another and collecting the CODs of each object. CODs are treated as rules in a rule-based system: the left part of each proposition (the left side of the ‘=>‘ sign) is the premise and the right part is the conclusion. Premises are evaluated, conclusions are fired, the fired conclusions modify the variable-value couples of the object, and the engine moves on to examine the next object. The paper develops the principles of writing CODs instead of complex algorithms. Through samples, the paper also presents several hints for implementing a simple mechanism able to process this ‘COD language’. The proposed approach can be used within the context of simulation, process control, industrial systems validation, etc. By writing simple and rigorous conditions on data, instead of using classical, long-to-learn languages, engineers and specialists can easily simulate and validate the functioning of complex systems.
Keywords: conditions on data, logical proposition, programming without code, object-oriented programming, system simulation, system validation
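A minimal sketch of the mechanism described above, assuming CODs are stored as premise/conclusion pairs over each object's variable-value couples; the class names, the rule encoding as Python callables, and the tank example are illustrative assumptions, not the paper's actual environment.

```python
class CodObject:
    """An object whose behaviour is a list of conditions-on-data (CODs):
    each rule is a (premise, conclusion) pair over the object's data."""
    def __init__(self, data, rules):
        self.data = data      # variable-value couples
        self.rules = rules    # list of (premise, conclusion) callables

def run_engine(objects, max_passes=100):
    """Central inference engine: examine objects one after another,
    evaluate each COD premise, fire its conclusion on a match, and stop
    once a full pass over all objects changes no data."""
    for _ in range(max_passes):
        changed = False
        for obj in objects:
            for premise, conclusion in obj.rules:
                if premise(obj.data):
                    before = dict(obj.data)
                    conclusion(obj.data)      # conclusion mutates the data
                    if obj.data != before:
                        changed = True
        if not changed:
            return

# Example COD: temperature > 100 AND pressure = "high" => state = "alarm"
tank = CodObject(
    {"temperature": 120, "pressure": "high", "state": "normal"},
    [(lambda d: d["temperature"] > 100 and d["pressure"] == "high",
      lambda d: d.update(state="alarm"))],
)
run_engine([tank])
```

The fixed-point loop stands in for the paper's engine cycling over objects: it re-examines every COD until no conclusion changes any variable-value couple, which is the natural termination condition for such a rule-based system.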
Procedia PDF Downloads 221