Search results for: information warfare techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16296

8676 The Design of a Vehicle Traffic Flow Prediction Model for a Gauteng Freeway Based on an Ensemble of Multi-Layer Perceptron

Authors: Tebogo Emma Makaba, Barnabas Ndlovu Gatsheni

Abstract:

The cities of Johannesburg and Pretoria, both located in the Gauteng province, are separated by a distance of 58 km. The traffic queues on the Ben Schoeman freeway, which connects these two cities, can stretch for almost 1.5 km. Vehicle traffic congestion impacts negatively on business and on commuters’ quality of life. The goal of this paper is to identify variables that influence the flow of traffic and to design a vehicle traffic prediction model that predicts the traffic flow pattern in advance. The model will enable motorists to make appropriate travel decisions ahead of time. The data used were collected by Mikro’s Traffic Monitoring (MTM). A multi-layer perceptron (MLP) was used on its own to construct the model, and the MLP was also combined with the bagging ensemble method to train on the data. The cross-validation method was used for evaluating the models. The results obtained from the techniques were compared using predictive accuracy and prediction costs. The cost was computed using a combination of the loss matrix and the confusion matrix. The prediction models designed show that the status of the traffic flow on the freeway can be predicted using the following parameters: travel time, average speed, traffic volume and day of month. The implication of this work is that commuters will be able to spend less time travelling on the route and more time with their families. The logistics industry will save more than twice what it is currently spending.
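
As an illustration of the modelling pipeline described above, the following minimal sketch combines a multi-layer perceptron with a bagging ensemble in scikit-learn and evaluates it through cross-validation, a confusion matrix and a loss matrix; the features, labels and loss values are illustrative assumptions, not the MTM dataset.

```python
# Hypothetical sketch: MLP base learner inside a bagging ensemble, evaluated with
# k-fold cross-validation, a confusion matrix and an expected-cost computation.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

# Illustrative features: travel time, average speed, traffic volume, day of month
rng = np.random.default_rng(0)
X = rng.random((500, 4))
y = rng.integers(0, 2, 500)            # 0 = free flow, 1 = congested (assumed labels)

base = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
model = BaggingClassifier(estimator=base, n_estimators=10, random_state=0)  # 'base_estimator' in older scikit-learn

y_pred = cross_val_predict(model, X, y, cv=5)   # 5-fold cross-validation
cm = confusion_matrix(y, y_pred)

# Expected cost: element-wise product of the confusion matrix and an assumed loss matrix
loss = np.array([[0.0, 1.0],    # cost of calling free flow congested
                 [5.0, 0.0]])   # cost of calling congestion free flow
print(cm)
print("expected cost:", (cm * loss).sum())
```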

Keywords: bagging ensemble methods, confusion matrix, multi-layer perceptron, vehicle traffic flow

Procedia PDF Downloads 339
8675 Breaking the Barrier of Service Hostility: A Lean Approach to Achieve Operational Excellence

Authors: Mofizul Islam Awwal

Abstract:

Due to globalization, industries are growing rapidly throughout the world, which has led to the emergence of many manufacturing organizations. More recently, service industries have begun to emerge in large numbers in almost all parts of the world, including some developing countries. In this context, organizations need a strong competitive advantage over their rivals to achieve their strategic business goals. Manufacturing industries adopt many methods and techniques in order to achieve such a competitive edge. Over the last decades, manufacturing industries have successfully practiced the lean concept to optimize their production lines. Due to its huge success in the manufacturing context, lean has made its way into the service industry. Very little attention has been paid to services in the area of operations management, and service industries lag far behind manufacturing industries in terms of operations improvement. Transferring the lean concept from the production floor to the service back and front office is a demanding task, but one that can yield considerable improvement. Service processes are not as visible as production processes and can be very complex. The lack of research in this area has made lean adoption difficult for service industries, as there are no standardized frameworks for successfully implementing the lean concept in a service organization. The purpose of this research paper is to capture the present state of the service industry in terms of lean implementation. A thorough analysis of past literature will be carried out on the applicability and understanding of lean in the service structure. Research papers will be classified, and critical factors will be unveiled for implementing lean in the service industry to achieve operational excellence.

Keywords: lean service, lean literature classification, lean implementation, service industry, service excellence

Procedia PDF Downloads 372
8674 Beating Heart Coronary Artery Bypass Grafting on Intermittent Pump Support

Authors: Sushil Kumar Singh, Vivek Tewarson, Sarvesh Kumar, Shobhit Kumar

Abstract:

Objective: ‘Beating heart coronary artery bypass grafting on intermittent pump support’ is a more reliable method of coronary revascularization that takes advantage of both off-pump and on-pump CABG while eliminating the disadvantages of the two techniques. Methods: From January 2015 to December 2021, a new technique, “intermittent on-pump beating heart CABG”, using a suction stabilizer was performed by electively placing aortic and venous cannulas in all patients. Patients were supported by the pump intermittently, as and when required (Group 1, n=254). Retrospective data were collected from our records of patients who underwent elective off-pump CABG by the same surgeon and team (Group 2, n=254). Results: A significant advantage was noted in Group 1 patients in terms of the number of grafts (3.31 ± 1.16 vs. 2.30 ± 0.66), grafting of lateral vessels (316 vs. 202), mean operating time (1.37 ± 0.23 hrs vs. 2.22 ± 0.45 hrs) and postoperative blood loss (406.30 ± 257.90 ml vs. 567.41 ± 265.20 ml). CPB support time was less than 15 minutes in the majority of patients (n=179, 70.37%), with a mean of 16.81 minutes, and was required particularly during the grafting of lateral vessels. A rise in enzyme levels (CRP, CKMB, Trop I, and NT-proBNP) was noted in Group 1 patients, but this did not affect their postoperative course. There was no mortality in Group 1, while four patients in Group 2 died. Conclusions: The intermittent on-pump CABG technique is a promising method of surgical revascularization for all patients requiring CABG. It has shown its superiority in terms of safety, number of grafts, operating time, and a better perioperative course.

Keywords: cardiopulmonary bypass, CABG, beating heart CABG, on-pump CABG

Procedia PDF Downloads 114
8673 Selective Oxidation of 6Mn-2Si Advanced High Strength Steels during Intercritical Annealing Treatment

Authors: Maedeh Pourmajidian, Joseph R. McDermid

Abstract:

Advanced high strength steels (AHSS) are revolutionizing both the steel and automotive industries due to their high specific strength and ability to absorb energy during crash events. This allows manufacturers to design vehicles with significantly increased fuel efficiency without compromising passenger safety. To maintain the structural integrity of fabricated parts, they must be protected from corrosion damage through the continuous hot-dip galvanizing process, which is challenging due to selective oxidation of Mn and Si on the surface of these AHSS. The effects of process atmosphere oxygen partial pressure and small additions of Sn on the selective oxidation of a medium-Mn C-6Mn-2Si advanced high strength steel were investigated. Intercritical annealing heat treatments were carried out at 690˚C in an N2-5%H2 process atmosphere under dew points ranging from –50˚C to +5˚C. Surface oxide chemistries, morphologies, and thicknesses were determined at a variety of length scales by several techniques, including SEM, TEM+EELS, and XPS. TEM observations of the sample cross-sections revealed the transition to internal oxidation at the +5˚C dew point. EELS results suggested that the internal oxide network was composed of a multi-layer oxide structure whose chemistry varies from the oxide core towards the outer part. The combined effect of employing a known surface-active element and of the process atmosphere on surface structure development, and the possible impact on reactive wetting of the steel substrates by the continuous galvanizing zinc bath, will be discussed.

Keywords: 3G AHSS, hot-dip galvanizing, oxygen partial pressure, selective oxidation

Procedia PDF Downloads 394
8672 Impact of Television on the Coverage of Lassa Fever Disease in Nigeria

Authors: H. Shola Adeosun, F. Ajoke Adebiyi

Abstract:

This study appraises the impact of television on the coverage of Lassa fever disease. The objectives of the study are to find out whether television is an effective tool for raising awareness about Lassa fever and whether it shapes the perception of members of the public. The research work was based on the theoretical foundation of agenda-setting and reinforcement theory. A survey research method was adopted in the study to elicit data from the residents of Obafemi Owode Local Government Area of Ogun State. A questionnaire and oral interviews were adopted as tools for data gathering, and simple random sampling techniques were used to draw the sample for this study. Of the 400 questionnaires distributed to the respondents, 37 were incorrectly filled, and the remainder (about 92.5%) were returned at the stipulated time. Tables, percentages, and figures were used to analyse and interpret the data. Hypothesis testing for this study revealed that Lassa fever cases with higher media coverage were considered more serious and more representative of the disease, and were estimated to have lower incidence, than diseases less frequently found in the media. Thus, 92% of the respondents agree that their access to television coverage of Lassa fever disease led to exaggerated perceptions of personal vulnerability. The study therefore concludes that there is a need for relevant stakeholders to ensure better community health education and improved housing conditions in southwestern Nigeria, with an emphasis on slum areas, and that Nigeria needs to focus on the immediate response while preparing for the future, because a society or community is defined by the people who inhabit it; every effort must therefore be geared towards their welfare and survival.

Keywords: impact, television, coverage, Lassa fever disease

Procedia PDF Downloads 209
8671 Depositional Environment and Source Potential of Devonian Source Rock, Ghadames Basin, Southern Tunisia

Authors: S. Mahmoudi, A. Belhaj Mohamed, M. Saidi, F. Rezgui

Abstract:

The depositional environment and source potential of the different organic-rich levels of Devonian age (up to 990 m thick) from the onshore EC-1 well (southern Tunisia) were investigated using different geochemical techniques (Rock-Eval pyrolysis, GC-MS) applied to more than 130 cutting samples. The results obtained, including Rock-Eval pyrolysis data and biomarker distributions (terpanes, steranes and aromatics), have been used to describe the depositional environment and to assess the thermal maturity of the Devonian organic matter. They show that the Emsian deposits exhibit poor to fair TOC contents. The associated organic matter is composed of mixed kerogen (type II/III), as indicated by the predominance of C29 steranes over their C27 and C28 homologues, and was deposited in a slightly reducing environment favoring organic matter preservation. Thermal maturity assessed from Tmax, TNR and MPI-1 values shows a mature stage of organic matter. The Middle Devonian (Eifelian) shales are rich in type II organic matter that was deposited in an open marine depositional environment. The TOC values are high and vary between 2 and 7%, indicating a good to excellent source rock. The relatively high HI values (reaching 547 mg HC/g TOC) and the low values of the t19/t23 ratio (down to 0.2) confirm the marine origin of the organic matter (type II). During the Upper Devonian, the organic matter was deposited under variable redox conditions, oxic to suboxic, as clearly indicated by the low C35/C34 hopane ratio; it is immature to marginally mature, with vitrinite reflectance ranging from 0.5 to 0.7 %Ro, Tmax values of 426°C-436°C, and TOC values ranging between 0.8% and 4%.

Keywords: biomarker, depositional environment, devonian, source rock

Procedia PDF Downloads 472
8670 Asymptotic Analysis of the Viscous Flow through a Pipe and the Derivation of the Darcy-Weisbach Law

Authors: Eduard Marusic-Paloka

Abstract:

The Darcy-Weisbach formula is used to compute the pressure drop of the fluid in a pipe due to friction against the wall. Because of its simplicity, the Darcy-Weisbach formula became widely accepted by engineers and is used for laminar as well as turbulent flows through pipes, once a method to compute the somewhat mysterious friction coefficient became available, particularly in the second half of the 20th century. The formula is empirical, and our goal is to derive it from the basic conservation laws via rigorous asymptotic analysis. We consider the case of laminar flow, but with a significant Reynolds number. In the case of a perfectly smooth pipe, the situation is trivial, as the Navier-Stokes system can be solved explicitly via the Poiseuille formula, leading to a friction coefficient of the form 64/Re. For a rough pipe, the situation is more complicated, and some effects of the roughness appear in the friction coefficient. We start from the Navier-Stokes system in a pipe with a periodically corrugated wall and derive an asymptotic expansion for the pressure and for the velocity. We use homogenization techniques and boundary layer analysis. The approximation derived by formal analysis is then justified by a rigorous error estimate in the norm of the appropriate Sobolev space, using the energy formulation and classical a priori estimates for the Navier-Stokes system. Our method leads to a formula for the friction coefficient. The formula involves the resolution of appropriate boundary layer problems, namely boundary value problems for the Stokes system in an infinite band, which need to be solved numerically. However, a theoretical analysis characterising their nature can be done without solving them.
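
For reference, the Darcy-Weisbach relation and the smooth-pipe laminar friction factor mentioned above can be written as

```latex
\Delta p = f_D \,\frac{L}{D}\,\frac{\rho\,\bar{v}^{\,2}}{2},
\qquad
f_D = \frac{64}{\mathrm{Re}} \quad \text{(laminar flow, smooth pipe)},
\qquad
\mathrm{Re} = \frac{\rho\,\bar{v}\,D}{\mu},
```

where Δp is the pressure drop over a pipe of length L and diameter D, ρ the fluid density, v̄ the mean velocity and μ the dynamic viscosity; for a rough pipe the friction coefficient acquires roughness-dependent corrections, which is what the asymptotic analysis quantifies.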

Keywords: Darcy-Weisbach law, pipe flow, rough boundary, Navier law

Procedia PDF Downloads 349
8669 Biophysically Motivated Phylogenies

Authors: Catherine Felce, Lior Pachter

Abstract:

Current methods for building phylogenetic trees from gene expression data consider mean expression levels. With single-cell technologies, we can leverage more information about cell dynamics by considering the entire distribution of gene expression across cells. Using biophysical modeling, we propose a method for constructing phylogenetic trees from scRNA-seq data, building on Felsenstein's method of continuous characters. This method can highlight genes whose level of expression may be unchanged between species, but whose rates of transcription/decay may have evolved over time.

Keywords: phylogenetics, single-cell, biophysical modeling, transcription

Procedia PDF Downloads 41
8668 Analysis of Long-term Results After External Dacryocystorhinostomy Surgery in Patients Suffering from Diabetes Mellitus

Authors: N. Musayeva, N. Rustamova, N. Bagirov, S. Ibadov

Abstract:

Purpose: To analyze the long-term results of external dacryocystorhinostomy (DCR), which remains the preferred primary procedure in the surgical treatment of lacrimal duct obstruction in chronic dacryocystitis. Methodology: The long-term results (after 3 years) of external DCR performed on 90 patients (90 eyes) with chronic dacryocystitis from 2018 to 2020 at the Azerbaijan National Center of Ophthalmology named after acad. Zarifa Aliyeva were evaluated. Fifteen of the patients were men and 75 were women; the average age was 45±3.2 years. Surgical operations were performed under local anesthesia. All patients had suffered from diabetes mellitus for more than 3 years. All patients underwent external DCR, and a silicone drainage tube was implanted. In the postoperative period (after 3 years), lacrimation, purulent discharge, and the condition of the scar at the operation site were assessed. Results: All patients were under observation for more than 18 months. Overall, the effectiveness of the surgical operation was 93.34%. Recurrence of the disease was observed in 6 patients: in 3 patients (3.33%) the scar at the site of the operation was rough (non-cosmetic), and in 3 patients (3.33%) the surgically formed anastomosis between the lacrimal sac and the nasal bone was obstructed by scar tissue. These patients were reoperated by transcanalicular laser DCR. Conclusion: Despite its long-term (more than a hundred years) use, external DCR remains one of the primary techniques in the surgery of chronic dacryocystitis. Due to the high success rate and good long-term results of DCR in the treatment of chronic dacryocystitis in patients suffering from diabetes mellitus, we recommend external DCR for this group of patients.

Keywords: chronic dacryocystitis, diabetes mellitus, external dacryocystorhinostomy, long-term results

Procedia PDF Downloads 58
8667 Understanding Stock-Out of Pharmaceuticals in Timor-Leste: A Case Study in Identifying Factors Impacting on Pharmaceutical Quantification in Timor-Leste

Authors: Lourenco Camnahas, Eileen Willis, Greg Fisher, Jessie Gunson, Pascale Dettwiller, Charlene Thornton

Abstract:

Stock-out of pharmaceuticals is a common issue at all levels of health services in Timor-Leste, a small post-conflict country. This led to the research questions: what are the current methods used to quantify pharmaceutical supplies, and what factors contribute to the ongoing pharmaceutical stock-out? The study examined factors that influence the pharmaceutical supply chain system. Methodology: The Privett and Goncalvez dependency model was adopted for the design of the qualitative interviews. The model examines pharmaceutical supply chain management at three management levels: management of individual pharmaceutical items, health facilities, and health systems. The interviews were conducted in order to collect information on inventory management, the logistics management information system (LMIS) and the provision of pharmaceuticals. Andersen's behavioural model of healthcare utilization also informed the interview schedule, specifically factors linked to the environment (healthcare system and external environment) and the population (enabling factors). Forty health professionals (bureaucrats, clinicians) and six senior officers from a United Nations agency, a global multilateral agency and a local non-governmental organization were interviewed on their perceptions of the factors (healthcare system/supply chain and wider environment) impacting on stock-out. Additionally, policy documents for the entire healthcare system, along with population data, were collected. Findings: An analysis using Pozzebon's critical interpretation identified a range of difficulties within the system, from poor coordination to failure to adhere to policy guidelines, along with major difficulties with inventory management, quantification, forecasting, and budgetary constraints. A weak logistics management information system and a lack of capacity in inventory management, monitoring and supervision are additional organizational factors that also contributed to the issue. There were various methods of quantification of pharmaceuticals applied in the government sector and in non-governmental organizations. Lack of reliable data is one of the major problems in pharmaceutical provision. The Global Fund has the best quantification methods, fed by consumption data and malaria cases. Other issues that worsen stock-out include political intervention, work ethic and basic infrastructure such as unreliable internet connectivity. Major issues impacting on pharmaceutical quantification have been identified. However, the current data collection identified limitations within the Andersen model, specifically a failure to take account of predictors in the healthcare system and the environment (culture/politics/social). The next steps are: (a) to compare the models used by three non-governmental agencies with the government model; (b) to run the Andersen explanatory model for pharmaceutical expenditure for 2 to 5 drug items used by these three development partners in order to see how it correlates with the present model in terms of quantification and forecasting of needs; (c) to repeat objectives (a) and (b) using the government model; and (d) to draw conclusions about the relative strengths of the models.

Keywords: inventory management, pharmaceutical forecasting and quantification, pharmaceutical stock-out, pharmaceutical supply chain management

Procedia PDF Downloads 238
8666 Risk Analysis of Leaks from a Subsea Oil Facility Based on Fuzzy Logic Techniques

Authors: Belén Vinaixa Kinnear, Arturo Hidalgo López, Bernardo Elembo Wilasi, Pablo Fernández Pérez, Cecilia Hernández Fuentealba

Abstract:

The expanded use of risk assessment in legislative and corporate decision-making has increased the role of expert judgement in providing data for security-related decision-making. Expert judgements are required in most steps of risk assessment: hazard identification, risk estimation, risk evaluation, and analysis of options. This paper presents a fault tree analysis (FTA), i.e., a probabilistic failure analysis, applied to oil leakage in a subsea production system. In standard FTA, the failure probabilities of the components of a system are treated as exact values when evaluating the failure probability of the top event, yet there is a persistent lack of data for estimating the failure probabilities of components in the drilling industry. Fuzzy theory can therefore be used as a solution to this issue. The aim of this paper is to examine the leaks from the Zafiro West subsea oil facility by using fuzzy fault tree analysis (FFTA). As a result, the research has made theoretical and practical contributions to maritime safety and environmental protection. FTA has also traditionally been an effective strategy for identifying hazards in nuclear installations and the power industry.
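
A minimal sketch of the fuzzy fault tree idea described above, assuming triangular fuzzy numbers for the basic-event probabilities; the event names and values are illustrative placeholders, not figures from the Zafiro West analysis.

```python
# Hypothetical sketch: fuzzy fault tree analysis with triangular fuzzy numbers.
# A triangular fuzzy probability is (low, mode, high); AND/OR gates are evaluated
# vertex-wise, the usual approximation for these monotone expressions.

def fuzzy_and(a, b):
    """P(A and B) for independent events, applied to each vertex of the triangle."""
    return tuple(x * y for x, y in zip(a, b))

def fuzzy_or(a, b):
    """P(A or B) = 1 - (1 - pA)(1 - pB), applied to each vertex of the triangle."""
    return tuple(1 - (1 - x) * (1 - y) for x, y in zip(a, b))

def defuzzify(t):
    """Centroid of a triangular fuzzy number."""
    return sum(t) / 3.0

# Illustrative basic events for an oil leak (assumed values)
seal_failure   = (1e-4, 5e-4, 1e-3)
valve_failure  = (2e-4, 8e-4, 2e-3)
sensor_failure = (1e-3, 3e-3, 6e-3)

# Top event: leak occurs if (seal fails AND sensor fails to detect) OR valve fails
undetected_seal_leak = fuzzy_and(seal_failure, sensor_failure)
top_event = fuzzy_or(undetected_seal_leak, valve_failure)

print("fuzzy top-event probability:", top_event)
print("crisp (defuzzified) estimate:", defuzzify(top_event))
```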

Keywords: expert judgment, probability assessment, fault tree analysis, risk analysis, oil pipelines, subsea production system, drilling, quantitative risk analysis, leakage failure, top event, off-shore industry

Procedia PDF Downloads 187
8665 Operations Training Using Immersive Technologies: A Development Experience

Authors: A. Aman, S. M. Tang, F. H. Alharrassy

Abstract:

Omanisation was established to increase job opportunities for national employment in the Sultanate of Oman. With half of the population below 25 years of age, the Sultanate is striving to diversify the economy fast enough to meet the burgeoning number of jobseekers annually. On the other hand, training personnel to be competent oil and gas operators and technicians is a difficult task, given Oman's complex reservoir structures and the highly advanced and sophisticated extraction processes used. Coupled with Omanisation, which encourages nationals into the oil and gas sector so as to create sustainable employment for the local population, producing competent manpower became a daunting challenge. Immersive technologies provided the impetus to create a new digital media sector, which provided job opportunities as well as the learning content to enhance competency-based training for the oil and gas sector in the Sultanate. This led to a win-win-win collaboration amongst the government, represented by the Information Technology Authority (ITA), a specialised private sector company (ASM Technologies), jobseekers, and oil and gas organisations. It is also one of the first private-public partnership models in the Information and Communication Technology (ICT) sector in Oman. A pilot phase was conducted for 8 months to develop four virtual applications for training in equipment and process engineering: oil rig familiarisation, a Health, Safety and Environment (HSE) application, a turbine application and the mechanical vapour compressor (MVC) water recycling plant, in order to enhance the competency level of the trainees. The immersive applications were installed in operational settings, which enabled new employees to practice and understand various processes and procedures regarding enhanced oil recovery. Existing employees used the applications to review the working principles in order to carry out troubleshooting scenarios. Concurrently, these applications were developed by local Omani resources within the country, which created job opportunities for jobseekers as well as supporting the establishment of a digital media sector. The purpose of this paper is to discuss how immersive technologies can enhance operational competencies, create jobs and establish a digital media sector in the Sultanate of Oman.

Keywords: immersive, virtual reality, operations training, Omanisation

Procedia PDF Downloads 224
8664 Affective Robots: Evaluation of Automatic Emotion Recognition Approaches on a Humanoid Robot towards Emotionally Intelligent Machines

Authors: Silvia Santano Guillén, Luigi Lo Iacono, Christian Meder

Abstract:

One of the main aims of current social robotics research is to improve robots’ abilities to interact with humans. In order to achieve an interaction similar to that among humans, robots should be able to communicate in an intuitive and natural way and appropriately interpret human affects during social interactions. Similarly to how humans are able to recognize emotions in other humans, machines are capable of extracting information from the various ways humans convey emotions (including facial expression, speech, gesture or text) and using this information for improved human-computer interaction. This can be described as affective computing, an interdisciplinary field that extends into otherwise unrelated fields like psychology and cognitive science and involves the research and development of systems that can recognize and interpret human affects. Leveraging these emotional capabilities by embedding them in humanoid robots is the foundation of the concept of affective robots, which has the objective of making robots capable of sensing the user’s current mood and personality traits and adapting their behavior in the most appropriate manner based on that. In this paper, the emotion recognition capabilities of the humanoid robot Pepper are experimentally explored, based on facial expressions for the so-called basic emotions, as well as how it performs in contrast to other state-of-the-art approaches, using both expression databases compiled in academic environments and real subjects showing posed expressions as well as spontaneous emotional reactions. The experiments’ results show that the detection accuracy amongst the evaluated approaches differs substantially. The introduced experiments offer a general structure and approach for conducting such experimental evaluations. The paper further suggests that the most meaningful results are obtained by conducting experiments with real subjects expressing the emotions as spontaneous reactions.
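
A minimal sketch of the kind of evaluation described above, scoring predicted emotion labels against ground truth with scikit-learn; the labels and predictions are illustrative placeholders, not data from the Pepper experiments.

```python
# Hypothetical sketch: scoring an emotion recognizer on the basic emotions.
from sklearn.metrics import accuracy_score, confusion_matrix

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

# Ground-truth labels and one approach's predictions (illustrative values)
y_true = ["happiness", "anger", "fear", "sadness", "surprise", "happiness", "disgust", "anger"]
y_pred = ["happiness", "anger", "surprise", "sadness", "surprise", "happiness", "anger", "anger"]

print("accuracy:", accuracy_score(y_true, y_pred))
print(confusion_matrix(y_true, y_pred, labels=EMOTIONS))
```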

Keywords: affective computing, emotion recognition, humanoid robot, human-robot-interaction (HRI), social robots

Procedia PDF Downloads 222
8663 Relay-Augmented Bottleneck Throughput Maximization for Correlated Data Routing: A Game Theoretic Perspective

Authors: Isra Elfatih Salih Edrees, Mehmet Serdar Ufuk Türeli

Abstract:

In this paper, an energy-aware method is presented, integrating energy-efficient relay-augmented techniques for correlated data routing with the goal of optimizing bottleneck throughput in wireless sensor networks. The system tackles the dual challenge of throughput optimization while considering sensor network energy consumption. A unique routing metric has been developed to enable throughput maximization while minimizing energy consumption by utilizing data correlation patterns. The paper introduces a game-theoretic framework to address the NP-complete optimization problem inherent in throughput-maximizing correlation-aware routing under energy limitations. By creating an algorithm that blends energy-aware route selection strategies with best-response dynamics, this framework provides a local solution. The suggested technique considerably raises the bottleneck throughput for each source in the network while reducing energy consumption by choosing routes that strike a compromise between throughput enhancement and energy efficiency. Extensive numerical analyses verify the efficiency of the method. The outcomes demonstrate the significant decrease in energy consumption attained by the energy-efficient relay-augmented bottleneck throughput maximization technique, in addition to confirming the anticipated throughput benefits.

Keywords: correlated data aggregation, energy efficiency, game theory, relay-augmented routing, throughput maximization, wireless sensor networks

Procedia PDF Downloads 73
8662 DCT and Stream Ciphers for Improved Image Encryption Mechanism

Authors: T. R. Sharika, Ashwini Kumar, Kamal Bijlani

Abstract:

Encryption is the process of converting crucial information into a form that is unreadable to unauthorized persons. Image security is an important application of encryption that protects all types of images from cryptanalysis. A stream cipher is a fast symmetric-key algorithm which is used to convert plaintext to ciphertext. In this paper, we propose an image encryption algorithm based on the Discrete Cosine Transform and stream ciphers that can improve the compression of images and enhance security. The paper also explains the use of a shuffling algorithm for further enhancing security.
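
A minimal sketch of the kind of pipeline described above: an 8x8 image block is compressed with the discrete cosine transform and the quantized coefficients are encrypted with an RC4 keystream (RC4 is used because it is the stream cipher named in the keywords); the block, quantization step and key are illustrative assumptions.

```python
# Hypothetical sketch: DCT-based compression of an 8x8 image block followed by
# RC4 stream encryption of the serialized coefficients.
import numpy as np
from scipy.fft import dctn, idctn

def rc4_keystream(key: bytes, n: int) -> bytes:
    """Generate n bytes of RC4 keystream (KSA + PRGA)."""
    S = list(range(256))
    j = 0
    for i in range(256):                      # key-scheduling algorithm
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = bytearray(), 0, 0
    for _ in range(n):                        # pseudo-random generation algorithm
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

block = np.arange(64, dtype=float).reshape(8, 8)      # stand-in 8x8 image block
coeffs = dctn(block, norm="ortho")                    # 2-D DCT
quantized = np.round(coeffs).astype(np.int16)         # crude quantization step

plain = quantized.tobytes()
ks = rc4_keystream(b"demo-key", len(plain))
cipher = bytes(p ^ k for p, k in zip(plain, ks))      # XOR with keystream

# Decryption is the same XOR; the inverse DCT recovers the (quantized) block
recovered = np.frombuffer(bytes(c ^ k for c, k in zip(cipher, ks)), dtype=np.int16).reshape(8, 8)
approx_block = idctn(recovered.astype(float), norm="ortho")
```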

Keywords: decryption, DCT, encryption, RC4 cipher, stream cipher

Procedia PDF Downloads 357
8661 Impacts of Aquaculture Farms on the Mangroves Forests of Sundarbans, India (2010-2018): Temporal Changes of NDVI

Authors: Sandeep Thakur, Ismail Mondal, Phani Bhusan Ghosh, Papita Das, Tarun Kumar De

Abstract:

The Sundarbans Reserve Forest of India has been undergoing major transformations in the recent past owing to population pressure and related changes. This has brought about major changes in the spatial landscape of the region, especially in the western parts. This study attempts to assess the impacts of land cover changes on the mangrove habitats. Time-series Landsat imagery was used to analyze the Normalized Difference Vegetation Index (NDVI) patterns over the western parts of the Indian Sundarbans forest in order to assess the health of the mangroves in the region. The images were subjected to land use/land cover (LULC) classification using sub-pixel classification techniques in ERDAS Imagine software, and the changes were mapped. The spatial proliferation of aquaculture farms during the study period was also mapped. A multivariate regression analysis was carried out between the obtained NDVI values and the LULC classes. Similarly, the observed meteorological data sets (time-series rainfall and minimum and maximum temperature) were also statistically related through regression. The study demonstrated the application of NDVI in assessing the environmental status of mangroves, as the relationship between the changes in the environmental variables and the remote sensing based indices facilitates an efficient evaluation of environmental variables, which can be used in coastal zone monitoring and development processes.
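
A minimal sketch of the NDVI computation underlying the analysis above, using numpy on red and near-infrared reflectance arrays; the rasters and threshold are illustrative placeholders, and Landsat band handling and atmospheric correction are omitted.

```python
# Hypothetical sketch: NDVI from red and near-infrared reflectance arrays.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), with the denominator guarded against zero."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-12)

# Stand-in reflectance rasters (values in [0, 1])
rng = np.random.default_rng(1)
red_band = rng.random((100, 100)) * 0.3
nir_band = rng.random((100, 100)) * 0.6

index = ndvi(nir_band, red_band)
dense_canopy = index > 0.5            # illustrative threshold for healthy mangrove
print("mean NDVI:", index.mean(), "dense pixels:", int(dense_canopy.sum()))
```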

Keywords: aquaculture farms, LULC, Mangrove, NDVI

Procedia PDF Downloads 174
8660 Preventing Corruption in Dubai: Governance, Contemporary Strategies and Systemic Flaws

Authors: Graham Brooks, Belaisha Bin Belaisha, Hakkyong Kim

Abstract:

Preventing and/or reducing corruption is a major international problem. This paper, however, specifically focuses on how organisations in Dubai are tackling the problem of money laundering. This research establishes that Dubai has a clear international anti-money laundering framework but suffers from some national weaknesses, such as divergent anti-money laundering working practices, a lack of communication and information sharing, and disparate organisational vested self-interests.

Keywords: corruption, governance, money laundering, prevention, strategies

Procedia PDF Downloads 268
8659 Rehabilitation and Conservation of Mangrove Forest as a Pertamina Corporate Social Responsibility Approach to Preventing Climate Damage in Indonesia

Authors: Nor Anisa

Abstract:

This paper aims to describe the use of conservation and rehabilitation of mangrove forests as an alternative approach to protecting the natural environment, ecosystems and ecology, to community education, and to innovation in the sustainable development of industries such as oil, gas and coal companies. Globalization drives the need for energy sources such as gas, diesel and coal, which are basic needs of human life, while environmental degradation and natural phenomena continue to occur in Indonesia, especially global warming, sea water pollution and the extinction of animal species. Damage to nature in Indonesia is driven by a population explosion, which causes unemployment and the disappearance of residential land, and this in turn encourages the exploitation of nature and the environment. Therefore, Pertamina, as a state-owned oil and gas company, carries out its social responsibility efforts, namely the conservation, rehabilitation and management of mangrove fruit seeds, which will provide an educational effect regarding the benefits of mangrove seed maintenance. The method used in this study is a qualitative method with secondary data collection techniques, where data are taken from verifiable Pertamina activity journals and websites. The conclusions of this paper concern the physical, chemical, biological, social and economic benefits and functions of mangrove forest conservation in Indonesia, and the way such conservation can bring innovation to the company's CSR (Corporate Social Responsibility) in continuing social responsibility in the areas of environmental conservation and social education.

Keywords: mangrove, environmental damage, conservation and rehabilitation, innovation of corporate social responsibility

Procedia PDF Downloads 129
8658 Predicting Wearable Technology Readiness in a South African Government Department: Exploring the Influence of Wearable Technology Acceptance and Positive Attitude

Authors: Henda J Thomas, Cornelia PJ Harmse, Cecile Schultz

Abstract:

Wearables are one of the technologies that will flourish within the fourth industrial revolution and digital transformation arenas, allowing employers to integrate collected data into organisational information systems. The study aimed to investigate whether wearable technology readiness can predict employees’ acceptance of wearing wearables in the workplace. The factors of technology readiness predisposition that predict acceptance and positive attitudes towards wearable use in the workplace were examined. A quantitative research approach was used. The population consisted of 8 081 employees of the South African Department of Employment and Labour (DEL). Census sampling was used, and questionnaires to collect data were sent electronically to all 8 081 employees; 351 questionnaires were received back. The measuring instrument, the Technology Readiness and Acceptance Model (TRAM), was used in this study. Four hypotheses were formulated to investigate the relationship between readiness and acceptance of wearables in the workplace. The results showed consistent prediction of technology acceptance (TA) by the eagerness, optimism, and discomfort scales of technology readiness (TR). The TR scales of optimism and eagerness were consistent positive predictors of the TA scales, while discomfort proved to be a negative predictor for two of the three TA scales. Insecurity was found not to be a predictor of TA. It was recommended that the digital transformation policy of the DEL be revised. Wearables in the workplace should be embraced from the viewpoint of convenience, automation, and seamless integration with the DEL information systems. The empirical contribution of this study lies in the fact that positive attitude emerged as a factor that extends the TRAM. In this study, positive attitude is identified as a new dimension of the TRAM not found in the original TA model and subsequent studies of the TRAM. Furthermore, this study found that Perceived Usefulness (PU) and Behavioural Intention to Use (BIU) could not be separated but formed one factor. The methodological contribution of this study can lead to the development of a Wearable Readiness and Acceptance Model (WRAM). To the best of our knowledge, no author has yet introduced the WRAM into the body of knowledge.

Keywords: technology acceptance model, technology readiness index, technology readiness and acceptance model, wearable devices, wearable technology, fourth industrial revolution

Procedia PDF Downloads 83
8657 Normalized Compression Distance Based Scene Alteration Analysis of a Video

Authors: Lakshay Kharbanda, Aabhas Chauhan

Abstract:

In this paper, an application of the Normalized Compression Distance (NCD) to detect notable scene alterations occurring in videos is presented. Several research groups have been developing methods to perform image classification using NCD, a computable approximation to the Normalized Information Distance (NID), by studying the degree of similarity between images. The timeframes where significant aberrations between the frames of a video have occurred are identified by obtaining a threshold NCD value, using two compressors, LZMA and BZIP2, and by defining scene alterations using Pixel Difference Percentage metrics.
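
A minimal sketch of the NCD computation described above, using Python's built-in lzma and bz2 compressors on two frames serialized as bytes; the frame data are synthetic placeholders.

```python
# Hypothetical sketch: Normalized Compression Distance between two video frames.
# NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
# where C(.) is the compressed length under a chosen compressor.
import bz2
import lzma
import numpy as np

def ncd(x: bytes, y: bytes, compress=lzma.compress) -> float:
    cx, cy, cxy = len(compress(x)), len(compress(y)), len(compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Stand-in grayscale frames (in a real pipeline these would be decoded video frames)
rng = np.random.default_rng(2)
frame_a = rng.integers(0, 256, (120, 160), dtype=np.uint8)
frame_b = frame_a.copy()
frame_b[40:80, 50:110] = 255          # simulate a localized scene change

for name, comp in [("LZMA", lzma.compress), ("BZIP2", bz2.compress)]:
    d = ncd(frame_a.tobytes(), frame_b.tobytes(), comp)
    print(f"{name} NCD: {d:.3f}")     # compare against a chosen threshold
```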

Keywords: image compression, Kolmogorov complexity, normalized compression distance, root mean square error

Procedia PDF Downloads 337
8656 Continuance Commitment of Retail Pharmacist in a Labor Shortage: Results from the Questionnaire Survey

Authors: Shigeaki Mishima

Abstract:

The pharmacist labor shortage has become a long-term problem in Japan. This paper discusses the relationship between organizational commitment and pharmacists' organizational behavior in the context of a labor shortage. Based on a multidimensional view of organizational commitment, affective commitment and continuance commitment are measured. It is suggested that continuance commitment has a unique impact on information-withholding behavior. We also discuss the impact of labor supply and demand on the continuance commitment of retail pharmacists.

Keywords: organizational commitment, pharmacist, labor shortage, professional

Procedia PDF Downloads 403
8655 Hybrid Thresholding Lifting Dual Tree Complex Wavelet Transform with Wiener Filter for Quality Assurance of Medical Image

Authors: Hilal Naimi, Amelbahahouda Adamou-Mitiche, Lahcene Mitiche

Abstract:

The main problem in the area of medical imaging has been image denoising. The most challenging aspect of image denoising is to preserve data-carrying structures such as surfaces and edges in order to achieve good visual quality. Different algorithms with different denoising performances have been proposed in previous decades. More recently, models based on deep learning have shown great promise in outperforming all traditional approaches. However, these techniques are limited by the need for large training sample sizes and by high computational costs. This research proposes a denoising approach based on the LDTCWT (Lifting Dual Tree Complex Wavelet Transform), using hybrid thresholding with a Wiener filter to enhance image quality. The LDTCWT is a lifting-based wavelet transform that produces complex coefficients by employing a dual tree of lifting wavelet filters to obtain the real and imaginary parts. This permits the transform to provide approximate shift invariance and directionally selective filters while reducing the computation time (properties lacking in the classical wavelet transform). To develop this approach, a hybrid thresholding function is modeled by integrating the Wiener filter into the thresholding function.
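
A minimal sketch of the shrinkage step described above: a soft-thresholding function applied to complex wavelet detail coefficients, followed by a Wiener filter on the reconstructed image. This is a generic illustration with numpy/scipy, not the authors' LDTCWT implementation; the subband, noise estimate and threshold rule are assumptions.

```python
# Hypothetical sketch: wavelet-style coefficient shrinkage plus Wiener filtering.
import numpy as np
from scipy.signal import wiener

def soft_threshold(coeffs: np.ndarray, thr: float) -> np.ndarray:
    """Shrink coefficient magnitudes toward zero; works for real or complex arrays."""
    mag = np.abs(coeffs)
    scale = np.maximum(mag - thr, 0.0) / np.maximum(mag, 1e-12)
    return coeffs * scale

# Stand-in detail coefficients of a noisy image subband (complex-valued)
rng = np.random.default_rng(3)
detail = rng.normal(0, 1, (64, 64)) + 1j * rng.normal(0, 1, (64, 64))

sigma = np.median(np.abs(detail)) / 0.6745          # robust noise estimate
thr = sigma * np.sqrt(2 * np.log(detail.size))      # universal threshold
shrunk = soft_threshold(detail, thr)

# After inverse-transforming the shrunk coefficients (not shown), a Wiener filter
# can be applied to the reconstructed image as the final smoothing stage.
reconstructed = np.abs(shrunk)                      # placeholder for the inverse transform
denoised = wiener(reconstructed, mysize=3)
```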

Keywords: lifting wavelet transform, image denoising, dual tree complex wavelet transform, wavelet shrinkage, wiener filter

Procedia PDF Downloads 159
8654 Assessing the Effect of Urban Growth on Land Surface Temperature: A Case Study of Conakry Guinea

Authors: Arafan Traore, Teiji Watanabe

Abstract:

Conakry, the capital city of the Republic of Guinea, has experienced rapid urban expansion and population increase in the last two decades, which has resulted in remarkable local weather and climate change, raised energy demand and pollution, and threatened social, economic and environmental development. In this study, the spatiotemporal variation of the land surface temperature (LST) is retrieved to characterize the effect of urban growth on the thermal environment and to quantify its relationship with two biophysical indices, the normalized difference vegetation index (NDVI) and the normalized difference built-up index (NDBI). Landsat TM and OLI/TIRS data acquired in 1986, 2000 and 2016, respectively, were used for LST retrieval and land use/cover change analysis. A quantitative analysis based on the integration of remote sensing and a geographic information system (GIS) revealed an important increase in the average LST, from 25.21°C in 1986 to 27.06°C in 2000 and 29.34°C in 2016, an average gain in surface temperature of 4.13°C over the 30-year study period. Additionally, a Pearson correlation analysis (r) between LST and the biophysical indices NDVI and NDBI revealed a negative relationship between LST and NDVI and a strong positive relationship between LST and NDBI, which implies that an increase in the NDVI value can reduce LST intensity, while an increase in the NDBI value may strengthen LST intensity in the study area. Although Landsat data were found to be effective in assessing the thermal environment in Conakry, the method needs to be refined with in situ measurements of LST in future studies. The results of this study may assist urban planners, scientists and policy makers concerned about climate variability in making decisions that will enhance sustainable environmental practices in Conakry.
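
A minimal sketch of the correlation analysis described above, computing Pearson's r between per-pixel LST and the NDVI/NDBI indices with scipy; the arrays are synthetic placeholders standing in for the retrieved rasters.

```python
# Hypothetical sketch: Pearson correlation between LST and spectral indices.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
n_pixels = 10_000

# Stand-in per-pixel values; in practice these come from the Landsat-derived rasters
ndvi = rng.uniform(-0.2, 0.8, n_pixels)
ndbi = rng.uniform(-0.5, 0.5, n_pixels)
lst = 28.0 - 4.0 * ndvi + 3.0 * ndbi + rng.normal(0, 0.5, n_pixels)  # synthetic LST in °C

r_ndvi, p_ndvi = pearsonr(lst, ndvi)
r_ndbi, p_ndbi = pearsonr(lst, ndbi)
print(f"LST vs NDVI: r = {r_ndvi:.2f} (p = {p_ndvi:.3g})")
print(f"LST vs NDBI: r = {r_ndbi:.2f} (p = {p_ndbi:.3g})")
```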

Keywords: Conakry, land surface temperature, urban heat island, geography information system, remote sensing, land use/cover change

Procedia PDF Downloads 238
8653 Synthesis of Zeolites from Bauxite and Kaolin: Effect of Synthesis Parameters on Competing Phases

Authors: Bright Kwakye-Awuah, Elizabeth Von-Kiti, Isaac Nkrumah, Baah Sefa-Ntiri, Craig D. Williams

Abstract:

Bauxite and kaolin from the Ghana Bauxite Company mine site were used to synthesize zeolites, with bauxite serving as the alumina source and kaolin as the silica source. Synthesis variations included varying the aging time at constant crystallization time and varying the crystallization time at constant aging time. Characterization techniques such as X-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive X-ray analysis (EDX) and Fourier transform infrared spectroscopy (FTIR) were employed in the characterization of the raw samples as well as the synthesized samples. The results obtained showed that the transformations that occurred and the phase of the resulting products were governed by the aging time, crystallization time, alkaline concentration and Si/Al ratio of the system. Zeolites A, X, Y, analcime, sodalite, and ZK-14 were some of the phases achieved. Zeolite LTA was obtained with short crystallization times of 3, 5, 18 and 24 hours and a maximum aging of 24 hours. Zeolite LSX was synthesized with 24 hr aging followed by 24 hr hydrothermal treatment, whilst zeolite Y crystallized after 48 hr of aging and 24 hr crystallization. Prolonged crystallization time produced a mixed-phase product. Prolonged aging times, on the other hand, did not yield any zeolite, as the sample remained amorphous. Increasing the alkaline content of the reaction mixture above 5 M introduced a sodalite phase in the final product. The properties of the final products were comparable to zeolites synthesized from pure chemical reagents.

Keywords: bauxite, kaolin, aging, crystallization, zeolites

Procedia PDF Downloads 216
8652 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion

Authors: Ali Kazemi

Abstract:

In volatile and interconnected international financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning approaches have made significant strides in forecasting market movements; however, the complex and networked nature of financial data calls for more sophisticated methods. This study presents an approach to financial market prediction that leverages the synergistic potential of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. Our proposed algorithm is designed to forecast the trends of stock market indices and cryptocurrency prices, using a comprehensive dataset spanning January 1, 2015, to December 31, 2023. This period, marked by considerable volatility and transformation in financial markets, provides a solid basis for training and testing our predictive model. Our algorithm integrates diverse data to construct a dynamic financial graph that reflects market intricacies. We collect daily opening, closing, high and low prices for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing critical insights into buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, we integrate key macroeconomic indicators, including interest rates, inflation rates, GDP growth, and unemployment rates, into our model. The GCN component learns the relational patterns among financial instruments, represented as nodes in a comprehensive market graph. Edges in this graph encapsulate relationships based on co-movement patterns and sentiment correlations, enabling the model to grasp the complex network of influences governing market movements. Complementing this, the LSTM component is trained on sequences of the spatio-temporal representation learned by the GCN, enriched with historical price and volume data, allowing it to capture and predict temporal market trends accurately. In a comprehensive evaluation of our GCN-LSTM algorithm on the stock market and cryptocurrency datasets, the model demonstrated superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting day-to-day price movements. The RMSE was 1.2%, underscoring the model's effectiveness in limiting large prediction errors, which is vital in volatile markets. Furthermore, when assessing predictive performance on directional market movements, the model achieved an accuracy rate of 78%, significantly outperforming the benchmark models, which averaged 65%. This degree of accuracy is instrumental for strategies that predict the direction of price moves. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework. Our findings promise to revolutionize investment strategies and risk management practices, offering investors and financial analysts a powerful tool to navigate the complexities of modern financial markets.
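
A minimal sketch of one way such a GCN-LSTM fusion can be wired together in PyTorch: a graph convolution is applied to the asset features at each time step, and the resulting node embeddings are fed as sequences into an LSTM that predicts a next-day return per asset. The layer sizes, adjacency matrix and tensor shapes are illustrative assumptions, not the authors' architecture.

```python
# Hypothetical sketch: GCN over assets at each time step, followed by an LSTM over time.
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):
        # x: (batch, nodes, in_dim); adj_norm: (nodes, nodes), symmetrically normalized
        return torch.relu(self.linear(adj_norm @ x))

class GCNLSTM(nn.Module):
    def __init__(self, n_features: int, gcn_dim: int = 32, lstm_dim: int = 64):
        super().__init__()
        self.gcn = GCNLayer(n_features, gcn_dim)
        self.lstm = nn.LSTM(gcn_dim, lstm_dim, batch_first=True)
        self.head = nn.Linear(lstm_dim, 1)

    def forward(self, x, adj_norm):
        # x: (batch, time, nodes, features)
        b, t, n, f = x.shape
        emb = self.gcn(x.reshape(b * t, n, f), adj_norm).reshape(b, t, n, -1)
        emb = emb.permute(0, 2, 1, 3).reshape(b * n, t, -1)   # one sequence per node
        out, _ = self.lstm(emb)
        pred = self.head(out[:, -1, :])                       # last time step -> prediction
        return pred.reshape(b, n)                             # next-day return per asset

# Illustrative usage: 6 assets, 30 trading days, 5 features (OHLC + volume)
nodes, T, F = 6, 30, 5
adj = torch.rand(nodes, nodes); adj = (adj + adj.T) / 2 + torch.eye(nodes)
deg = adj.sum(dim=1)
adj_norm = adj / torch.sqrt(deg[:, None] * deg[None, :])      # D^-1/2 A D^-1/2

model = GCNLSTM(n_features=F)
x = torch.randn(8, T, nodes, F)                               # batch of 8 windows
print(model(x, adj_norm).shape)                               # torch.Size([8, 6])
```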

Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting

Procedia PDF Downloads 56
8651 Geographic Information System Cloud for Sustainable Digital Water Management: A Case Study

Authors: Mohamed H. Khalil

Abstract:

Water is one of the most crucial elements influencing human lives and development. Notably, over the last few years, GIS has played a significant role in optimizing water management systems, especially after the exponential development of this sector. In this context, the Egyptian government initiated an advanced ‘GIS-Web Based System’. This system is designed to tangibly assist with and optimize the integration of data between the Call Center, Operation and Maintenance, and Laboratory departments. The core of this system is a unified ‘Data Model’ for all the spatial and tabular data of the corresponding departments. The system is professionally built to provide advanced functionalities such as interactive data collection, dynamic monitoring, multi-user editing capabilities, enhanced data retrieval, integrated workflow, different access levels, and correlated information recording/tracking. Notably, this cost-effective system contributes significantly not only to the completeness of the base map (93%) and of the water network (87%) in a highly detailed GIS format and to enhanced customer service performance, but also to reducing operating costs for day-to-day operations (~5-10%). In addition, the proposed system facilitates data exchange between the different departments (Call Center, Operation and Maintenance, and Laboratory), which allows a better understanding and analysis of complex situations. Furthermore, this system has tangibly supported: (i) dynamic monitoring of environmental/water quality indicators (ammonia, turbidity, TDS, sulfate, iron, pH, etc.), (ii) improved effectiveness of the different water departments, (iii) efficient in-depth advanced analysis, (iv) advanced web-reporting tools (daily, weekly, monthly, quarterly, and annually), (v) tangible planning synthesizing spatial and tabular data, and finally (vi) a scalable decision support system. It is worth highlighting that the proposed future plan (second phase) of this system encompasses scalability that will extend to include integration with the Billing and SCADA departments. This scalability will comprise advanced functionalities in association with the existing ones to allow further sustainable contributions.

Keywords: GIS Web-Based, base-map, water network, decision support system

Procedia PDF Downloads 85
8650 Enhancing Seismic Resilience in Colombia's Informal Housing: A Low-cost Retrofit Strategy with Buckling-restrained Braces to Protect Vulnerable Communities in Earthquake-prone Regions

Authors: Luis F. Caballero-castro, Dirsa Feliciano, Daniela Novoa, Orlando Arroyo, Jesús D. Villalba-morales

Abstract:

Colombia faces a critical challenge in seismic resilience due to the prevalence of informal housing, which constitutes approximately 70% of residential structures. More than 10 million Colombians (20% of the population) live in homes susceptible to collapse in the event of an earthquake. This, combined with the fact that 83% of the population is in intermediate and high seismic hazard areas, has brought serious consequences to the country. These consequences became evident during the 1999 Armenia earthquake, which affected nearly 100,000 properties and represented economic losses equivalent to 1.88% of that year's Gross Domestic Product (GDP). Despite previous efforts to reinforce informal housing through methods like externally reinforced masonry walls, alternatives related to seismic protection systems (SPDs), such as Buckling-Restrained Braces (BRB), have not yet been explored in the country. BRBs are reinforcement elements capable of withstanding both compression and tension, making them effective in enhancing the lateral stiffness of structures. In this study, the use of low-cost and easily installable BRBs for the retrofit of informal housing in Colombia was evaluated, considering the economic limitations of the communities. For this purpose, a case study was selected involving an informally constructed dwelling in the country, from which field information on its structural characteristics and construction materials was collected. Based on the gathered information, nonlinear models with and without BRBs were created, and their seismic performance was analyzed and compared through incremental static (pushover) and nonlinear dynamic analyses. In the first analysis, the capacity curve was identified, showcasing the sequence of failure events from initial yielding to structural collapse. In the second case, the model underwent nonlinear dynamic analyses using a set of seismic records consistent with the country's seismic hazard. Based on the results, fragility curves were calculated to evaluate the probability of failure of the informal houses before and after the intervention with BRBs, providing essential information about their effectiveness in reducing seismic vulnerability. The results indicate that low-cost BRBs can significantly increase the capacity of informal housing to withstand earthquakes. The dynamic analysis revealed that retrofitted structures experienced lower displacements and deformations, enhancing the safety of residents and the seismic performance of informally constructed houses. In other words, the use of low-cost BRBs in the retrofit of informal housing in Colombia is a promising strategy for improving structural safety in seismic-prone areas. This study emphasizes the importance of seeking affordable and practical solutions to address seismic risk in vulnerable communities in earthquake-prone regions of Colombia and serves as a model for addressing similar challenges of informal housing worldwide.
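
A minimal sketch of how fragility curves of the kind described above are commonly constructed, fitting a lognormal model to failure counts from incremental dynamic analysis by maximum likelihood with scipy; the intensity levels and failure counts are synthetic placeholders, not results from the study.

```python
# Hypothetical sketch: lognormal fragility curve P(failure | IM) = Phi(ln(IM/theta)/beta),
# fitted by maximum likelihood to binary outcomes from nonlinear dynamic analyses.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

# Synthetic data: spectral accelerations (g) and failures observed at each level
im = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0, 1.2, 1.5])
n_records = 10                                   # ground motions analysed per IM level
n_failures = np.array([0, 0, 1, 2, 4, 5, 7, 8, 9, 10])

def neg_log_likelihood(params):
    theta, beta = params                         # median capacity and dispersion
    p = norm.cdf(np.log(im / theta) / beta)
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(n_failures * np.log(p) + (n_records - n_failures) * np.log(1 - p))

res = minimize(neg_log_likelihood, x0=[0.5, 0.4], bounds=[(1e-3, None), (1e-3, None)])
theta_hat, beta_hat = res.x
print(f"median capacity ~ {theta_hat:.2f} g, dispersion ~ {beta_hat:.2f}")

# The retrofitted and original models would each get their own fitted curve,
# and the probabilities of failure at a given hazard level would be compared.
p_at_06g = norm.cdf(np.log(0.6 / theta_hat) / beta_hat)
print(f"P(failure | Sa = 0.6 g) ~ {p_at_06g:.2f}")
```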

Keywords: buckling-restrained braces, fragility curves, informal housing, incremental dynamic analysis, seismic retrofit

Procedia PDF Downloads 88
8649 Reflecting on Deafblindness: Recommendations for Implementing Effective Strategies

Authors: V. Argyropoulos, M. Nikolaraizi, K. Tanou

Abstract:

There is little available information concerning the cognitive and communicative abilities of the people who are deaf-blind. This mainly stems from the general inadequacy of existing assessment instruments employed with deafblind individuals. Although considerable variability exists with regard to cognitive capacities of the deaf-blind, careful examination of the literature reveals that the majority of these persons suffer from significant deficits in cognitive and adaptive functioning. The few reports available primarily are case studies, narrative program descriptions, or position papers by workers in the field. Without the objective verification afforded by controlled research, specialists in psychology, education, and other rehabilitation services must rely on personal speculations or biases to guide their decisions in the planning, implementation, and evaluation of services to deaf-blind children and adults. This paper highlights the framework and discusses the results of an action research network. The aim of this study was twofold: a) to describe and analyse the different ways in which a student with deafblindness approached a number of developmental issues such as novel tasks, exploration and manipulation of objects, reactions to social stimuli, motor coordination, and quality of play and b) to map the appropriate functional approach for the specific student that could be used to develop strategies for classroom participation and socialization. The persons involved in this collaborative action research scheme were general teachers, a school counsellor, academic staff and student teachers. Rating scales and checklists were used to gather information in natural activities and settings, and additional data were also obtained through interviews with the educators of the student. The findings of this case study indicated that there is a great need to focus on the development of effective intervention strategies. The results showed that the identification of positive reinforcers for this population might represent an important and challenging aspect of behaviour programmes. Finally, the findings suggest that additional empirical work is needed to increase attention to methodological and social validity issues.

Keywords: action research, cognitive and communicative abilities, deafblindness, effective strategies

Procedia PDF Downloads 181
8648 Methane Oxidation to Methanol Catalyzed by Copper Oxide Clusters Supported in MIL-53(Al): A Density Functional Theory Study

Authors: Chun-Wei Yeh, Santhanamoorthi Nachimuthu, Jyh-Chiang Jiang

Abstract:

Reducing greenhouse gases or converting them into fuels and chemicals with added value is vital for the environment. Given the enhanced techniques for hydrocarbon extraction, the catalytic conversion of methane to methanol is particularly intriguing for future applications in vehicle fuels and/or bulk chemicals. Metal-organic frameworks (MOFs) have recently received much attention for the oxidation of methane to methanol. In addition, the biomimetic material particulate methane monooxygenase (pMMO) has been reported to convert methane using copper oxide clusters as active sites. Inspired by these, in this study, we considered the well-known MIL-53(Al) MOF as a support for copper oxide clusters (Cu2Ox, Cu3Ox) to investigate their reactivity towards methane oxidation using Density Functional Theory (DFT) calculations. The copper oxide clusters (Cu2O2, Cu3O2) are modeled by oxidizing copper clusters (Cu2, Cu3) with two oxidizers, O2 and N2O. The initial C-H bond activation barriers on the Cu2O2/MIL-53(Al) and Cu3O2/MIL-53(Al) catalysts are 0.70 eV and 0.64 eV, respectively, and are the rate-determining steps in the overall conversion of methane to methanol. The desorption energy of methanol over Cu2O/MIL-53(Al) and Cu3O/MIL-53(Al) is 0.71 eV and 0.75 eV, respectively. Furthermore, to explore the prospect of catalyst reusability, we considered different oxidants and proposed different reaction pathways for completing the reaction cycle and regenerating the active copper oxide clusters. To understand the difference between the bi-copper and tri-copper systems, we also performed an electronic structure analysis. Finally, we carried out microkinetic simulations, the results of which show that the reaction can proceed at room temperature.

Keywords: DFT study, copper oxide cluster, MOFs, methane conversion

Procedia PDF Downloads 74
8647 Imputing the Minimum Social Value of Public Healthcare: A General Equilibrium Model of Israel

Authors: Erez Yerushalmi, Sani Ziv

Abstract:

The rising demand for healthcare services, without a corresponding rise in public supply, has led to a debate on whether to increase private healthcare provision - especially in hospital services and second-tier healthcare. Proponents of increasing private healthcare highlight gains in efficiency, while opponents highlight its risk to social welfare. Neither side, however, provides a measure of the social value and its impact on the economy in terms of a monetary value. In this paper, we impute a minimum social value of public healthcare that corresponds to indifference between gains in efficiency and losses in social welfare. Our approach resembles contingent valuation methods, which introduce a hypothetical market for non-commodities, but differs from them because we use numerical simulation techniques to exploit certain market failure conditions. In this paper, we develop a general equilibrium model that distinguishes between public-private healthcare services and public-private financing. Furthermore, the social value is modelled as a by-product of healthcare services. The model is then calibrated to our unique health-focused Social Accounting Matrix of Israel, and simulates the introduction of a hypothetical health-labour market - given that it is heavily regulated in the baseline (i.e., the true situation in Israel today). For the baseline parameters, we estimate the minimum social value at around 18% of public healthcare financing. The intuition is that the gain in economic welfare from improved efficiency is offset by the loss in social welfare due to a reduction in the available social value. We furthermore simulate a deregulated healthcare scenario that internalizes the imputed social value and searches for the optimal weight of public and private healthcare provision.

Keywords: contingent valuation method (CVM), general equilibrium model, hypothetical market, private-public healthcare, social value of public healthcare

Procedia PDF Downloads 142