Search results for: porous network
2384 Concept of Automation in Management of Electric Power Systems
Authors: Richard Joseph, Nerey Mvungi
Abstract:
An electric power system comprises generation, transmission, distribution and consumer subsystems. The electrical power network in Tanzania keeps growing larger and more complex by the day, so most utilities have long wished for real-time monitoring and remote control of electrical power system elements such as substations, intelligent devices, power lines, capacitor banks, feeder switches, fault analyzers and other physical facilities. In this paper, the concept of automating the management of power systems from the generation level to the end-user level was studied using the Power System Simulator for Engineering (PSS/E) version 30.3.2.
Keywords: automation, distribution subsystem, generating subsystem, PSS/E, TANESCO, transmission subsystem
Procedia PDF Downloads 674
2383 Cloud-Based Mobile-to-Mobile Computation Offloading
Authors: Ebrahim Alrashed, Yousef Rafique
Abstract:
Mobile devices have drastically changed the way we do things on the move. They are relied on heavily to perform tasks that are analogous to desktop computer capability. There has been a rapid increase in the computational power of these devices; however, battery technology is still the bottleneck of evolution. The primary modern-day approach to tackling this issue is offloading computation to the cloud, which proves to be latency-expensive and requires high network bandwidth. In this paper, we explore efforts to perform barter-based mobile-to-mobile offloading. We define a protocol and present an architecture to facilitate the development of such a system. We further highlight the deployment and security challenges.
Keywords: computational offloading, power conservation, cloud, sandboxing
Procedia PDF Downloads 388
2382 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning
Authors: Yangzhi Li
Abstract:
Network transfer of information and performance customization is now a viable method of digital industrial production in the era of Industry 4.0. Robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor scarcity problem and increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological areas related to robot autonomous recognition, including high-performance computing, physical system modeling, extensive sensor coordination, and deep learning on large datasets, have not yet been fully explored for intelligent construction, and relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous recognition visual guidance technologies improves the robot's grasp of the scene and its capacity for autonomous operation. Intelligent vision guidance technology for industrial robots faces a serious issue with camera calibration, and the use of intelligent visual guidance and identification technologies in industrial production has strict accuracy requirements; visual recognition systems therefore face precision challenges that directly impact the effectiveness and standard of industrial production, necessitating stronger research on positioning precision in recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts utilizing machine learning algorithms is proposed. This study identifies the position of target components by detecting information at the boundaries and corners of a dense point cloud and determining the aspect ratio in accordance with the guidelines for the modularization of building components. To collect and use components, the operational processing system assigns them to the same coordinate system based on their locations and postures. Inclination detection from the RGB image and verification from the depth image are used to determine the component's current posture. Finally, a virtual environment model for the robot's obstacle-avoidance route is constructed using the point cloud information.
Keywords: robotic construction, robotic assembly, visual guidance, machine learning
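A minimal, hypothetical sketch of the aspect-ratio step described above, assuming a NumPy point cloud and made-up modularization thresholds (the function names, rules and synthetic data are illustrative, not the study's implementation):

```python
import numpy as np

def component_aspect_ratio(points):
    """Fit an oriented bounding box via PCA and return its aspect ratio and axes."""
    centered = points - points.mean(axis=0)
    _, _, axes = np.linalg.svd(centered, full_matrices=False)   # principal axes
    proj = centered @ axes.T                                    # box-aligned coordinates
    extents = np.sort(np.ptp(proj, axis=0))[::-1]               # edge lengths, longest first
    return extents[0] / max(extents[1], 1e-9), axes

def classify_by_modular_rules(points, rules=None):
    """Assign a hypothetical component label from the aspect ratio."""
    rules = rules or {"beam": (4.0, float("inf")), "panel": (1.0, 4.0)}
    ratio, _ = component_aspect_ratio(points)
    for label, (lo, hi) in rules.items():
        if lo <= ratio < hi:
            return label, round(float(ratio), 2)
    return "unknown", round(float(ratio), 2)

if __name__ == "__main__":
    cloud = np.random.rand(5000, 3) * np.array([2.0, 0.2, 0.2])  # synthetic beam-like part
    print(classify_by_modular_rules(cloud))
```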
Procedia PDF Downloads 86
2381 Efficacy and Safety of Updated Target Therapies for Treatment of Platinum-Resistant Recurrent Ovarian Cancer
Authors: John Hang Leung, Shyh-Yau Wang, Hei-Tung Yip, Fion, Ho Tsung-chin, Agnes LF Chan
Abstract:
Objectives: Platinum-resistant ovarian cancer has a short overall survival of 9–12 months and limited treatment options. The combination of immunotherapy and targeted therapy appears to be a promising treatment option for patients with ovarian cancer, particularly for patients with platinum-resistant recurrent ovarian cancer (PRrOC). However, there are no direct head-to-head clinical trials comparing their efficacy and toxicity. We, therefore, used a network meta-analysis to directly and indirectly compare seven newer immunotherapies or targeted therapies combined with chemotherapy in platinum-resistant relapsed ovarian cancer, including antibody-drug conjugates, PD-1 (programmed death-1) and PD-L1 (programmed death-ligand 1) inhibitors, PARP (poly ADP-ribose polymerase) inhibitors, TKIs (tyrosine kinase inhibitors), and antiangiogenic agents. Methods: We searched the PubMed (Public/Publisher MEDLINE), EMBASE (Excerpta Medica Database), and Cochrane Library electronic databases for phase II and III trials involving PRrOC patients treated with immunotherapy or targeted therapy plus chemotherapy. The quality of included trials was assessed using the GRADE method. The primary outcome compared was progression-free survival; the secondary outcomes were overall survival and safety. Results: Seven randomized controlled trials involving a total of 2058 PRrOC patients were included in this analysis. Bevacizumab plus chemotherapy showed statistically significant differences in PFS (progression-free survival) but not OS (overall survival) compared with all targeted and immunotherapy regimens of interest; however, according to the heatmap analysis, bevacizumab plus chemotherapy had a statistically significant risk of ≥grade 3 SAEs (severe adverse events), particularly hematological severe adverse events (neutropenia, anemia, leukopenia, and thrombocytopenia). Conclusions: Bevacizumab plus chemotherapy resulted in better PFS than all regimens of interest for the treatment of PRrOC. However, there were statistically significant differences in SAEs, as bevacizumab plus chemotherapy is associated with a greater risk of hematological SAEs.
Keywords: platinum-resistant recurrent ovarian cancer, network meta-analysis, immune checkpoint inhibitors, target therapy, antiangiogenic agents
Procedia PDF Downloads 79
2380 Market Index Trend Prediction using Deep Learning and Risk Analysis
Authors: Shervin Alaei, Reza Moradi
Abstract:
Trading in financial markets is subject to risks due to their high volatility. Here, using an LSTM neural network and some risk-based feature engineering, we developed a method that can accurately predict trends of the Tehran Stock Exchange market index from the data of the preceding few days. Our test results show that the proposed method, with an average prediction accuracy of more than 94%, is superior to other common machine learning algorithms. To the best of our knowledge, this is the first work incorporating deep learning and risk factors to accurately predict market trends.
Keywords: deep learning, LSTM, trend prediction, risk management, artificial neural networks
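To make the approach concrete, here is a hedged sketch of an LSTM trend classifier with a simple risk-based feature (rolling volatility); the data frame column, window length and network size are assumptions, not the authors' configuration:

```python
import numpy as np
import pandas as pd
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def make_dataset(df: pd.DataFrame, window: int = 10):
    """Build windows of (return, rolling volatility) features and next-day trend labels."""
    returns = df["close"].pct_change()
    vol = returns.rolling(window).std()             # simple risk-based feature
    feats = pd.concat([returns, vol], axis=1).dropna().to_numpy()
    X, y = [], []
    for t in range(window, len(feats) - 1):
        X.append(feats[t - window:t])
        y.append(1.0 if feats[t + 1, 0] > 0 else 0.0)   # next-day up/down trend
    return np.array(X), np.array(y)

def build_model(window: int = 10, n_features: int = 2):
    model = Sequential([LSTM(32, input_shape=(window, n_features)),
                        Dense(1, activation="sigmoid")])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# X, y = make_dataset(prices_df)                    # prices_df: hypothetical price history
# build_model().fit(X, y, epochs=20, batch_size=32)
```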
Procedia PDF Downloads 156
2379 Green Wave Control Strategy for Optimal Energy Consumption by Model Predictive Control in Electric Vehicles
Authors: Furkan Ozkan, M. Selcuk Arslan, Hatice Mercan
Abstract:
Electric vehicles are becoming increasingly popular as a sustainable alternative to traditional combustion engine vehicles. However, to fully realize the potential of EVs in reducing environmental impact and energy consumption, efficient control strategies are essential. This study explores the application of green wave control using model predictive control for electric vehicles, coupled with energy consumption modeling using neural networks. The use of MPC allows for real-time optimization of the vehicles' energy consumption while considering dynamic traffic conditions. By leveraging neural networks for energy consumption modeling, the EV's performance can be further enhanced through accurate predictions and adaptive control. The integration of these advanced control and modeling techniques aims to maximize energy efficiency and range while navigating urban traffic scenarios. The findings of this research offer valuable insights into the potential of green wave control for electric vehicles and demonstrate the significance of integrating MPC and neural network modeling for optimizing energy consumption. This work contributes to the advancement of sustainable transportation systems and the widespread adoption of electric vehicles. To evaluate the effectiveness of the green wave control strategy in real-world urban environments, extensive simulations were conducted using a high-fidelity vehicle model and realistic traffic scenarios. The results indicate that the integration of model predictive control and energy consumption modeling with neural networks had a significant impact on the energy efficiency and range of electric vehicles. Through the use of MPC, the electric vehicle was able to adapt its speed and acceleration profile in real time to optimize energy consumption while maintaining travel time objectives. The neural network-based energy consumption modeling provided accurate predictions, enabling the vehicle to anticipate and respond to variations in traffic flow, further enhancing energy efficiency and range. Furthermore, the study revealed that the green wave control strategy not only reduced energy consumption but also improved the overall driving experience by minimizing abrupt acceleration and deceleration, leading to a smoother and more comfortable ride for passengers. These results demonstrate the potential for green wave control to revolutionize urban transportation by enhancing the performance of electric vehicles and contributing to a more sustainable and efficient mobility ecosystem.
Keywords: electric vehicles, energy efficiency, green wave control, model predictive control, neural networks
Procedia PDF Downloads 54
2378 The Extent of Virgin Olive-Oil Prices' Distribution Revealing the Behavior of Market Speculators
Authors: Fathi Abid, Bilel Kaffel
Abstract:
The olive tree, the olive harvest during the winter season, and the production of olive oil, better known by professionals as the crushing operation, have interested institutional traders such as olive-oil offices and private companies such as food-industry firms refining and extracting pomace olive oil, as well as public and private export-import companies specializing in olive oil. The major problem facing producers of olive oil each winter campaign is, contrary to what might be expected, not whether the harvest will be good or not but whether the sale price will allow them to cover production costs and achieve a reasonable profit margin. These questions are entirely legitimate given the importance of the issue and the heavy complexity of the uncertainty and competition, made tougher by high levels of indebtedness and by the experience and expertise of speculators and producers whose objectives are sometimes conflicting. The aim of this paper is to study the formation mechanism of olive oil prices in order to learn about speculators' behavior and expectations in the market, how they contribute through their industry knowledge and financial alliances, and the size of the financial challenge that may be involved for them in building private information sources globally to take advantage. The methodology used in this paper is based on two stages: in the first stage, we study econometrically the formation mechanisms of the olive oil price in order to understand market participant behavior by implementing ARMA, SARMA, GARCH and stochastic diffusion process models; the second stage is devoted to prediction purposes, where we use a combined wavelet-ANN approach. Our main findings indicate that olive oil market participants interact with each other in a way that promotes the formation of stylized facts. Unstable participant behavior creates the volatility clustering, non-linear dependence and cyclicity phenomena. By imitating each other in some periods of the campaign, different participants contribute to the fat tails observed in the olive oil price distribution. The best prediction model for the olive oil price is based on a back-propagation artificial neural network approach with input information based on wavelet decomposition and recent past history.
Keywords: olive oil price, stylized facts, ARMA model, SARMA model, GARCH model, combined wavelet-artificial neural network, continuous-time stochastic volatility model
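A hedged sketch of the econometric stage only, assuming a pandas Series of olive-oil prices; the ARMA and GARCH orders shown are illustrative and not necessarily those selected in the paper:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

def fit_price_models(prices: pd.Series):
    """Fit an ARMA model to log returns and a GARCH model to their volatility."""
    log_returns = np.log(prices).diff().dropna()
    # ARMA(1,1) on returns: an ARIMA with d = 0 is an ARMA model.
    arma_fit = ARIMA(log_returns, order=(1, 0, 1)).fit()
    # GARCH(1,1) on the same returns (scaled by 100) to capture volatility clustering.
    garch_fit = arch_model(log_returns * 100, vol="GARCH", p=1, q=1).fit(disp="off")
    return arma_fit, garch_fit

# arma, garch = fit_price_models(oil_prices)   # oil_prices: hypothetical price series
# print(arma.summary()); print(garch.summary())
```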
Procedia PDF Downloads 339
2377 Pattern Identification in Statistical Process Control Using Artificial Neural Networks
Authors: M. Pramila Devi, N. V. N. Indra Kiran
Abstract:
Control charts, predominantly in the form of the X-bar chart, are important tools in statistical process control (SPC). They are useful in determining whether a process is behaving as intended or whether there are some unnatural causes of variation. A process is out of control if a point falls outside the control limits or a series of points exhibits an unnatural pattern. In this paper, a study is carried out on four training algorithms for control chart pattern (CCP) recognition. For these algorithms, the optimal structure is identified; they are then studied for type I and type II errors and for generalization, both without and with early stopping, and the best one is proposed.
Keywords: control chart pattern recognition, neural network, backpropagation, generalization, early stopping
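As an illustration of the comparison described above, the sketch below trains a backpropagation network on synthetic control chart patterns with and without early stopping; the pattern generator and network size are assumptions, not the study's setup:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def synth_ccp(n=2000, length=30, seed=0):
    """Generate synthetic control chart patterns: normal, upward trend, upward shift."""
    rng = np.random.default_rng(seed)
    X, y = [], []
    for _ in range(n):
        label = rng.integers(0, 3)
        series = rng.normal(0, 1, length)
        if label == 1:
            series += 0.1 * np.arange(length)     # trend pattern
        elif label == 2:
            series[length // 2:] += 2.0           # shift pattern
        X.append(series)
        y.append(label)
    return np.array(X), np.array(y)

X, y = synth_ccp()
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for early in (False, True):
    clf = MLPClassifier(hidden_layer_sizes=(20,), early_stopping=early,
                        max_iter=500, random_state=0).fit(X_tr, y_tr)
    print("early stopping" if early else "no early stopping", clf.score(X_te, y_te))
```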
Procedia PDF Downloads 372
2376 Production of Bio-Composites from Cocoa Pod Husk for Use in Packaging Materials
Authors: L. Kanoksak, N. Sukanya, L. Napatsorn, T. Siriporn
Abstract:
A growing population and demand for packaging are driving up the usage of natural resources as raw materials in the pulp and paper industry. Long-term environmental effects are disrupting people's way of life all across the planet. Finding pulp sources to replace wood pulp is therefore necessary: various other potential plants or plant parts can be employed as substitute raw materials, and pulp and paper made from agricultural residues can be used in place of wood pulp. In this study, cocoa pod husks, an agricultural residue of the cocoa and chocolate industries, were used to develop composite materials to replace wood pulp in packaging materials, with the paper coated with polybutylene adipate-co-terephthalate (PBAT). Fresh cocoa pod husks were selected, cleaned, reduced in size and dried. The morphology and elemental composition of the cocoa pod husks were studied. To evaluate the mechanical and physical properties, the dried cocoa husks were pulped using the soda-pulping process. After selecting the best formulation, paper with a PBAT bioplastic coating was produced on a paper-forming machine, and its physical and mechanical properties were studied. Using the Field Emission Scanning Electron Microscope/Energy Dispersive X-Ray Spectrometer (FESEM/EDS) technique, the structure of the dried cocoa pod husks revealed their main components; no porous structure was observed, and the fibers were firmly bound, making them suitable as a raw material for pulp manufacturing. Dry cocoa pod husks contain the major elements carbon (C) and oxygen (O), while magnesium (Mg), potassium (K), and calcium (Ca) were minor elements found at very low levels. The cocoa pod husk pulp was then obtained from the soda-pulping process, and the SAQ5 formulation was found to perform best in terms of pulp yield, moisture content, and water drainage. To achieve the basis weight specified by the TAPPI T205 sp-02 standard, cocoa pod husk pulp and modified starch were mixed. The paper was coated with PBAT bioplastic produced from bioplastic resin using the blown film extrusion technique. Measurements of the contact angle, dispersion component and polar component showed that it is an effective hydrophobic material for rigid packaging applications.
Keywords: cocoa pod husks, agricultural residue, composite material, rigid packaging
Procedia PDF Downloads 76
2375 The Practice and Research of Computer-Aided Language Learning in China
Authors: Huang Yajing
Abstract:
Context: Computer-aided language learning (CALL) in China has undergone significant development over the past few decades, with distinct stages marking its evolution. This paper aims to provide a comprehensive review of the practice and research in this field in China, tracing its journey from the early stages of audio-visual education to the current multimedia network integration stage. Research Aim: The study aims to analyze the historical progression of CALL in China, identify key developments in the field, and provide recommendations for enhancing CALL practices in the future. Methodology: The research employs document analysis and literature review to synthesize existing knowledge on CALL in China, drawing on a range of sources to construct a detailed overview of the evolution of CALL practices and research in the country. Findings: The review highlights the significant advancements in CALL in China, showcasing the transition from traditional audio-visual educational approaches to the current integrated multimedia network stage. The study identifies key milestones, technological advancements, and theoretical influences that have shaped CALL practices in China. Theoretical Importance: The evolution of CALL in China reflects not only technological progress but also shifts in educational paradigms and theories. The study underscores the significance of cognitive psychology as a theoretical underpinning for CALL practices, emphasizing the learner's active role in the learning process. Data Collection and Analysis Procedures: Data collection involved extensive review and analysis of documents and literature related to CALL in China. The analysis was carried out systematically to identify trends, developments, and challenges in the field. Questions Addressed: The study addresses the historical development of CALL in China, the impact of technological advancements on teaching practices, the role of cognitive psychology in shaping CALL methodologies, and the future outlook for CALL in the country. Conclusion: The review provides a comprehensive overview of the evolution of CALL in China, highlighting key stages of development and emerging trends. The study concludes by offering recommendations to further enhance CALL practices in the Chinese context.
Keywords: English education, educational technology, computer-aided language teaching, applied linguistics
Procedia PDF Downloads 55
2374 ACOPIN: An ACO Algorithm with TSP Approach for Clustering Proteins in Protein Interaction Networks
Authors: Jamaludin Sallim, Rozlina Mohamed, Roslina Abdul Hamid
Abstract:
In this paper, we propose an Ant Colony Optimization (ACO) algorithm together with a Traveling Salesman Problem (TSP) approach to investigate the clustering problem in Protein Interaction Networks (PIN). We name this combination ACOPIN. The purpose of this work is two-fold: first, to test the efficacy of ACO in clustering PIN and, second, to propose a simple generalization of the ACO algorithm that might allow its application in clustering proteins in PIN. We split this paper into three main sections. First, we describe the PIN and the clustering of proteins in PIN. Second, we discuss the steps involved in each phase of the ACO algorithm. Finally, we present some results of the investigation of the clustering patterns.
Keywords: ant colony optimization algorithm, searching algorithm, protein functional module, protein interaction network
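A compact, hypothetical sketch of the underlying idea (ants building a TSP-style tour over a protein distance matrix, from which clusters could later be read off by cutting long edges), not the ACOPIN implementation itself; the parameters and random distances are assumptions:

```python
import numpy as np

def aco_tour(dist, n_ants=20, n_iter=100, alpha=1.0, beta=2.0, rho=0.5, seed=0):
    """Basic ant colony optimization for a TSP-style ordering of nodes."""
    rng = np.random.default_rng(seed)
    n = len(dist)
    tau = np.ones((n, n))                      # pheromone matrix
    eta = 1.0 / (dist + np.eye(n))             # heuristic visibility
    best_tour, best_len = None, np.inf
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            tour = [int(rng.integers(n))]
            while len(tour) < n:
                i = tour[-1]
                mask = np.ones(n, dtype=bool)
                mask[tour] = False
                w = (tau[i] ** alpha) * (eta[i] ** beta) * mask
                tour.append(int(rng.choice(n, p=w / w.sum())))
            length = sum(dist[tour[k], tour[k + 1]] for k in range(n - 1))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        tau *= (1 - rho)                       # pheromone evaporation
        for tour, length in tours:             # deposit pheromone on used edges
            for k in range(n - 1):
                tau[tour[k], tour[k + 1]] += 1.0 / length
    return best_tour, best_len

dist = np.random.default_rng(1).random((15, 15))
dist = (dist + dist.T) / 2                     # symmetric synthetic "protein distances"
print(aco_tour(dist))
```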
Procedia PDF Downloads 611
2373 Prediction of Sepsis Illness from Patients Vital Signs Using Long Short-Term Memory Network and Dynamic Analysis
Authors: Marcio Freire Cruz, Naoaki Ono, Shigehiko Kanaya, Carlos Arthur Mattos Teixeira Cavalcante
Abstract:
The systems that record patient care information, known as Electronic Medical Records (EMR), and those that monitor the vital signs of patients, such as heart rate, body temperature, and blood pressure, have been extremely valuable for the effectiveness of patient treatment. Several research efforts have used data from EMRs and the vital signs of patients to predict illnesses. Among them, we highlight those that intend to predict, classify, or at least identify patterns of sepsis in patients under vital signs monitoring. Sepsis is an organic dysfunction caused by a dysregulated patient response to an infection that affects millions of people worldwide. Early detection of sepsis is expected to provide a significant improvement in its treatment. Preceding works usually combined medical, statistical, mathematical and computational models to develop detection methods for early prediction, achieving higher accuracies with the smallest number of variables. Among other techniques, we can find studies using survival analysis, expert systems, machine learning and deep learning that reached great results. In our research, patients are modeled as points moving each hour in an n-dimensional space, where n is the number of vital signs (variables). These points can reach a sepsis target point after some time. For now, the sepsis target point is calculated using the median of all patients' variables at sepsis onset. From these points, we calculate for each hour the position vector, the first derivative (velocity vector) and the second derivative (acceleration vector) of the variables to evaluate their behavior, and we construct a prediction model based on a Long Short-Term Memory (LSTM) network, including these derivatives as explanatory variables. The accuracy of the prediction 6 hours before the time of sepsis, considering only the vital signs, reached 83.24%; by including the position, velocity, and acceleration vectors, we obtained 94.96%. The data are being collected from the Medical Information Mart for Intensive Care (MIMIC) database, a public database that contains vital signs, laboratory test results, observations, notes, and so on, from more than 60,000 patients.
Keywords: dynamic analysis, long short-term memory, prediction, sepsis
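A hedged sketch of the feature construction described above (position, velocity and acceleration of hourly vital signs fed to an LSTM); the column names, window length and network size are assumptions and do not reflect the MIMIC schema:

```python
import numpy as np
import pandas as pd
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

VITALS = ["heart_rate", "temperature", "sbp", "dbp", "resp_rate"]  # hypothetical columns

def dynamic_features(df: pd.DataFrame) -> np.ndarray:
    """Stack position, first derivative (velocity) and second derivative (acceleration)."""
    pos = df[VITALS].to_numpy(dtype=float)
    vel = np.gradient(pos, axis=0)          # first derivative per hour
    acc = np.gradient(vel, axis=0)          # second derivative per hour
    return np.concatenate([pos, vel, acc], axis=1)

def build_lstm(window: int = 12, n_features: int = 3 * len(VITALS)):
    model = Sequential([LSTM(64, input_shape=(window, n_features)),
                        Dense(1, activation="sigmoid")])   # sepsis within 6 h: yes/no
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# feats = dynamic_features(patient_df)      # shape: (hours, 15)
# model = build_lstm(); model.fit(windows, labels, epochs=10)
```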
Procedia PDF Downloads 125
2372 A Long Tail Study of eWOM Communities
Authors: M. Olmedilla, M. R. Martinez-Torres, S. L. Toral
Abstract:
Electronic Word-Of-Mouth (eWOM) communities today represent an important source of information on which more and more customers base their purchasing decisions. They include thousands of reviews concerning very different products and services, posted by many individuals geographically distributed all over the world. Due to their massive audience, eWOM communities can help users find the product they are looking for even if it is less popular or rare. This is known as the long tail effect, which leads to a larger number of lower-selling niche products. This paper analyzes the long tail effect in a well-known eWOM community and defines a tool for finding niche products unavailable through conventional channels.
Keywords: eWOM, online user reviews, long tail theory, product categorization, social network analysis
Procedia PDF Downloads 421
2371 Application of Deep Learning and Ensemble Methods for Biomarker Discovery in Diabetic Nephropathy through Fibrosis and Propionate Metabolism Pathways
Authors: Oluwafunmibi Omotayo Fasanya, Augustine Kena Adjei
Abstract:
Diabetic nephropathy (DN) is a major complication of diabetes, with fibrosis and propionate metabolism playing critical roles in its progression. Identifying biomarkers linked to these pathways may provide novel insights into DN diagnosis and treatment. This study aims to identify biomarkers associated with fibrosis and propionate metabolism in DN, analyze the biological pathways and regulatory mechanisms of these biomarkers, and develop a machine learning model to predict DN-related biomarkers and validate their functional roles. Publicly available transcriptome datasets related to DN (GSE96804 and GSE104948) were obtained from the GEO database (https://www.ncbi.nlm.nih.gov/gds), and 924 propionate metabolism-related genes (PMRGs) and 656 fibrosis-related genes (FRGs) were identified. The analysis began with the extraction of DN-differentially expressed genes (DN-DEGs) and propionate metabolism-related DEGs (PM-DEGs), followed by the intersection of these with fibrosis-related genes to identify key intersected genes. Instead of relying on traditional models, we employed a combination of deep neural networks (DNNs) and ensemble methods such as Gradient Boosting Machines (GBM) and XGBoost to enhance feature selection and biomarker discovery. Recursive feature elimination (RFE) was coupled with these advanced algorithms to refine the selection of the most critical biomarkers. Functional validation was conducted using convolutional neural networks (CNNs) for gene set enrichment and immune infiltration analysis, revealing seven significant biomarkers: SLC37A4, ACOX2, GPD1, ACE2, SLC9A3, AGT, and PLG. These biomarkers are involved in critical biological processes such as fatty acid metabolism and glomerular development, providing a mechanistic link to DN progression. Furthermore, a TF–miRNA–mRNA regulatory network was constructed using natural language processing models to identify 8 transcription factors and 60 miRNAs that regulate these biomarkers, while a drug–gene interaction network revealed potential therapeutic targets such as UROKINASE–PLG and ATENOLOL–AGT. This integrative approach, leveraging deep learning and ensemble models, not only enhances the accuracy of biomarker discovery but also offers new perspectives on DN diagnosis and treatment, specifically targeting the fibrosis and propionate metabolism pathways.
Keywords: diabetic nephropathy, deep neural networks, gradient boosting machines (GBM), XGBoost
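For illustration, a hedged sketch of the feature-selection step only (recursive feature elimination wrapped around an XGBoost classifier); the synthetic expression matrix, labels and number of selected genes are assumptions, not the GEO data:

```python
import numpy as np
from sklearn.feature_selection import RFE
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 500))            # samples x candidate genes (synthetic)
y = rng.integers(0, 2, size=120)           # DN vs. control labels (synthetic)
gene_names = np.array([f"gene_{i}" for i in range(X.shape[1])])

# RFE repeatedly fits the booster and drops the least important genes each round.
selector = RFE(estimator=XGBClassifier(n_estimators=100, eval_metric="logloss"),
               n_features_to_select=7, step=0.1)
selector.fit(X, y)
print("candidate biomarkers:", gene_names[selector.support_])
```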
Procedia PDF Downloads 8
2370 Harmony Search-Based K-Coverage Enhancement in Wireless Sensor Networks
Authors: Shaimaa M. Mohamed, Haitham S. Hamza, Imane A. Saroit
Abstract:
Many wireless sensor network applications require K-coverage of the monitored area. In this paper, we propose a harmony search based algorithm that is scalable in terms of execution time, the K-Coverage Enhancement Algorithm (KCEA); it attempts to enhance the initial coverage and to achieve the required K-coverage degree for a specific application efficiently. Simulation results show that the proposed algorithm achieves a coverage improvement of 5.34%, compared to K-Coverage Rate Deployment (K-CRD), which achieves 1.31%, when deploying one additional sensor. Moreover, the proposed algorithm is more time efficient.
Keywords: Wireless Sensor Networks (WSN), harmony search algorithms, K-Coverage, Mobile WSN
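A generic harmony search sketch (not KCEA itself), searching for positions of a few additional sensors that reduce the number of grid points below the required coverage degree K; the field size, sensing radius and harmony search parameters are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
AREA, R, K, EXTRA = 100.0, 15.0, 2, 3                   # field size, radius, degree, new sensors
existing = rng.random((20, 2)) * AREA                   # already-deployed sensors
grid = np.stack(np.meshgrid(np.arange(0, AREA, 5.0),
                            np.arange(0, AREA, 5.0)), -1).reshape(-1, 2)

def uncovered(new_xy):
    """Count grid points whose coverage degree is below K."""
    sensors = np.vstack([existing, new_xy.reshape(-1, 2)])
    d = np.linalg.norm(grid[:, None, :] - sensors[None, :, :], axis=2)
    return int(np.sum((d <= R).sum(axis=1) < K))

def harmony_search(hms=10, iters=300, hmcr=0.9, par=0.3, bw=5.0):
    dim = 2 * EXTRA
    memory = rng.random((hms, dim)) * AREA               # harmony memory of candidate placements
    costs = np.array([uncovered(h) for h in memory])
    for _ in range(iters):
        pick = memory[rng.integers(hms, size=dim), np.arange(dim)]   # memory consideration
        new = np.where(rng.random(dim) < hmcr, pick, rng.random(dim) * AREA)
        adjust = rng.random(dim) < par                   # pitch adjustment
        new = np.clip(new + adjust * rng.normal(0.0, bw, dim), 0.0, AREA)
        cost = uncovered(new)
        worst = costs.argmax()
        if cost < costs[worst]:
            memory[worst], costs[worst] = new, cost
    best = costs.argmin()
    return memory[best].reshape(-1, 2), int(costs[best])

positions, remaining = harmony_search()
print(positions, remaining)
```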
Procedia PDF Downloads 526
2369 Modeling Spatio-Temporal Variation in Rainfall Using a Hierarchical Bayesian Regression Model
Authors: Sabyasachi Mukhopadhyay, Joseph Ogutu, Gundula Bartzke, Hans-Peter Piepho
Abstract:
Rainfall is a critical component of climate governing vegetation growth and production, forage availability and quality for herbivores. However, reliable rainfall measurements are not always available, making it necessary to predict rainfall values for particular locations through time. Predicting rainfall in space and time can be a complex and challenging task, especially where the rain gauge network is sparse and measurements are not recorded consistently for all rain gauges, leading to many missing values. Here, we develop a flexible Bayesian model for predicting rainfall in space and time and apply it to Narok County, situated in southwestern Kenya, using data collected at 23 rain gauges from 1965 to 2015. Narok County encompasses the Maasai Mara ecosystem, the northern-most section of the Mara-Serengeti ecosystem, famous for its diverse and abundant large mammal populations and spectacular migration of enormous herds of wildebeest, zebra and Thomson's gazelle. The model incorporates geographical and meteorological predictor variables, including elevation, distance to Lake Victoria and minimum temperature. We assess the efficiency of the model by comparing it empirically with the established Gaussian process, Kriging, simple linear and Bayesian linear models. We use the model to predict total monthly rainfall and its standard error for all 5 × 5 km grid cells in Narok County. Using the Monte Carlo integration method, we estimate seasonal and annual rainfall and their standard errors for 29 sub-regions in Narok. Finally, we use the predicted rainfall to predict large herbivore biomass in the Maasai Mara ecosystem on a 5 × 5 km grid for both the wet and dry seasons. We show that herbivore biomass increases with rainfall in both seasons. The model can handle data from a sparse network of observations with many missing values and performs at least as well as or better than four established and widely used models, on the Narok data set. The model produces rainfall predictions consistent with expectation and in good agreement with the blended station and satellite rainfall values. The predictions are precise enough for most practical purposes. The model is very general and applicable to other variables besides rainfall.
Keywords: non-stationary covariance function, gaussian process, ungulate biomass, MCMC, maasai mara ecosystem
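As a much-simplified stand-in for the hierarchical spatio-temporal model, the sketch below fits a Bayesian linear regression of monthly rainfall on the predictors named above and returns predictive standard errors; the synthetic data, coefficients and column names are assumptions:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "elevation_m": rng.uniform(1500, 2500, 300),
    "dist_lake_victoria_km": rng.uniform(50, 300, 300),
    "min_temp_c": rng.uniform(8, 18, 300),
})
# Synthetic rainfall generated from the predictors plus noise (illustration only).
df["rain_mm"] = (0.05 * df.elevation_m - 0.3 * df.dist_lake_victoria_km
                 + 4 * df.min_temp_c + rng.normal(0, 20, 300))

X, y = df.drop(columns="rain_mm").to_numpy(), df["rain_mm"].to_numpy()
model = BayesianRidge().fit(X, y)
pred_mean, pred_std = model.predict(X[:5], return_std=True)   # rainfall and its standard error
print(np.round(pred_mean, 1), np.round(pred_std, 1))
```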
Procedia PDF Downloads 294
2368 Phone Number Spoofing Attack in VoLTE 4G
Authors: Joo-Hyung Oh
Abstract:
The number of users of 4G VoLTE (Voice over LTE) services using LTE data networks is rapidly growing. VoLTE, based on an all-IP network, enables clearer and higher-quality voice calls than 3G. It does, however, pose new challenges; a voice call carried over IP networks is vulnerable to security threats such as wiretapping and forged or falsified information. In particular, stealing other users' phone numbers and forging or falsifying call request messages for outgoing voice calls within VoLTE result in considerable losses, including user billing and voice phishing of acquaintances. This paper focuses on the threat of caller phone number spoofing in VoLTE and on countermeasure technologies as safety measures for mobile communication networks.
Keywords: LTE, 4G, VoLTE, phone number spoofing
Procedia PDF Downloads 432
2367 Using a GIS-Based Method for Green Infrastructure Accessibility of Different Socio-Economic Groups in Auckland, New Zealand
Authors: Jing Ma, Xindong An
Abstract:
Green infrastructure, the most important aspect of improving the quality of life, has been a crucial element of liveability measurement. With an increasing population in city areas demanding a more liveable urban environment, access to green infrastructure within walking distance should be taken into consideration. This article presents a study on the accessibility measurement of green infrastructure in central Auckland (New Zealand), using a GIS-based network analysis tool to verify the accessibility levels of green infrastructure. It analyses the overall situation of green infrastructure and draws some conclusions on the city's different levels of accessibility according to the categories and distribution of facilities, which provides valuable references and guidance for future facility improvement in planning strategies.
Keywords: quality of life, green infrastructure, GIS, accessibility
Procedia PDF Downloads 282
2366 Development of a Mathematical Model to Characterize the Oil Production in the Federal Republic of Nigeria Environment
Authors: Paul C. Njoku, Archana Swati Njoku
Abstract:
The study deals with the development of a mathematical model to characterize oil production in Nigeria. The model captures the dynamics of oil production in millions of barrels, the revenue and planned cost of oil production in millions of naira, and the unit cost of production from 1974 to 1982, in the context of the Federal Republic of Nigeria. The country exports oil to other countries as well as importing specialized crude. The transport network over origin/destination pairs (tij) is taken into account; simulation runs and optimization have been considered in this study.
Keywords: mathematical oil model development dynamics, Nigeria, characterization barrels, dynamics of oil production
Procedia PDF Downloads 387
2365 Green Ports: Innovation Adopters or Innovation Developers
Authors: Marco Ferretti, Marcello Risitano, Maria Cristina Pietronudo, Lina Ozturk
Abstract:
A green port is the result of a sustainable long-term strategy adopted by an entire port infrastructure, and therefore by the set of actors involved in port activities. The strategy aims to realise the development of sustainable port infrastructure focused on the reduction of negative environmental impacts without jeopardising economic growth. Green technologies represent the core tools for implementing sustainable solutions; however, they are not a magic bullet. Ports have always been integrated into the local territory, affecting the environment in which they operate; therefore, the sustainable strategy should fit with the entire local system. Adopting a sustainable strategy thus means knowing how to involve and engage a wide stakeholder network (industries, production, markets, citizens, and public authorities). The existing research on the topic has not integrated this perspective well with that of sustainability: research on green ports has mixed sustainability aspects with those of the maritime industry, neglecting the dynamics that lead to the development of the green port phenomenon. We propose an analysis of green ports adopting the lens of ecosystem studies in the field of management. The ecosystem approach provides a way to model the relations that enable green solutions and green practices in a port ecosystem. However, due to the local dimension of a port and the port trend towards innovation, i.e., sustainable innovation, we draw on a specific concept of ecosystem: local innovation systems. More precisely, we explore whether a green port is a local innovation system engaged in developing sustainable innovation with a large impact on the territory, or merely an innovation adopter. To address this issue, we adopt a comparative case study, selecting two innovative ports in Europe: Rotterdam and Genova. The case study is a research method focused on understanding the dynamics of a specific situation and can be used to provide a description of real circumstances. Preliminary results show two different approaches to supporting sustainable innovation: one represented by Rotterdam, a pioneer in competitiveness and sustainability, and the second represented by Genoa, an example of a technology adopter. The paper intends to provide a better understanding of how sustainable innovations are developed and of the manner in which a network of port and local stakeholders supports this process. Furthermore, it proposes a taxonomy of green ports as developers and adopters of sustainable innovation, also suggesting best practices for modelling the relationships that enable the port ecosystem to apply a sustainable strategy.
Keywords: green port, innovation, sustainability, local innovation systems
Procedia PDF Downloads 120
2364 Deep Learning Based on Image Decomposition for Restoration of Intrinsic Representation
Authors: Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Kensuke Nakamura, Dongeun Choi, Byung-Woo Hong
Abstract:
Artefacts are commonly encountered in the imaging process of clinical computed tomography (CT), where the artefact refers to any systematic discrepancy between the reconstructed observation and the true attenuation coefficient of the object. It is known that CT images are inherently more prone to artefacts due to their image formation process, in which a large number of independent detectors are involved and assumed to yield consistent measurements. There are a number of different artefact types, including noise, beam hardening, scatter, pseudo-enhancement, motion, helical, ring, and metal artefacts, which cause serious difficulties in reading images. Thus, it is desirable to remove nuisance factors from the degraded image, leaving the fundamental intrinsic information that can provide a better interpretation of the anatomical and pathological characteristics. However, this is considered a difficult task due to the high dimensionality and variability of the data to be recovered, which naturally motivates the use of machine learning techniques. We propose an image restoration algorithm based on a deep neural network framework in which denoising auto-encoders are stacked to build multiple layers. The denoising auto-encoder is a variant of the classical auto-encoder that takes input data and maps it to a hidden representation through a deterministic mapping using a non-linear activation function. The latent representation is then mapped back into a reconstruction whose size is the same as that of the input data. The reconstruction error can be measured by the traditional squared error, assuming the residual follows a normal distribution. In addition to the designed loss function, an effective regularization scheme is applied using residual-driven dropout determined from the gradient at each layer. The optimal weights are computed by the classical stochastic gradient descent algorithm combined with the back-propagation algorithm. In our algorithm, we initially decompose an input image into its intrinsic representation and the nuisance factors, including artefacts, based on the classical Total Variation problem, which can be efficiently optimized by a convex optimization algorithm such as the primal-dual method. The intrinsic forms of the input images are provided to the deep denoising auto-encoders together with their original forms in the training phase. In the testing phase, a given image is first decomposed into the intrinsic form and then provided to the trained network to obtain its reconstruction. We apply our algorithm to the restoration of CT images corrupted by artefacts. It is shown that our algorithm improves the readability and enhances the anatomical and pathological properties of the object. The quantitative evaluation is performed in terms of the PSNR, and the qualitative evaluation shows significant improvement in reading images despite degrading artefacts. The experimental results indicate the potential of our algorithm as a prior solution to image interpretation tasks in a variety of medical imaging applications. This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by the IITP (Institute for Information and Communications Technology Promotion).
Keywords: auto-encoder neural network, CT image artefact, deep learning, intrinsic image representation, noise reduction, total variation
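A minimal sketch of a single denoising auto-encoder block of the kind stacked in this framework, written here with Keras for brevity; the layer sizes, noise level and flattened 64x64 patch size are assumptions:

```python
import numpy as np
from tensorflow.keras import layers, models

D, H = 64 * 64, 512                        # flattened patch size, hidden units (assumed)

inputs = layers.Input(shape=(D,))
hidden = layers.Dense(H, activation="relu")(inputs)        # deterministic encoding
outputs = layers.Dense(D, activation="sigmoid")(hidden)    # reconstruction of the input size
dae = models.Model(inputs, outputs)
dae.compile(optimizer="adam", loss="mse")                  # squared reconstruction error

# Denoising criterion: corrupted input mapped back to the clean target.
clean = np.random.rand(256, D).astype("float32")           # stand-in for intrinsic images
noisy = np.clip(clean + 0.1 * np.random.randn(256, D), 0, 1).astype("float32")
dae.fit(noisy, clean, epochs=2, batch_size=32, verbose=0)
```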
Procedia PDF Downloads 190
2363 A Corpus-Based Analysis of Japanese Learners' English Modal Auxiliary Verb Usage in Writing
Authors: S. Nakayama
Abstract:
For non-native English speakers, using English modal auxiliary verbs appropriately can be among the most challenging tasks. This research sought to identify differences in modal verb usage between Japanese non-native English speakers (JNNSs) and native speakers (NSs) from two different perspectives: frequency of use and the distribution of verb phrase structures (VPSs) in which modal verbs occur. This study can contribute to the identification of JNNSs' interlanguage with regard to modal verbs; the main aim is to make suggestions for the improvement of teaching materials as well as to help language teachers teach modal verbs in a way that is helpful for learners. To address the primary question in this study, usage of nine central modals (‘can’, ‘could’, ‘may’, ‘might’, ‘shall’, ‘should’, ‘will’, ‘would’, and ‘must’) by JNNSs was compared with that by NSs in the International Corpus Network of Asian Learners of English (ICNALE). This corpus is one of the largest freely available corpora focusing on Asian English learners’ language use. The ICNALE corpus consists of four modules: ‘Spoken Monologue’, ‘Spoken Dialogue’, ‘Written Essays’, and ‘Edited Essays’. Among these, this research adopted the ‘Written Essays’ module only, which is a set of 200-300-word essays containing approximately 1.3 million words in total. Frequency analysis revealed gaps as well as similarities in frequency order. Specifically, both JNNSs and NSs used ‘can’ most frequently, followed by ‘should’ and ‘will’; however, usage of all the other modals except for ‘shall’ was not identical between the two groups. A log-likelihood test uncovered JNNSs’ overuse of ‘can’ and ‘must’ as well as their underuse of ‘will’ and ‘would’. VPS analysis revealed that JNNSs used modal verbs in a relatively narrow range of VPSs compared to NSs. Results showed that JNNSs used most of the modals with bare infinitives or the passive voice only, whereas NSs used the modals in a wide range of VPSs, including the progressive construction and the perfect aspect, both of which were structures where JNNSs rarely used the modals. Results of the frequency analysis suggest that language teachers or teaching materials should explain other modality items so that learners can avoid relying heavily on certain modals and have a wide range of lexical items to reflect their feelings more accurately. Besides, the underused modals should be stressed more in the classroom because they are epistemic modals, which allow us not only to interject our views into propositions but also to build a relationship with readers. As for VPSs, teaching materials should present more examples of the modals occurring in a wide range of VPSs to help learners express their opinions from a variety of viewpoints.
Keywords: corpus linguistics, Japanese learners of English, modal auxiliary verbs, International Corpus Network of Asian Learners of English
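For reference, a small sketch of the log-likelihood statistic commonly used to compare word frequencies across two corpora, of the kind of test reported above; the modal counts and corpus sizes below are made up for illustration:

```python
import math

def log_likelihood(freq_learner, size_learner, freq_native, size_native):
    """Corpus-comparison log-likelihood for one word's frequencies in two corpora."""
    total_freq = freq_learner + freq_native
    total_size = size_learner + size_native
    expected_l = size_learner * total_freq / total_size
    expected_n = size_native * total_freq / total_size
    ll = 0.0
    for observed, expected in ((freq_learner, expected_l), (freq_native, expected_n)):
        if observed > 0:
            ll += observed * math.log(observed / expected)
    return 2 * ll        # values above about 3.84 correspond to p < 0.05 (1 d.f.)

# Hypothetical counts: 'can' used 900 times in a 200k-word learner corpus
# versus 700 times in a 250k-word native corpus.
print(round(log_likelihood(900, 200_000, 700, 250_000), 2))
```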
Procedia PDF Downloads 127
2362 Seismic Fragility Curves Methodologies for Bridges: A Review
Authors: Amirmozafar Benshams, Khatere Kashmari, Farzad Hatami, Mesbah Saybani
Abstract:
As a part of the transportation network, bridges are among the most vulnerable structures. In order to investigate the vulnerability and evaluate the seismic performance of bridges, identifying the damage states associated with bridges is important. Fragility curves provide important data about the damage states and performance of bridges against earthquakes. The development of vulnerability information in the form of fragility curves is a widely practiced approach when the information has to be developed accounting for the multitude of uncertain sources involved. This paper presents fragility curve methodologies for bridges and investigates the practice and applications relating to the seismic fragility assessment of bridges.
Keywords: fragility curve, bridge, uncertainty, NLTHA, IDA
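As a worked illustration (not taken from the paper), the standard lognormal fragility function gives the probability of reaching a damage state for a given ground-motion intensity measure; the median and dispersion values are assumptions:

```python
import numpy as np
from scipy.stats import norm

def fragility(im, theta, beta):
    """P(damage state >= ds | IM = im) for a lognormal fragility curve:
    Phi(ln(im / theta) / beta), with median theta and dispersion beta."""
    return norm.cdf(np.log(np.asarray(im) / theta) / beta)

pga = np.array([0.1, 0.2, 0.4, 0.8])                 # peak ground acceleration (g)
print(np.round(fragility(pga, theta=0.35, beta=0.5), 3))
```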
Procedia PDF Downloads 282
2361 Practical Techniques of Improving State Estimator Solution
Authors: Kiamran Radjabli
Abstract:
The State Estimator has become an intrinsic part of Energy Management Systems (EMS). The SCADA measurements received from the field are processed by the State Estimator in order to accurately determine the actual operating state of the power system and provide that information to other real-time network applications. All EMS vendors offer State Estimator functionality in their baseline products. However, setting up and ensuring that the State Estimator consistently produces a reliable solution often consumes a substantial engineering effort. This paper provides generic recommendations and describes a simple practical approach to efficient tuning of the State Estimator, based on working experience with major EMS software platforms and consulting projects in many electrical utilities of the USA.
Keywords: convergence, monitoring, state estimator, performance, troubleshooting, tuning, power systems
Procedia PDF Downloads 156
2360 National Image in the Age of Mass Self-Communication: An Analysis of Internet Users' Perception of Portugal
Authors: L. Godinho, N. Teixeira
Abstract:
Nowadays, the massification of Internet access represents one of the major challenges to the traditional powers of the State, among which is the power to control its external image. The virtual world has also sparked the interest of the social sciences, which consider it a new field of study, an immense open text where sense is expressed. In this paper, that immense text has been accessed in order to understand the perception that Internet users from all over the world have of Portugal. Ours is a quantitative and qualitative approach, as we have resorted to buzz, thematic and category analysis. The results confirm the predominance of the sea stereotype in others' vision of the Portuguese people, and show that the national image has adapted to network communication through processes of individuation and paganization.
Keywords: national image, internet, self-communication, perception
Procedia PDF Downloads 256
2359 Long Term Survival after a First Transient Ischemic Attack in England: A Case-Control Study
Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski
Abstract:
Transient ischaemic attacks (TIAs) are warning signs for future strokes. TIA patients are at increased risk of stroke and cardiovascular events after a first episode. The majority of studies on TIA have focused on the occurrence of these ancillary events after a TIA; long-term mortality after TIA has received only limited attention. We undertook this study to determine the long-term hazards of all-cause mortality following a first episode of TIA, using anonymised electronic health records (EHRs). We conducted a retrospective case-control study using electronic primary health care records from The Health Improvement Network (THIN) database. Patients born in or prior to 1960, resident in England, with a first diagnosis of TIA between January 1986 and January 2017, were matched to three controls on age, sex and general medical practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a time-varying Weibull-Cox survival model, which included both scale and shape effects and a random frailty effect of GP practice. 20,633 cases and 58,634 controls were included. Cases aged 39 to 60 years at the first TIA event had the highest hazard ratio (HR) of mortality compared to matched controls (HR = 3.04, 95% CI (2.91 - 3.18)). The HRs for cases aged 61-70 years, 71-76 years and 77+ years were 1.98 (1.55 - 2.30), 1.79 (1.20 - 2.07) and 1.52 (1.15 - 1.97) compared to matched controls. Aspirin provided long-term survival benefits to cases. Cases aged 39-60 years on aspirin had HRs of 0.93 (0.84 - 1.00), 0.90 (0.82 - 0.98) and 0.88 (0.80 - 0.96) at 5 years, 10 years and 15 years, respectively, compared to cases in the same age group who were not on antiplatelets. Similar beneficial effects of aspirin were observed in the other age groups. There were no significant survival benefits with other antiplatelet options, and no survival benefits of antiplatelet drugs were observed in controls. Our study highlights the excess long-term risk of death of TIA patients and cautions that TIA should not be treated as a benign condition. The study further recommends aspirin as the better option for secondary prevention in TIA patients, compared to the clopidogrel recommended by NICE guidelines. Management of risk factors and treatment strategies remain important challenges in reducing the burden of disease.
Keywords: dual antiplatelet therapy (DAPT), General Practice, Multiple Imputation, The Health Improvement Network (THIN), hazard ratio (HR), Weibull-Cox model
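A simplified sketch of the survival-modelling step using the lifelines library (a plain Cox proportional-hazards fit standing in for the time-varying Weibull-Cox frailty model described above); the column names and synthetic data are assumptions:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "time_years": rng.exponential(10, n),      # follow-up time
    "died": rng.integers(0, 2, n),             # event indicator
    "tia_case": rng.integers(0, 2, n),         # 1 = TIA case, 0 = matched control
    "aspirin": rng.integers(0, 2, n),
    "age_at_entry": rng.uniform(40, 90, n),
})

cph = CoxPHFitter().fit(df, duration_col="time_years", event_col="died")
print(cph.summary[["coef", "exp(coef)"]])      # exp(coef) is the hazard ratio
```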
Procedia PDF Downloads 149
2358 SAFECARE: Integrated Cyber-Physical Security Solution for Healthcare Critical Infrastructure
Authors: Francesco Lubrano, Fabrizio Bertone, Federico Stirano
Abstract:
Modern societies strongly depend on Critical Infrastructures (CI). Hospitals, power supplies, water supplies and telecommunications are just a few examples of CIs that provide vital functions to societies. CIs like hospitals are very complex environments, characterized by a huge number of cyber and physical systems that are becoming increasingly integrated. Ensuring a high level of security within such critical infrastructure requires a deep knowledge of vulnerabilities, threats, and potential attacks that may occur, as well as defence, prevention and mitigation strategies. The possibility to remotely monitor and control almost everything is pushing the adoption of network-connected devices. This implicitly introduces new threats and potential vulnerabilities, posing a risk especially to those devices connected to the Internet. Modern medical devices used in hospitals are not an exception and are more and more being connected to enhance their functionalities and ease their management. Moreover, hospitals are environments with high flows of people, who are difficult to monitor and can fairly easily gain access to the same places used by the staff, potentially causing damage. It is therefore clear that physical and cyber threats should be considered, analysed, and treated together as cyber-physical threats. This means that an integrated approach is required. SAFECARE, an integrated cyber-physical security solution, tries to respond to the presented issues within healthcare infrastructures. The challenge is to bring together the most advanced technologies from the physical and cyber security spheres, to achieve a global optimum for systemic security and for the management of combined cyber and physical threats and incidents and their interconnections. Moreover, potential impacts and cascading effects are evaluated through impact propagation models that rely on modular ontologies and a rule-based engine. Indeed, the SAFECARE architecture foresees i) a macroblock related to the cyber security field, where innovative tools are deployed to monitor network traffic, systems and medical devices; ii) a physical security macroblock, where video management systems are coupled with access control management, building management systems and innovative AI algorithms to detect behavior anomalies; and iii) an integration system that collects all the incoming incidents, simulating their potential cascading effects and providing alerts and updated information regarding asset availability.
Keywords: cyber security, defence strategies, impact propagation, integrated security, physical security
Procedia PDF Downloads 165
2357 The Outcome of Using Machine Learning in Medical Imaging
Authors: Adel Edwar Waheeb Louka
Abstract:
Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to better the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, X-rays have not been widely used to detect and diagnose COVID-19. The underuse of X-rays is mainly due to their low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field has suggested that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from the ones used for training. The images used to train the classification model include an important feature: the pictures are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture. This model is used to extract the lung mask from the chest X-ray image. The model is trained on 8577 images and validated on a validation split of 20%. These models are evaluated using the external dataset for validation, and their accuracy, precision, recall, f1-score, IOU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IOU of 0.928. Conclusion: The models proposed can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning
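A hedged sketch of a DenseNet201 transfer-learning classifier for the three classes described; the input size, frozen backbone and head layers are assumptions, not the authors' exact model:

```python
from tensorflow.keras.applications import DenseNet201
from tensorflow.keras import layers, models

base = DenseNet201(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False                                   # transfer learning: freeze the backbone

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(3, activation="softmax"),               # COVID-19, normal, pneumonia
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # train_ds/val_ds: hypothetical datasets
```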
Procedia PDF Downloads 73
2356 Security of Internet of Things: Challenges, Requirements and Future Directions
Authors: Amjad F. Alharbi, Bashayer A. Alotaibi, Fahd S. Alotaibi
Abstract:
The emergence of Internet of Things (IoT) technology provides capabilities for a huge number of smart devices, services and people to communicate with each other and exchange data and information over existing networks. As IoT progresses, it provides many opportunities for new ways of communication, but it also introduces many security and privacy threats and challenges which need to be considered for the future of IoT development. In this survey paper, IoT security issues, threats and current challenges are summarized. The security architecture for IoT is presented in terms of four main layers. Based on these layers, the IoT security requirements are presented to ensure security in the whole system. Furthermore, some research initiatives related to IoT security are discussed, and future directions for IoT security are highlighted.
Keywords: Internet of Things (IoT), IoT security challenges, IoT security requirements, IoT security architecture
Procedia PDF Downloads 374
2355 Design of Cloud Service Brokerage System Intermediating Integrated Services in Multiple Cloud Environment
Authors: Dongjae Kang, Sokho Son, Jinmee Kim
Abstract:
Cloud service brokering is a new service paradigm that provides interoperability and portability of applications across multiple Cloud providers. In this paper, we designed a cloud service brokerage system, 'any broker', supporting integrated service provisioning and SLA-based service life cycle management. For the system design, we introduce the system concept and overall architecture, details of the main components and use cases of the primary operations in the system. These features ease the concerns of Cloud service providers and customers, and support a new open Cloud service market that increases cloud service profit and promotes the Cloud service ecosystem in the cloud computing area.
Keywords: cloud service brokerage, multiple Clouds, Integrated service provisioning, SLA, network service
Procedia PDF Downloads 488