Search results for: data quality filtering
28577 Design of Digital IIR Filter Using Opposition Learning and Artificial Bee Colony Algorithm
Authors: J. S. Dhillon, K. K. Dhaliwal
Abstract:
In almost all digital filtering applications, digital infinite impulse response (IIR) filters are preferred over finite impulse response (FIR) filters because they provide much better performance, lower computational cost, and smaller memory requirements for similar magnitude specifications. However, the digital IIR filter design problem is generally multimodal with respect to the filter coefficients, and therefore reliable methods that can provide globally optimal solutions are required. The artificial bee colony (ABC) algorithm is one such recently introduced meta-heuristic optimization algorithm. In some cases, however, it searches the solution space insufficiently, resulting in a weak exchange of information, and hence fails to return better solutions. To overcome this deficiency, an opposition-based learning strategy is incorporated into ABC, and a modified version called the oppositional artificial bee colony (OABC) algorithm is proposed in this paper. Duplication of members is avoided during the run, which also augments the exploration ability. The developed algorithm is then applied to the design of optimal and stable digital IIR filter structures, where the design of low-pass (LP) and high-pass (HP) filters is carried out. Fuzzy theory is applied to maximize the satisfaction of the minimum-magnitude-error and stability constraints. To check the effectiveness of OABC, the results are compared with some well-established filter design techniques, and it is observed that in most cases OABC returns better or at least comparable results.
Keywords: digital infinite impulse response filter, artificial bee colony optimization, opposition based learning, digital filter design, multi-parameter optimization
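The opposition-based initialization at the heart of OABC can be sketched in a few lines. This is an illustrative sketch, not the authors' implementation: the real objective is the fuzzy magnitude-error/stability cost, for which a simple sphere function stands in here, and the `bounds` and `pop_size` values are hypothetical.

```python
import random

def opposition_init(objective, bounds, pop_size, rng):
    """Opposition-based initialization: draw a random population, form the
    opposite of each member (x_opp = low + high - x), and keep the pop_size
    fittest individuals from the combined pool."""
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    opp = [[lo + hi - x for (lo, hi), x in zip(bounds, ind)] for ind in pop]
    combined = pop + opp
    combined.sort(key=objective)   # ascending cost: best candidates first
    return combined[:pop_size]

# Hypothetical stand-in for the fuzzy magnitude-error cost of a filter design.
def sphere(x):
    return sum(v * v for v in x)

rng = random.Random(1)
bounds = [(-2.0, 2.0)] * 4     # e.g. four IIR filter coefficients
pop = opposition_init(sphere, bounds, pop_size=10, rng=rng)
```

The same opposition step can be reapplied during the run whenever the swarm stagnates, which is what augments exploration in OABC.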
Procedia PDF Downloads 484
28576 DeepNIC a Method to Transform Each Tabular Variable into an Independent Image Analyzable by Basic CNNs
Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.
Abstract:
Introduction: Deep Learning (DL) is a very powerful tool for analyzing image data, but for tabular data it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (Convolutional Neural Networks)? Will DL become the universal tool for data classification? All current solutions consist of repositioning the variables in a 2D matrix using their correlation proximity; in doing so, one obtains an image whose pixels are the variables. We implement a technology, DeepNIC, that instead produces an image for each variable, which can be analyzed by simple CNNs. Material and method: The 'ROP' (Regression OPtimized) model is a binary and atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision trees, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which departs from Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable in 3 dimensions: performance, complexity, and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on 2 hyperparameters used in the Neurops. By varying these 2 hyperparameters, we obtain a 2x2 matrix of probabilities for each NIC. We can combine these 10 NICs with the functions AND, OR, and XOR. The total number of combinations is greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels.
The intensity of the pixels is proportional to the probability of the associated NIC, and the color depends on the associated NIC. This image actually contains considerable information about the ability of the variable to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the public GSE22513 data (an omic data set of markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison over several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format. This opens up great perspectives in the analysis of metadata.
Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification
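The final step the abstract describes, turning NIC probabilities into grey-level pixels, can be sketched minimally; nothing is assumed here about how the NIC values themselves are computed, and the 2x2 grid of probabilities below is hypothetical.

```python
def probs_to_grey(matrix):
    """Map a matrix of probabilities in [0, 1] to 8-bit grey levels,
    with pixel intensity proportional to the probability."""
    for row in matrix:
        for p in row:
            if not 0.0 <= p <= 1.0:
                raise ValueError("probabilities must lie in [0, 1]")
    # Scale to 0-255 with round-half-up.
    return [[int(p * 255 + 0.5) for p in row] for row in matrix]

# Hypothetical 2x2 grid of NIC values obtained by varying two hyperparameters.
nic = [[0.0, 0.25],
       [0.5, 1.0]]
grey = probs_to_grey(nic)
```

Tiling such per-NIC grids (and their AND/OR/XOR combinations) is what would build up the large per-variable image described above.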
Procedia PDF Downloads 136
28575 Complex Shaped Prepreg Part Drapability Using Vacuum Bagging
Authors: Saran Toure
Abstract:
Complex shaped parts manufactured using out-of-autoclave prepreg vacuum bagging have a high-quality finish. This is due not only to the control of the resin-to-fibre ratio in prepregs, but also to a reduction in fibre misalignment, slippage, and stresses occurring within plies during compaction. In a bid to further reduce deformation modes and control failure modes, we carried out experiments in which we introduced wetted fabrics within a prepreg plybook during compaction. Presented here are the results obtained from the vacuum bagging of a complex shaped part. The shape is that of a turbine fan blade with smooth curves throughout, ending in sharp-edged angles. The quality of the final part made from this blade is compared to that of the same blade made by the standard vacuum bagging of prepregs, without introducing wetted fabrics.
Keywords: complex shaped part, prepregs, drapability, vacuum bagging
Procedia PDF Downloads 367
28574 EFL Teachers’ Sequential Self-Led Reflection and Possible Modifications in Their Classroom Management Practices
Authors: Sima Modirkhameneh, Mohammad Mohammadpanah
Abstract:
In the process of EFL teachers’ development, self-led reflection (SLR) is thought to play an imperative role because it may help teachers analyze, evaluate, and contemplate what is happening in their classes. Such contemplation can not only enhance the quality of their instruction and provide better learning environments for learners but also improve the quality of their classroom management (CM). Accordingly, understanding the effect of teachers’ SLR practices may yield valuable insights into what modifications SLR may bring about in all aspects of EFL teachers’ practices, especially their CM. The main purpose of this case study was, thus, to investigate the impact of the SLR practices of 12 Iranian EFL teachers on their CM based on the Universal Classroom Management Checklist (UCMC). In addition, another objective of the current study was to obtain a clear image of EFL teachers’ perceptions of their own SLR practices and their possible outcomes. By conducting repeated reflective interviews, observations, and feedback sessions with the participants over five teaching sessions, the researcher analyzed the outcomes qualitatively through meaning categorization and data interpretation based on the principles of Grounded Theory. The results demonstrated that EFL teachers utilized SLR practices to improve different aspects of their language teaching skills and CM in different contexts. Almost all participants had positive comments and reactions regarding the effect of SLR on their CM procedures in different respects (expectations and routines, behavior-specific praise, error corrections, prompts and precorrections, opportunity to respond, strengths and weaknesses of CM, teachers’ perception, CM ability, and the learning process).
Stated otherwise, the results implied that familiarity with the UCMC criteria and reflective practices contributes to modifying the teacher participants’ perceptions of their CM procedures and to incorporating reflective practices into their teaching styles. The results are thought to be valuable for teachers, teacher educators, and policymakers, who are recommended to pay special attention to the contributions as well as the complexity of reflective teaching. The study concludes with more detailed results, implications, and useful directions for future research.
Keywords: classroom management, EFL teachers, reflective practices, self-led reflection
Procedia PDF Downloads 61
28573 Optimal Pricing Based on Real Estate Demand Data
Authors: Vanessa Kummer, Maik Meusel
Abstract:
Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information—for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, it is often the case that only complete data is used. Usually, however, the proportion of complete data is rather small, which leads to most information being neglected. Also, the complete data might be strongly distorted. In addition, the reason that data is missing might itself contain information, which is ignored with that approach. An interesting issue is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with imputed missing values compared to using the usually small percentage of complete data (baseline), and how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Out of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all data in the model, the distortion of the first training set—the complete data—vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data.
After the optimal parameter set has been found for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on the imputed data sets do not differ significantly from each other; however, the demand estimate derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Also, demand estimates derived from the whole data set are much more accurate than the baseline estimation. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning
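The masking-based performance measurement described above can be sketched as follows. The column-mean imputer and the synthetic data are stand-ins for the clustering/PCA/neural-network imputers and the search-subscription data actually used in the study.

```python
import math
import random

def mask_and_score(data, frac, impute, rng):
    """Hide a random fraction of entries, impute them, and return the RMSE
    of the imputed values against the hidden originals."""
    cells = [(i, j) for i in range(len(data)) for j in range(len(data[0]))]
    hidden = rng.sample(cells, max(1, int(frac * len(cells))))
    masked = [row[:] for row in data]
    for i, j in hidden:
        masked[i][j] = None
    filled = impute(masked)
    err = [(filled[i][j] - data[i][j]) ** 2 for i, j in hidden]
    return math.sqrt(sum(err) / len(err))

def column_mean_impute(rows):
    """Baseline imputer: replace each missing entry with its column mean."""
    ncols = len(rows[0])
    means = []
    for j in range(ncols):
        vals = [r[j] for r in rows if r[j] is not None]
        means.append(sum(vals) / len(vals))
    return [[means[j] if r[j] is None else r[j] for j in range(ncols)]
            for r in rows]

rng = random.Random(0)
data = [[float(i + j) for j in range(5)] for i in range(20)]  # toy table
rmse = mask_and_score(data, frac=0.2, impute=column_mean_impute, rng=rng)
```

Repeating this for each candidate algorithm and parameter combination, and picking the lowest score, is the parameter-selection loop the abstract describes.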
Procedia PDF Downloads 293
28572 Unified Power Quality Conditioner Presentation and Dimensioning
Authors: Abderrahmane Kechich, Othmane Abdelkhalek
Abstract:
Static converters behave as nonlinear loads that inject harmonic currents into the grid and increase the consumption of reactive power. On the other hand, the increased use of sensitive equipment requires the supply of sinusoidal voltages. As a result, electrical power quality control has become a major concern in the field of power electronics. In this context, the unified power quality conditioner (UPQC) was developed. It combines both series and parallel structures: the series filter can protect sensitive loads and compensate for voltage disturbances such as voltage harmonics, voltage dips, or flicker, while the shunt filter compensates for current disturbances such as current harmonics, reactive currents, and imbalance. This dual capability makes it one of the most appropriate devices. Calculating the parameters is an important step and at the same time not an easy one; for that reason, several researchers have relied on the trial-and-error method, which is difficult for beginning researchers, especially for the controller parameters. This paper therefore gives a mathematical way to calculate almost all UPQC parameters without resorting to trial and error. It also gives a new approach for calculating the PI regulator parameters, with the purpose of obtaining a stable UPQC able to compensate for disturbances acting on the waveforms of the line voltage and load current, in order to improve electrical power quality.
Keywords: UPQC, shunt active filter, series active filter, PI controller, PWM control, dual-loop control
Procedia PDF Downloads 407
28571 Conceptual Framework of Continuous Academic Lecturer Model in Islamic Higher Education
Authors: Lailial Muhtifah, Sirtul Marhamah
Abstract:
This article puts forward the conceptual framework of a continuous academic lecturer model in Islamic higher education (IHE). It is intended to contribute to the broader issue of how the concept of excellence can promote adherence to standards in higher education and drive quality enhancement. The model describes a process and steps to gradually increase the performance and achievement of excellence of regular lecturers. Studies of this model are very significant for realizing an excellent academic culture in IHE. Several steps were identified from previous studies through a literature review and empirical findings. A qualitative study was conducted at an institute: administrators and lecturers were interviewed, and lecturers' learning communities were observed to explore the institute's culture, policies, and procedures. The original contribution of this study is called the Continuous Academic Lecturer Model (CALM), with its components, namely Standard, Quality, and Excellence (SQE), as the basis of the framework. The innovation excellence framework also requires leader support (LS) for lecturers to achieve a culture of excellence, so the model is named CALM-SQE+LS. The components of the CALM-SQE+LS model should be disseminated and cultivated among all lecturers to achieve university excellence in terms of innovation. The purpose of this article is to define the concept of 'CALM-SQE+LS': the three components of the Continuous Academic Lecturer Model, i.e., standard, quality, and excellence, plus leader support. This study is important to the community as a specific case that may inform educational leaders on mechanisms that may be leveraged to ensure the successful implementation of the policies and procedures outlined by CALM with its components (SQE+LS) in institutional culture and the professional leadership literature. The findings of this study show how the continuous academic lecturer model becomes part of a group's culture and how it benefits the university.
This article blends the available criteria into several sub-components to give new insights toward empowering lecturers for innovation excellence at IHE. The proposed conceptual framework is also presented.
Keywords: continuous academic lecturer model, excellence, quality, standard
Procedia PDF Downloads 206
28570 Microwave Dielectric Properties and Microstructures of Nd(Ti₀.₅W₀.₅)O₄ Ceramics for Application in Wireless Gas Sensors
Authors: Yih-Chien Chen, Yue-Xuan Du, Min-Zhe Weng
Abstract:
Carbon monoxide is a substance produced by incomplete combustion. It is toxic even at concentrations of less than 100 ppm, and since it is colorless and odorless, it is difficult to detect. CO sensors have been developed using a variety of physical mechanisms, including semiconductor oxides, solid electrolytes, and organic semiconductors. Many works have focused on semiconducting sensors composed of sensitive layers such as ZnO, TiO₂, and NiO with high sensitivity to gases. However, these sensors operate at high temperatures, which increases their power consumption. On the other hand, the dielectric resonator (DR) is attractive for gas detection due to its large surface area and sensitivity to the external environment. Materials that are to be employed in sensing devices must have a high quality factor. Numerous investigations of the fergusonite-type structure and related ceramic systems have been carried out, and extensive research into RENbO₄ ceramics has explored their potential application in resonators, filters, and antennas in modern communication systems operated at microwave frequencies. The Nd(Ti₀.₅W₀.₅)O₄ ceramics studied herein were prepared using the conventional mixed-oxide solid-state method. Dielectric constants (εᵣ) of 15.4-19.4 and quality factors (Q×f) of 3,600-11,100 GHz were obtained at sintering temperatures in the range 1425-1525°C for 4 h. The microwave dielectric properties of the Nd(Ti₀.₅W₀.₅)O₄ ceramics were found to vary with the sintering temperature. For a further understanding of these microwave dielectric properties, the ceramics were analyzed by densification measurements, X-ray diffraction (XRD), and microstructural observations.
Keywords: dielectric constant, dielectric resonators, sensors, quality factor
Procedia PDF Downloads 263
28569 Validating Quantitative Stormwater Simulations in Edmonton Using MIKE URBAN
Authors: Mohamed Gaafar, Evan Davies
Abstract:
Many municipalities within Canada and abroad use chloramination to disinfect drinking water so as to avert the production of the disinfection by-products (DBPs) that result from conventional chlorination processes and their consequent public health risks. However, the long-lasting monochloramine disinfectant (NH2Cl) can pose a significant risk to the environment, as it can be introduced into stormwater sewers from different water uses and thus into freshwater sources. Little research has been undertaken to monitor and characterize the decay of NH2Cl and to study the parameters affecting its decomposition in stormwater networks. Therefore, the current study was intended to investigate this decay, starting by building a stormwater model and validating its hydraulic and hydrologic computations, then modelling water quality in the storm sewers and examining the effects of different parameters on chloramine decay. The work presented here is only the first stage of this study. The 30th Avenue basin in southern Edmonton was chosen as a case study because this well-developed basin has various land-use types, including commercial, industrial, residential, parks, and recreational. The City of Edmonton had already built a MIKE URBAN stormwater model for flood modelling. Nevertheless, that model was built only to the trunk level, meaning that only the main drainage features were represented. Additionally, the model was not calibrated and was known to consistently compute pipe flows higher than the observed values, which is unsuitable for studying water quality. So the first goal was to complete and update all stormwater network components. Then, available GIS data was used to calculate catchment properties such as slope, length, and imperviousness.
In order to calibrate and validate this model, data from two temporary pipe-flow monitoring stations, collected during the last summer, were used along with records of two other permanent stations available for eight consecutive summer seasons. The effect of various hydrological parameters on the model results was investigated. It was found that the model results were affected by the ratio of impervious areas. The catchment length, being only an approximate representation of the catchment shape, was tested as well. Surface roughness coefficients were also calibrated. Consequently, computed flows at the two temporary locations had correlation coefficients of 0.846 and 0.815, where the lower value pertained to the larger attached catchment area. Other statistical measures, such as a peak error of 0.65%, a volume error of 5.6%, and maximum positive and negative differences of 2.17 and -1.63, respectively, were all found to be in acceptable ranges.
Keywords: stormwater, urban drainage, simulation, validation, MIKE URBAN
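The validation statistics quoted above (correlation coefficient, volume error) can be computed with a few lines; the hydrographs below are hypothetical examples, not the Edmonton monitoring data.

```python
import math

def correlation(obs, sim):
    """Pearson correlation coefficient between observed and simulated flows."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    ss = math.sqrt(sum((s - ms) ** 2 for s in sim))
    return cov / (so * ss)

def volume_error_pct(obs, sim):
    """Percent difference between simulated and observed flow volumes."""
    return 100.0 * (sum(sim) - sum(obs)) / sum(obs)

# Hypothetical hydrographs (m^3/s) at one monitoring station.
obs = [0.2, 0.8, 2.1, 1.6, 0.9, 0.4]
sim = [0.3, 0.9, 1.9, 1.7, 1.0, 0.5]
r = correlation(obs, sim)
ve = volume_error_pct(obs, sim)
```

Running these against each monitoring station gives exactly the kind of figures reported above (correlation around 0.8, volume error of a few percent).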
Procedia PDF Downloads 304
28568 Unlocking the Puzzle of Borrowing Adult Data for Designing Hybrid Pediatric Clinical Trials
Authors: Rajesh Kumar G
Abstract:
A challenging aspect of any clinical trial is to carefully plan the study design so that it meets the study objective in an optimal way and to validate the assumptions made during protocol design. When it is a pediatric study, there is the added challenge of stringent guidelines and the difficulty of recruiting the necessary subjects. Unlike adult trials, there is not much historical data available for pediatrics, which is required to validate assumptions when planning pediatric trials. Typically, pediatric studies are initiated as soon as approval is obtained for a drug to be marketed for adults, so with the historical information from the adult study, together with available pediatric pilot study data or simulated pediatric data, the pediatric study can be well planned. Generalizing a historical adult study to a new pediatric study is a tedious task; however, it is possible by integrating various statistical techniques and exploiting the advantages of a hybrid study design, which helps achieve the study objective smoothly even in the presence of many constraints. This research paper explains how a hybrid study design can be planned along with an integrated technique (SEV) to plan the pediatric study. In brief, the SEV technique (Simulation, Estimation using borrowed adult data and Bayesian methods, and Validation) incorporates simulating the planned study data and obtaining the desired estimates to validate the assumptions. This method of validation can be used to improve the accuracy of data analysis, ensuring that results are as valid and reliable as possible, which allows us to make informed decisions well ahead of study initiation. With professional precision, this technique, based on the collected data, allows insight into best practices when using data from a historical study and simulated data alike.
Keywords: adaptive design, simulation, borrowing data, Bayesian model
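A minimal sketch of Bayesian borrowing of adult data is given below, assuming a normal endpoint with known variance and a power-prior-style discount factor `a0`; the abstract does not specify the exact model used, and the data and parameter values are hypothetical.

```python
import math

def power_prior_posterior(adult, pediatric, sigma, a0):
    """Posterior mean/sd for a normal mean with known sigma, borrowing adult
    data downweighted by a power-prior parameter a0 in [0, 1]:
    a0 = 0 ignores the adult data, a0 = 1 pools it fully.
    A flat initial prior is assumed for simplicity."""
    w_adult = a0 * len(adult) / sigma ** 2        # discounted adult precision
    w_ped = len(pediatric) / sigma ** 2           # full pediatric precision
    adult_mean = sum(adult) / len(adult)
    ped_mean = sum(pediatric) / len(pediatric)
    post_mean = (w_adult * adult_mean + w_ped * ped_mean) / (w_adult + w_ped)
    post_sd = math.sqrt(1.0 / (w_adult + w_ped))
    return post_mean, post_sd

adult = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3]   # hypothetical adult endpoint data
pediatric = [9.0, 9.4, 9.2]                  # small pediatric sample
m_half, sd_half = power_prior_posterior(adult, pediatric, sigma=1.0, a0=0.5)
m_none, sd_none = power_prior_posterior(adult, pediatric, sigma=1.0, a0=0.0)
```

Wrapping this estimator inside a loop that simulates many pediatric trials under assumed truths is the Simulation-Estimation-Validation cycle the abstract describes: borrowing shrinks the posterior sd, at the cost of pulling the estimate toward the adult mean.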
Procedia PDF Downloads 81
28567 Simulation of the Collimator Plug Design for Prompt-Gamma Activation Analysis in the IEA-R1 Nuclear Reactor
Authors: Carlos G. Santos, Frederico A. Genezini, A. P. Dos Santos, H. Yorivaz, P. T. D. Siqueira
Abstract:
Prompt-Gamma Activation Analysis (PGAA) is a valuable technique for investigating the elemental composition of various samples. However, the installation of a PGAA system entails specific conditions, such as filtering the neutron beam according to the target and providing adequate shielding for both users and detectors. These requirements incur substantial costs, exceeding $100,000 including manpower. Nevertheless, a cost-effective approach involves leveraging an existing neutron beam facility to create a hybrid system integrating PGAA and Neutron Tomography (NT). The IEA-R1 nuclear reactor at IPEN/USP possesses an NT facility with suitable conditions for adapting and implementing a PGAA device. The NT facility offers a slightly colder thermal flux and provides shielding for user protection. The key additional requirement is designing detector shielding to mitigate the high gamma-ray background and safeguard the HPGe detector from neutron-induced damage. This study employs Monte Carlo simulations with the MCNP6 code to optimize the collimator plug for PGAA within the IEA-R1 NT facility. Three collimator models are proposed and simulated to assess their effectiveness in shielding the gamma and neutron radiation from nuclear fission. The aim is to achieve a focused prompt-gamma signal while shielding out ambient gamma radiation. The simulation results indicate that one of the proposed designs is particularly suitable for the PGAA-NT hybrid system.
Keywords: MCNP6.1, neutron, prompt-gamma ray, prompt-gamma activation analysis
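The paper's MCNP6 models cannot be reproduced here, but the underlying Monte Carlo idea, sampling photon free paths through a shielding slab, can be sketched in a toy form; the attenuation coefficient and thickness below are hypothetical, and scattering and buildup are ignored.

```python
import math
import random

def transmitted_fraction(mu, thickness, n, rng):
    """Toy Monte Carlo: sample exponential free paths (mean 1/mu) for n
    photons normally incident on a slab and count those whose first
    interaction lies beyond it (no scattering or buildup modelled)."""
    passed = 0
    for _ in range(n):
        # 1 - random() lies in (0, 1], avoiding log(0).
        path = -math.log(1.0 - rng.random()) / mu
        if path > thickness:
            passed += 1
    return passed / n

rng = random.Random(42)
mu = 0.5          # hypothetical linear attenuation coefficient, 1/cm
t = 5.0           # hypothetical plug thickness, cm
estimate = transmitted_fraction(mu, t, n=200_000, rng=rng)
analytic = math.exp(-mu * t)   # Beer-Lambert uncollided fraction
```

Codes like MCNP6 do the same path sampling with full physics (scattering, secondary particles, detailed geometry), which is what makes them suitable for comparing collimator designs.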
Procedia PDF Downloads 82
28566 Effect of E-Banking on Performance Efficiency of Commercial Banks in Pakistan
Authors: Naeem Hassan
Abstract:
The study intended to investigate the impact of the e-banking system on the performance efficiency of commercial banks in KP, Pakistan. In addition to this main purpose, the study also aimed at analyzing the impact of e-banking on service quality as well as the satisfaction of customers using the e-banking system. Moreover, focus was also given to highlighting the risks involved in the e-banking system. The researcher adopted a quantitative methodology. In order to reach concrete findings, the researcher analyzed secondary data taken from the annual reports of the selected banks and the State Bank of Pakistan, as well as primary data collected through a self-administered questionnaire from the participants selected for the current study. The study highlighted that there is a significant impact of e-banking on the financial efficiency of commercial banks in KP, Pakistan. Additionally, the results of the study also show that online banking has significant effects on customer satisfaction. On the basis of the findings, the researcher recommends that commercial banks continue to adopt new technologies that will improve their margins, and hence their net profit after tax, in order to attract more investors. Additionally, commercial banks need to minimize the time and risk in e-banking to attract more customers, which will improve their net profit. Furthermore, the findings also suggest that banking policy makers review policies related to the promotion of innovation adoption and the transfer of technology. The commercial banking system should encourage the adoption of innovations that will improve the profit of the banking industry.
Keywords: e-banking, performance efficiency, commercial banks, effect
Procedia PDF Downloads 7728565 Wind Turbine Control Performance Evaluation Based on Minimum-Variance Principles
Authors: Zheming Cao
Abstract:
Control loops are the most important components in the wind turbine system. Product quality, operational safety, and economic performance are directly or indirectly connected to the performance of the control systems. This paper proposes a performance evaluation method based on the minimum-variance principle for wind turbine control systems. The method can be applied to the PID controller of the pitch control system in a wind turbine. The good performance result demonstrated in the paper was achieved by retuning and optimizing the controller settings based on the evaluation result. The concepts presented in this paper are illustrated with actual data from an industrial wind farm.
Keywords: control performance, evaluation, minimum-variance, wind turbine
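The minimum-variance idea can be sketched as a Harris-type index, a standard formulation that the abstract does not spell out: the ratio of the best-achievable (minimum-variance) output variance to the measured output variance. The loop data and the benchmark variance below are synthetic assumptions.

```python
import random
import statistics

def harris_index(y, sigma2_mv):
    """Harris-type control performance index: ratio of the minimum-variance
    benchmark to the actual output variance (1 = optimal, near 0 = poor)."""
    return sigma2_mv / statistics.pvariance(y)

# Hypothetical loop output: unavoidable noise plus extra variation that a
# better-tuned controller could remove.  sigma2_mv is assumed known here;
# in practice it is estimated from closed-loop data and the process delay.
rng = random.Random(7)
noise = [rng.gauss(0.0, 1.0) for _ in range(5000)]
output = [n + rng.gauss(0.0, 1.0) for n in noise]   # controllable excess
eta = harris_index(output, sigma2_mv=1.0)
```

An index well below 1 (here about 0.5) flags a loop worth retuning, which is the diagnosis-then-retune workflow the abstract reports for the pitch controller.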
Procedia PDF Downloads 37728564 Genetic Algorithm Optimization of Microcantilever Based Resonator
Authors: Manjula Sutagundar, B. G. Sheeparamatti, D. S. Jangamshetti
Abstract:
Micro Electro Mechanical Systems (MEMS) resonators have shown the potential to replace quartz crystal technology for sensing and high-frequency signal processing applications because of inherent advantages like small size, high quality factor, low cost, and compatibility with integrated circuit chips. This paper presents the optimization, modelling, and simulation of a micro cantilever resonator. The objective of the work is to optimize the dimensions of a micro cantilever resonator for a specified range of resonant frequency and a specific quality factor. The optimization is carried out using a genetic algorithm implemented in MATLAB. The micro cantilever resonator is modelled in CoventorWare using the optimized dimensions obtained from the genetic algorithm, and the modelled cantilever is analysed for its resonance frequency.
Keywords: MEMS resonator, genetic algorithm, modelling and simulation, optimization
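A toy version of the optimization step can be sketched with a real-coded GA driving a textbook Euler-Bernoulli cantilever frequency formula. The paper itself uses MATLAB and CoventorWare; the material constants, dimension bounds, and GA settings below are assumptions for illustration.

```python
import math
import random

# Assumed polysilicon-like material constants (not from the paper).
E, RHO = 160e9, 2330.0   # Young's modulus (Pa), density (kg/m^3)

def resonant_freq(length, thickness):
    """First flexural mode of a rectangular cantilever (Euler-Bernoulli):
    f = (1.875^2 / (2*pi*L^2)) * sqrt(E*I / (rho*A)), with I/A = t^2/12."""
    return (1.875 ** 2 / (2 * math.pi * length ** 2)) * math.sqrt(
        E * thickness ** 2 / (12 * RHO))

def cost(ind, target):
    return abs(resonant_freq(*ind) - target)

def tournament(pop, target, rng, k=3):
    return min(rng.sample(pop, k), key=lambda i: cost(i, target))

def genetic_search(target, bounds, rng, pop_size=40, gens=60):
    """Minimal real-coded GA: tournament selection, blend crossover,
    and gaussian mutation clipped to the bounds."""
    def clip(v, lo, hi):
        return min(max(v, lo), hi)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a = tournament(pop, target, rng)
            b = tournament(pop, target, rng)
            w = rng.random()
            child = [clip(w * x + (1 - w) * y + rng.gauss(0.0, 0.02 * (hi - lo)), lo, hi)
                     for x, y, (lo, hi) in zip(a, b, bounds)]
            nxt.append(child)
        pop = nxt
    return min(pop, key=lambda i: cost(i, target))

rng = random.Random(3)
bounds = [(50e-6, 500e-6), (1e-6, 5e-6)]   # length and thickness, metres
best = genetic_search(100e3, bounds, rng)   # target resonant frequency: 100 kHz
```

The real design problem also constrains the quality factor, which would simply add a second term to `cost`; the found dimensions would then be handed to an FEM tool (CoventorWare in the paper) for verification.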
Procedia PDF Downloads 554
28563 Seasonal Variability of the Price and Quality of Fresh Red Porgy Fish Sold in the Local Market of Igoumenitsa, NW Greece
Authors: C. Nathanailides, P. Logothetis, G. Kanlis, S. Anastasiou, L. Kokokiris, P. Mpeza
Abstract:
Farmed red porgy (Pagrus pagrus) is one of the “new candidate fish species” for the diversification of Mediterranean aquaculture, which is predominantly based on the cultivation of the European sea bass (Dicentrarchus labrax) and the gilthead sea bream (Sparus aurata). The quality of farmed red porgy was investigated with samples obtained from the local fish market in the region of Igoumenitsa, NW Greece. Samples of the fish (ungutted and with scales) were purchased from three local fishmongers and transported to the laboratory within a few minutes in foamed polystyrene boxes on ice. The average weight of the whole fish ranged between 271-289 g. A sample of the fish flesh taken from the upper epaxial region was transferred aseptically to a stomacher bag containing sterile buffered peptone water solution (0.1%) and homogenized. After serial dilutions in 0.1% peptone water, the homogenates were spread on the surface of agar plates. Total viable counts (TVC) were determined using plate count agar after incubation at 30 °C for 3 days. The quality attributes monitored during the present work included the bacterial load (total mesophilic counts) and the pH of the flesh. There was a marginal increase in the price of fresh red porgy sold during the summer, with prices ranging, over a period of four seasons, from 5.85 to 7.5 per kilo. The results of the microbiological analysis indicate that, with the exception of the summer samples (which exhibited 5.23 (±0.13) log cfu/g), the bacterial load remained well below the legal limits, at around 3.1 log cfu/g. The pH values varied between 6.54 and 6.69. The results indicate a possible influence of season on the bacterial load of fish sold in the market. Nevertheless, the parameters investigated in the present work indicate that the bacterial load was well below the legal limit and that the fish were sold within a few days after harvesting. The peak of bacterial load in the summer samples may be a result of post-harvest contamination of the farmed fish and temperature fluctuations during handling and transportation.
Keywords: fish quality, marketing, aquaculture, Pagrus pagrus
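The plate-count arithmetic behind log cfu/g values like those reported above can be sketched briefly; the colony count, dilution, and plated volume in the example are hypothetical.

```python
import math

def log_cfu_per_g(colonies, dilution_factor, plated_ml):
    """log10 CFU per gram from a plate count: dilution_factor is the overall
    homogenate dilution on the counted plate (including the initial 1:10
    homogenisation), plated_ml the volume spread on the plate."""
    return math.log10(colonies / (plated_ml * dilution_factor))

# Hypothetical count: 125 colonies on the 1e-3 dilution plate, 0.1 mL spread.
tvc = log_cfu_per_g(125, 1e-3, 0.1)
```

For scale, the summer value of 5.23 log cfu/g reported above corresponds to roughly 1.7 x 10^5 CFU per gram of flesh.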
Procedia PDF Downloads 684
28562 Comparative Analysis of the Third Generation of Research Data for Evaluation of Solar Energy Potential
Authors: Claudineia Brazil, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Rafael Haag
Abstract:
Renewable energy sources are dependent on climatic variability, so for adequate energy planning, observations of the meteorological variables are required, preferably as long-period series. Despite the scientific and technological advances that meteorological measurement systems have undergone in recent decades, there is still a considerable lack of meteorological observations forming long-period series. Reanalysis is a data assimilation system built with general atmospheric circulation models, based on the combination of data collected at surface stations, ocean buoys, satellites, and radiosondes, allowing the production of long-period data for a wide range of variables. The third generation of reanalysis data emerged in 2010; among them is the Climate Forecast System Reanalysis (CFSR) developed by the National Centers for Environmental Prediction (NCEP), whose data have a spatial resolution of 0.5° x 0.5°. To overcome these difficulties, this study aims to evaluate the performance of solar radiation estimation from alternative data bases, such as reanalysis data and data from meteorological satellites, which can satisfactorily compensate for the absence of solar radiation observations at the global and/or regional level. The analysis of the solar radiation data indicated that the reanalysis data of the CFSR model presented good performance in relation to the observed data, with a coefficient of determination around 0.90. Therefore, it is concluded that these data have the potential to be used as an alternative source at locations without stations or without long series of solar radiation observations, which is important for the evaluation of solar energy potential.
Keywords: climate, reanalysis, renewable energy, solar radiation
Procedia PDF Downloads 21128561 Modeling Atmospheric Correction for Global Navigation Satellite System Signal to Improve Urban Cadastre 3D Positional Accuracy Case of: TANA and ADIS IGS Stations
Authors: Asmamaw Yehun
Abstract:
“TANA” is the name of an International GNSS Service (IGS) Global Positioning System (GPS) station located at the Institute of Land Administration (ILA), Bahir Dar University; the station takes its name from Lake Tana, one of the largest lakes in Africa. ILA is part of Bahir Dar University, located in Bahir Dar, the capital of the Amhara National Regional State, and is the first institute of its kind in East Africa. The station was installed in cooperation between ILA and the Swedish International Development Agency (SIDA), with SIDA funding support. A Continuously Operating Reference Station (CORS) network provides global navigation satellite system data to support three-dimensional positioning, meteorology, space weather, and geophysical applications throughout the globe. TANA has served as a CORS since 2013; such sites are independently owned and operated by governments, research and education facilities, and others. The data collected by the reference station can be downloaded over the Internet for post-processing by interested parties who carry out GNSS measurements and want to achieve higher accuracy. We made a first observation on TANA on May 29th, 2013, using Leica 1200 receivers and AX1202GG antennas, observing from 11:30 until 15:20, about 3 h 50 min. The data were processed in CSRS-PPP, an automatic post-processing service by Natural Resources Canada (NRCan). Post-processing was done on June 27th, 2013, 30 days after observation, so the precise ephemeris was available. We found Latitude (ITRF08): 11 34 08.6573 (dms) / 0.008 (m), Longitude (ITRF08): 37 19 44.7811 (dms) / 0.018 (m), and Ellipsoidal Height (ITRF08): 1850.958 (m) / 0.037 (m). We compared this result with GAMIT/GLOBK-processed data and found very close, accurate agreement. TANA became Ethiopia's second IGS station in 2015 and has operated since, providing data for civilian users, researchers, and governmental and non-governmental users.
TANA is equipped with an advanced choke-ring antenna and a Leica GR25 receiver, and the site offers very good satellite visibility. To test the hydrostatic and wet zenith delays against positional data quality, we used GAMIT/GLOBK and found that TANA is the most accurate IGS station in East Africa. Owing to lower tropospheric zenith and ionospheric delays, the TANA and ADIS IGS stations achieve 3D positional accuracies of 2 and 1.9 meters, respectively.Keywords: atmosphere, GNSS, neutral atmosphere, precipitable water vapour
Procedia PDF Downloads 10228560 Data Mining Spatial: Unsupervised Classification of Geographic Data
Authors: Chahrazed Zouaoui
Abstract:
In recent years, the volume of geospatial information has been increasing with the evolution of communication and information technologies; this information is often handled by geographic information systems (GIS) and stored in spatial databases (BDS). Classical data mining shows a weakness in knowledge extraction from these enormous amounts of data due to the particularity of spatial entities, which are characterized by interdependence (the first law of geography). This gave rise to spatial data mining, a process of analyzing geographic data that allows the extraction of knowledge and spatial relationships from geospatial data. Among the methods of this process, we distinguish monothematic and multithematic ones; geo-clustering, one of the main tasks of spatial data mining, belongs to the monothematic category. It groups similar geo-spatial entities into the same class and assigns more dissimilar ones to different classes; in other words, it maximizes intra-class similarity and minimizes inter-class similarity, taking account of the particularity of geo-spatial data. Two approaches to geo-clustering exist: dynamic processing of the data, which applies algorithms designed for the direct treatment of spatial data, and an approach based on pre-processing the spatial data, which applies classic clustering algorithms to pre-processed data (with spatial relationships integrated).
This pre-treatment-based approach is quite complex in several cases, so the search for approximate solutions involves the use of approximation algorithms. Among these, we are interested in dedicated approaches (partitioning-based and density-based clustering methods) and in the bees algorithm (a biomimetic approach). Our study proposes a design for this problem that uses different algorithms to automatically detect geo-spatial neighborhoods in order to implement geo-clustering by pre-treatment, applying the bees algorithm to this problem for the first time in the geo-spatial field.Keywords: mining, GIS, geo-clustering, neighborhood
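To make the geo-clustering objective above concrete (maximize intra-class similarity, minimize inter-class similarity), here is a toy sketch using plain k-means on point coordinates. This is not the bees algorithm the abstract proposes; it only demonstrates the objective on hypothetical spatial data.

```python
# Illustrative sketch of the geo-clustering objective (NOT the paper's bees
# algorithm): a plain k-means on (x, y) coordinates groups nearby geo-spatial
# entities into the same class. Toy data, deterministic seeding.
import math

def kmeans(points, k, iters=20):
    centers = points[:k]  # deterministic seed: first k points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # keep old center if a cluster empties out
                centers[i] = (sum(x for x, _ in cl) / len(cl),
                              sum(y for _, y in cl) / len(cl))
    return centers, clusters

# Two well-separated toy "geographic" groups of (x, y) coordinates
pts = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 5.0), (5.1, 4.9)]
centers, clusters = kmeans(pts, k=2)
print(sorted(len(c) for c in clusters))  # each spatial group forms one class
```

A dedicated geo-clustering method would replace the Euclidean `math.dist` with a spatial-neighborhood relation detected from the data, which is exactly the pre-treatment step the abstract describes.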
Procedia PDF Downloads 37628559 Hyperchaos-Based Video Encryption for Device-To-Device Communications
Authors: Samir Benzegane, Said Sadoudi, Mustapha Djeddou
Abstract:
In this paper, we present a software implementation of video-streaming encryption for Device-to-Device (D2D) communications using a Hyperchaos-based Random Number Generator (HRNG) implemented in C#. The software uses the proposed HRNG to generate the key stream for encrypting and decrypting real-time video data. The HRNG is based on a hyperchaotic Lorenz system, which produces four signal outputs taken as encryption keys. The generated keys exhibit high-quality randomness, confirmed by passing the standard NIST statistical tests. Security analysis of the proposed encryption scheme confirms its robustness against different attacks.Keywords: hyperchaos Lorenz system, hyperchaos-based random number generator, D2D communications, C#
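The scheme described, a four-output hyperchaotic Lorenz system driving a keystream cipher, can be sketched as below. This is an illustration, not the paper's C# implementation: the system parameters, step size, warm-up length, and byte-quantization rule are all assumptions chosen for the demo, since the abstract does not specify them.

```python
# Illustrative sketch (not the paper's C# implementation): a keystream cipher
# driven by a 4-D hyperchaotic Lorenz system, as the abstract describes.
# Parameters (a, b, c, r), dt, and the byte-quantization rule are assumptions.

def hyperchaos_keystream(n_bytes, state=(1.0, 1.0, 1.0, 1.0),
                         a=10.0, b=8.0 / 3.0, c=28.0, r=-1.0, dt=0.001):
    """Generate n_bytes of keystream from a hyperchaotic Lorenz trajectory."""
    x, y, z, w = state

    def step(x, y, z, w):  # forward-Euler integration step
        dx = a * (y - x) + w
        dy = c * x - y - x * z
        dz = x * y - b * z
        dw = -y * z + r * w
        return (x + dt * dx, y + dt * dy, z + dt * dz, w + dt * dw)

    for _ in range(1000):            # discard the transient (warm-up)
        x, y, z, w = step(x, y, z, w)
    out = bytearray()
    for _ in range(n_bytes):
        for _ in range(10):          # several steps per byte to decorrelate
            x, y, z, w = step(x, y, z, w)
        out.append(int(abs(x + y + z + w) * 1e5) % 256)
    return bytes(out)

def xor_crypt(data, seed=(1.0, 1.0, 1.0, 1.0)):
    """Encrypt/decrypt: XOR with the keystream (same call for both ways)."""
    ks = hyperchaos_keystream(len(data), state=seed)
    return bytes(d ^ k for d, k in zip(data, ks))

frame = b"video frame payload"
cipher = xor_crypt(frame)
print(xor_crypt(cipher) == frame)
```

Because the keystream is a deterministic function of the initial state, sender and receiver sharing the same seed recover identical keystreams, which is what makes the XOR step reversible.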
Procedia PDF Downloads 38028558 Comparative Study on Sensory Profiles of Liquor from Different Dried Cocoa Beans
Authors: Khairul Bariah Sulaiman, Tajul Aris Yang
Abstract:
Malaysian dried cocoa beans have been reported to have low flavour quality and are often sold at discounted prices. Various efforts have been made to improve the quality of Malaysian beans, among them the introduction of the shallow-box fermentation technique and pulp preconditioning through pod storage. However, nearly four decades after these efforts began, Malaysian cocoa farmers still receive lower prices for their beans. This study was therefore carried out to assess the flavour quality of dried cocoa beans produced by the shallow-box fermentation technique, and by shallow-box fermentation combined with pod storage, compared with dried cocoa beans obtained from Ghana. A total of eight samples of dried cocoa beans were used in this study; one was Ghanaian beans (coded no. 8), while the rest were Malaysian cocoa beans with different post-harvest processing (coded nos. 1-7). Cocoa liquor was prepared from all samples using the prescribed techniques, and sensory evaluation was carried out using the Quantitative Descriptive Analysis (QDA) method on a 0-10 scale by trained Malaysian Cocoa Board panelists. Sensory evaluation showed that the cocoa attribute for all cocoa liquors ranged from 3.5 to 5.3, bitterness from 3.4 to 4.6, and astringency from 3.9 to 5.5. Meanwhile, all cocoa liquors had an acid or sourness attribute ranging from 1.6 to 3.6. In general, the cocoa liquor prepared from sample no. 4 had a flavour profile similar to the Ghanaian sample, with no significant difference at p < 0.05 in most flavour attributes, compared to the other six samples.Keywords: cocoa beans, flavour, fermentation, shallow box, pods storage
Procedia PDF Downloads 39728557 Effects of Acupuncture Treatment in Gait Parameters in Parkinson's Disease
Authors: Catarina Isabel Ramos Pereira, Jorge Machado, Begona Alonso Criado, Maria João Santos
Abstract:
Introduction: Gait disorders are among the symptoms with severe implications for quality of life in Parkinson's disease (PD). Currently, there is no therapy to reverse or treat this condition. None of the drugs used in conventional medical treatment is entirely efficient, and all have a high incidence of side effects. Acupuncture treatment is believed to improve motor ability, but there is still little scientific evidence in individuals with PD. Aim: To investigate the acute effect of acupuncture on gait parameters in Parkinson's disease. Methods: This is a randomized and controlled crossover study. Each patient took part in both the experimental (real acupuncture) and control (sham acupuncture) conditions, with the sequence randomized. Gait parameters were measured at two moments, before and after treatment, using four force platforms as well as 3D marker positions captured by 11 cameras. Images were quantitatively analyzed using Qualisys Track Manager software, which allowed us to extract data on gait quality and balance. Seven patients with a diagnosis of Parkinson's disease were included in the study. Results: Statistically significant differences were found in gait speed (p = 0.016), gait cadence (p = 0.006), support base width (p = 0.0001), medio-lateral oscillation (p = 0.017), left-right step length (p = 0.0002), stride length: right-right (p = 0.0000) and left-left (p = 0.0018), time of left support phase (p = 0.029), right support phase (p = 0.025), and double support phase (p = 0.015), between the initial and final moments for the experimental group. Differences in right-left stride length were found for both groups. Conclusion: Our results show that acupuncture could enhance gait in Parkinson's disease patients. Further research involving a larger number of volunteers should be conducted to validate these encouraging findings.Keywords: acupuncture, traditional Chinese medicine, Parkinson's disease, gait
Procedia PDF Downloads 17428556 Experimental Study and Evaluation of Farm Environmental Monitoring System Based on the Internet of Things, Sudan
Authors: Farid Eltom A. E., Mustafa Abdul-Halim, Abdalla Markaz, Sami Atta, Mohamed Azhari, Ahmed Rashed
Abstract:
Smart environment sensors integrated with Internet of Things (IoT) technology can provide a new approach to tracking, sensing, and monitoring objects in the environment. The aim of the study is to evaluate a farm environmental monitoring system based on IoT, to realize automated agricultural management and implement precision production. Until now, irrigation monitoring in Sudan has been carried out using traditional methods, a costly and unreliable mechanism. By utilizing soil moisture sensors, however, irrigation can be conducted only when needed, without risking plant water stress. The results showed that the software application allows farmers to display current and historical data on soil moisture and nutrients in the form of line charts. The system measures the soil factors moisture, humidity, electrical conductivity, temperature, pH, phosphorus, and potassium; these factors, together with a timestamp, are sent to the data server using the LoRaWAN interface. It is widely agreed in the modern era that artificial intelligence can arrange the procedures necessary to take care of the terrain, predict the quality and quantity of production through deep analysis of the various operations in agricultural fields, and support the monitoring of weather conditions.Keywords: smart environment, monitoring systems, IoT, LoRa Gateway, center pivot
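The uplink the abstract describes, soil readings bundled with a timestamp for the data server, plus a moisture-threshold irrigation decision, can be sketched as follows. Field names and the 30% moisture threshold are illustrative assumptions, not values from the deployed system.

```python
# Minimal sketch of the uplink described above: bundle soil readings with a
# timestamp for the data server, and irrigate only when moisture is low.
# Field names and the threshold are illustrative assumptions.
import json
import time

def build_payload(readings, ts=None):
    """Package sensor readings (dict) with a timestamp for the data server."""
    return json.dumps({"ts": ts if ts is not None else int(time.time()),
                       **readings})

def needs_irrigation(moisture_pct, threshold=30.0):
    """Irrigate only when soil moisture falls below the threshold (percent)."""
    return moisture_pct < threshold

sample = {"moisture": 22.5, "humidity": 41.0, "ec": 1.8, "temp": 29.1,
          "ph": 6.4, "p": 14.0, "k": 88.0}
payload = build_payload(sample, ts=1700000000)
print(needs_irrigation(sample["moisture"]))
```

In a real deployment this JSON payload would be serialized into the LoRaWAN frame; server-side, the timestamped records are what feed the line charts of current and historical soil data.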
Procedia PDF Downloads 5028555 Analysis and Prediction of Netflix Viewing History Using Netflixlatte as an Enriched Real Data Pool
Authors: Amir Mabhout, Toktam Ghafarian, Amirhossein Farzin, Zahra Makki, Sajjad Alizadeh, Amirhossein Ghavi
Abstract:
The high number of Netflix subscribers makes it attractive for data scientists to extract valuable knowledge from viewers' behavioural analyses. This paper presents a set of statistical insights into viewers' viewing history. A deep learning model is then used to predict users' future watching behaviour based on their previous watching history within the Netflixlatte data pool. Netflixlatte is an aggregated and anonymized data pool of 320 Netflix viewers with about 250,000 data points recorded between 2008 and 2022. We observe insightful correlations between the distribution of viewing time and the COVID-19 pandemic outbreak. The presented deep learning model predicts future movie and TV series viewing habits with an average loss of 0.175.Keywords: data analysis, deep learning, LSTM neural network, netflix
Procedia PDF Downloads 26428554 Analysis of User Data Usage Trends on Cellular and Wi-Fi Networks
Authors: Jayesh M. Patel, Bharat P. Modi
Abstract:
Measurements of data usage on mobile devices have demonstrated that the total data demand from users is far higher than previously articulated by measurements based solely on a cellular-centric view of smartphone usage. The ratio of Wi-Fi to cellular traffic varies significantly between countries. This paper presents a comparison between users' cellular data usage and Wi-Fi data usage. This comparison helps operators understand the growing importance and application of yield-management strategies designed to squeeze maximum returns from their investments in the networks and devices that enable the mobile data ecosystem. The transition from unlimited data plans towards tiered pricing and, in the future, towards more value-centric pricing offers significant revenue upside potential for mobile operators; but without complete insight into all aspects of smartphone customer behavior, operators are unlikely to capture the maximum return from this billion-dollar market opportunity.Keywords: cellular, Wi-Fi, mobile, smart phone
Procedia PDF Downloads 37128553 Audit Outcome Cardiac Arrest Cases (2019-2020) in Emergency Department RIPAS Hospital, Brunei Darussalam
Authors: Victor Au, Khin Maung Than, Zaw Win Aung, Linawati Jumat
Abstract:
Background & Objectives: Cardiac arrests can occur anywhere and at any time, and most cases are brought to the emergency department, except those that happen in an in-patient setting. Raja Isteri Pengiran Anak Saleha (RIPAS) Hospital is the only tertiary government hospital; it is located in the Brunei-Muara district and receives referrals from the other districts of Brunei. Data on cardiac arrests in Brunei Darussalam are scattered between Emergency Medical Ambulance Services (EMAS), the Emergency Department (ED), general in-patient wards, and the Intensive Care Unit (ICU). In this audit, we focused only on cardiac arrest cases that happened in or presented to the emergency department of RIPAS Hospital. The objectives of this audit were to describe the demographics of cardiac arrest cases and the survival-to-discharge rates of In-Hospital Cardiac Arrest (IHCA) and Out-of-Hospital Cardiac Arrest (OHCA). Methodology: This retrospective audit covered all cardiac arrest cases that underwent Cardiopulmonary Resuscitation (CPR) in the ED of RIPAS Hospital, Brunei-Muara, in 2019-2020. All cardiac arrest cases that happened in or were brought to the emergency department were included. The relevant data were retrieved from the ED visit registry book and the electronic medical record "Bru-HIMS" with the keyword diagnosis "cardiac arrest". Data were analyzed and tabulated using Excel. Results: 313 cardiac arrests were recorded in the emergency department in 2019-2020; 92% were categorized as OHCA and the remaining 8% as IHCA. The majority of cases were male, aged between 50 and 60 years. In the OHCA subgroup, only 12.4% received bystander CPR, and 0.4% received Automated External Defibrillator (AED) treatment before emergency medical personnel arrived. An initial shockable rhythm accounted for 12% of the IHCA group compared with 4.9% of the OHCA group. Regarding ED resuscitation outcomes, 32% of the IHCA group achieved return of spontaneous circulation (ROSC), with a survival-to-discharge rate of 16%. In the OHCA group, 12.35% achieved ROSC, but unfortunately none survived to discharge. Conclusion: A standardized registry for cardiac arrest in the emergency department is required to provide valid baseline data to measure the quality and outcome of cardiac arrest care. The zero survival rate for out-of-hospital cardiac arrest is very concerning and might represent a significant breach in the cardiac arrest chain of survival. Systematic prospective data collection is needed to identify contributing factors and improve resuscitation outcomes.Keywords: cardiac arrest, OHCA, IHCA, resuscitation, emergency department
Procedia PDF Downloads 10828552 Implementation of Risk Management System to Improve the Quality of Higher Education Institutes
Authors: Muhammad Wasif, Asif Ahmed Shaikh, Sarosh Hashmat Lodi, Muhammad Aslam Bhutto, Riazuddin
Abstract:
Risk management systems have been quite popular in profit-based organizations and in the health and safety and project management fields for the last few decades. Due to the rapidly changing environment and the requirements of the ISO 9001:2015 standard, public-sector institutions, especially higher education institutes, are also performing risk assessment to monitor institutional performance and align it with the latest benchmarks. In this context, NED University of Engineering and Technology carried out research and developed a Standard Operating Procedure (SOP) for risk assessment, monitoring, and control. In this research, risks are divided into four sources: internal academic risks, external academic risks, internal non-academic risks, and external non-academic risks. Risks are identified by management at all levels. The severity and likelihood of each risk are assigned based on previous audit results and customer complaints. Risk ratings are calculated to rank the risks, and controls are designed for each risk and assigned to a responsible person. At the end of the article, results and analysis of the different sources of risk are discussed in detail and conclusions are drawn; discussion of a few sample risks is also presented. The research shows that a risk management system can be applied in a higher education institute to effectively control risks that might affect the scope and quality management system of the organization.Keywords: higher education, quality management system, risk assessment, risk management
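The rating-and-ranking step the abstract outlines can be sketched as below: each risk is scored for severity and likelihood (here on an assumed 1-5 scale), rated as severity x likelihood, and sorted so controls go to the highest-rated risks first. The scale and the sample risks are illustrative, not the SOP's actual entries.

```python
# Minimal sketch of the risk-rating step described above: rating = severity x
# likelihood (assumed 1-5 scales), then rank risks so controls can be assigned
# to the highest-rated ones first. Entries below are illustrative.

def risk_rating(severity, likelihood):
    return severity * likelihood

risks = [  # (name, source, severity, likelihood): hypothetical examples
    ("Curriculum outdated", "Internal Academic", 4, 3),
    ("Accreditation change", "External Academic", 5, 2),
    ("Lab equipment failure", "Internal Non-academic", 3, 4),
    ("Funding cut", "External Non-academic", 5, 3),
]

ranked = sorted(risks, key=lambda r: risk_rating(r[2], r[3]), reverse=True)
for name, source, s, l in ranked:
    print(f"{risk_rating(s, l):>2}  {name} ({source})")
```

In the SOP described, the severity and likelihood inputs would come from previous audit results and customer complaints rather than being assigned by hand.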
Procedia PDF Downloads 31728551 Data Driven Infrastructure Planning for Offshore Wind farms
Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree
Abstract:
The calculations made at the beginning of a wind farm's life are rarely reliable, which makes it important to study the failure and repair rates of wind turbines under various conditions. The miscalculation happens because current models make the simplifying assumption that the failure/repair rate remains constant over time, which means the reliability function is exponential in nature. This research aims to create a more accurate model using sensor data and a data-driven approach. Data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data, which is then converted to times-to-repair and times-to-failure time series. Several mathematical functions are fitted to the times-to-failure and times-to-repair data of the wind turbine components using maximum likelihood estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical results. Further analysis is being done using complex-system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease turbine downtime.Keywords: reliability, bayesian parameter inference, maximum likelihood estimation, weibull function, SCADA data
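The maximum-likelihood fit of a two-parameter Weibull to times-to-failure data, one of the methods the abstract names, can be sketched as follows. The shape parameter is found from the standard profile-likelihood equation by bisection; the times-to-failure below are synthetic (the study's SCADA-derived series are not reproduced here).

```python
# Sketch of a two-parameter Weibull MLE on times-to-failure data. The shape k
# solves the profile equation sum(x^k ln x)/sum(x^k) - 1/k - mean(ln x) = 0
# (g below is increasing in k, so bisection finds the unique root); the scale
# then follows in closed form. Synthetic data, not the study's SCADA series.
import math
import random

def fit_weibull_mle(x, lo=0.01, hi=50.0, tol=1e-9):
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / len(x)

    def g(k):  # profile-likelihood score for the shape parameter
        s = sum(v ** k for v in x)
        sl = sum((v ** k) * math.log(v) for v in x)
        return sl / s - 1.0 / k - mean_log

    while hi - lo > tol:           # bisection on the increasing function g
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return k, lam

random.seed(42)
ttf = [random.weibullvariate(100.0, 2.0) for _ in range(500)]  # synthetic TTF
k, lam = fit_weibull_mle(ttf)
print(round(k, 2), round(lam, 1))
```

A fitted shape k near 1 would support the constant-rate (exponential) assumption the abstract questions; k well away from 1 indicates an age-dependent failure rate, which is the case the data-driven model is meant to capture.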
Procedia PDF Downloads 8928550 A Post-Occupancy Evaluation of the Impact of Indoor Environmental Quality on Health and Well-Being in Office Buildings
Authors: Suyeon Bae, Abimbola Asojo, Denise Guerin, Caren Martin
Abstract:
Post-occupancy evaluations (POEs) have been recognized for documenting occupant well-being and responses to indoor environmental quality (IEQ) factors such as thermal, lighting, and acoustic conditions. The Sustainable Post-Occupancy Evaluation Survey (SPOES), developed by an interdisciplinary team at a Midwest university, provides an evidence-based quantitative analysis of occupants’ satisfaction in office, classroom, and residential spaces to help direct attention to successful areas and areas that need improvement in buildings. SPOES is a self-administered, Internet-based questionnaire completed by building occupants. In this study, employees in three office buildings rated their satisfaction with 12 IEQ criteria: thermal conditions, indoor air quality, acoustic quality, daylighting, electric lighting, privacy, view conditions, furnishings, appearance, cleaning and maintenance, vibration and movement, and technology. Employees rated their level of satisfaction on a Likert-type scale from 1 (very dissatisfied) to 7 (very satisfied). They also rated the influence of their physical environment on their perception of their work performance and the impact of their primary workspace on their health on a scale from 1 (hinders) to 7 (enhances). Building A is a three-story building that includes private and group offices, classrooms, and conference rooms, amounting to 55,000 square feet of primary workplace (N=75). Building B, a six-story building, consists of private offices, shared enclosed offices, workstations, and open desk areas for employees, amounting to 14,193 square feet (N=75). Building C is a three-story, 56,000-square-foot building that includes classrooms, therapy rooms, an outdoor playground, a gym, restrooms, and training rooms for clinicians (N=76).
The results indicated that for Building A, 10 IEQ factors, all except acoustic quality and privacy, showed statistically significant correlations with the impact of the primary workspace on health. In Building B, 11 IEQ factors, all except technology, showed statistically significant correlations with the impact of the primary workspace on health. Building C had statistically significant correlations between all 12 IEQ factors and the employees' perception of the impact of their primary workspace on their health in two-tailed correlations (p ≤ 0.05). Of the 33 statistically significant correlations, 25 (76%) showed at least a moderate relationship (r ≥ 0.35). Across the three buildings, the daylighting, furnishings, and indoor air quality IEQs ranked highest in their impact on health. The IEQs for vibration and movement, view conditions, and electric lighting ranked second, followed by cleaning and maintenance and appearance. These results imply that the 12 IEQs developed in SPOES are strongly related to employees' perception of how their primary workplaces impact their health, and they offer an opportunity for improving occupants' well-being and the built environment.Keywords: post-occupancy evaluation, built environment, sustainability, well-being, indoor air quality
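The correlation step behind these results, a Pearson r between an IEQ satisfaction rating and the perceived health impact, both on 1-7 Likert-type scales, can be sketched as below. The paired ratings are illustrative, not survey data from the study.

```python
# Sketch of the correlation analysis described above: Pearson r between one
# IEQ satisfaction rating and the perceived health impact (both 1-7 scales),
# with r >= 0.35 read as "at least moderate". Ratings below are illustrative.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

daylighting = [6, 5, 7, 4, 6, 3, 5, 7, 2, 6]   # satisfaction, 1-7
health      = [6, 5, 6, 4, 7, 3, 4, 7, 3, 5]   # perceived health impact, 1-7

r = pearson_r(daylighting, health)
print(round(r, 2), "moderate or stronger" if r >= 0.35 else "weak")
```

The study additionally applies a two-tailed significance test (p ≤ 0.05) to each r before counting it among the 33 significant correlations; that test is omitted from this sketch.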
Procedia PDF Downloads 29228549 Catalytic Pyrolysis of Sewage Sludge for Upgrading Bio-Oil Quality Using Sludge-Based Activated Char as an Alternative to HZSM5
Abstract:
Due to concerns about the depletion of fossil fuel sources and the deteriorating environment, investigating renewable energy production plays a crucial role in alleviating dependency on mineral fuels. One particular area of interest is the generation of bio-oil through sewage sludge (SS) pyrolysis. SS is a potential candidate, in contrast to other types of biomass, due to its availability and low cost. However, the presence of high-molecular-weight hydrocarbons and oxygenated compounds in SS bio-oil hinders some of its fuel applications. In this context, catalytic pyrolysis is an attainable route to upgrade bio-oil quality. Among the different catalysts (e.g., zeolites) studied for SS pyrolysis, activated chars (AC) are eco-friendly alternatives. The beneficial features of AC derived from SS comprise a comparatively large surface area, porosity, enriched surface functional groups, and a high content of metal species that can improve catalytic activity. Hence, a sludge-based AC catalyst was fabricated in a single-step pyrolysis reaction with NaOH as the activation agent and compared with HZSM5 zeolite in this study. The thermal decomposition and kinetics were investigated via thermogravimetric analysis (TGA) to guide and control the pyrolysis and catalytic pyrolysis and to design the pyrolysis setup. The results indicated that the pyrolysis and catalytic pyrolysis comprise four distinct stages, with the main decomposition reaction occurring in the range of 200-600°C. The Coats-Redfern method was applied to the 2nd and 3rd devolatilization stages to estimate the reaction order and activation energy (E) from the mass-loss data. The average activation energy (Em) values for the reaction orders n = 1, 2, and 3 were in the range of 6.67-20.37 kJ for SS, 1.51-6.87 kJ for HZSM5, and 2.29-9.17 kJ for AC, respectively.
According to the results, both AC and HZSM5 were able to improve the reaction rate of SS pyrolysis by reducing the Em value. Moreover, to generate bio-oil and examine the effect of the catalysts on its quality, a fixed-bed pyrolysis system was designed and implemented. The composition of the produced bio-oil was analyzed via gas chromatography/mass spectrometry (GC/MS). The selected SS-to-catalyst ratios were 1:1, 2:1, and 4:1; the optimum ratio in terms of cracking long-chain hydrocarbons and removing oxygen-containing compounds was 1:1 for both catalysts. The bio-oils upgraded with AC and HZSM5 were in the total range of C4-C17, with around 72% in the range of C4-C9. The bio-oil from non-catalytic pyrolysis of SS contained 49.27% oxygenated compounds, which dropped to 13.02% with AC and 7.3% with HZSM5. Meanwhile, the generation of benzene, toluene, and xylene (BTX) compounds was significantly improved in the catalytic process. Furthermore, the fabricated AC catalyst was characterized by BET, SEM-EDX, FT-IR, and TGA techniques. Overall, this research demonstrated that AC is an efficient catalyst for the pyrolysis of SS and can be used as a cost-competitive alternative to HZSM5.Keywords: catalytic pyrolysis, sewage sludge, activated char, HZSM5, bio-oil
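The Coats-Redfern linearization used above can be sketched for the n = 1 case: plotting y = ln[-ln(1 - a)/T^2] against 1/T gives a straight line of slope -E/R, from which the activation energy follows. The conversion data below are synthetic, generated from an assumed E of 15 kJ/mol with illustrative A and heating rate, not the study's TGA values.

```python
# Sketch of the Coats-Redfern method for reaction order n = 1:
#   ln[-ln(1 - a) / T^2] = ln(A*R/(beta*E)) - E/(R*T)
# so a least-squares line of y vs 1/T has slope -E/R. Data are synthetic,
# generated from assumed E = 15 kJ/mol, A = 0.5 1/s, beta = 10 K/min.
import math

R = 8.314  # gas constant, J/(mol K)

def coats_redfern_E(temps, alphas):
    """Least-squares fit of y vs 1/T; returns activation energy E in J/mol."""
    xs = [1.0 / T for T in temps]
    ys = [math.log(-math.log(1.0 - a) / T ** 2) for T, a in zip(temps, alphas)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope * R

# Synthetic conversion fractions following the n = 1 Coats-Redfern model
E_true, A, beta = 15000.0, 0.5, 10.0
temps = [500.0, 550.0, 600.0, 650.0, 700.0, 750.0, 800.0]
C = A * R / (beta * E_true)
alphas = [1.0 - math.exp(-C * T ** 2 * math.exp(-E_true / (R * T)))
          for T in temps]

print(round(coats_redfern_E(temps, alphas)))
```

Repeating this fit with the g(a) expressions for n = 2 and n = 3 is how the abstract obtains Em values for the other reaction orders.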
Procedia PDF Downloads 18428548 The Impact of Centralisation on Radical Prostatectomy Outcomes: Our Outcomes
Authors: Jemini Vyas, Oluwatobi Adeyoe, Jenny Branagan, Chandran Tanabalan, John Beatty, Aakash Pai
Abstract:
Introduction: The development of robotic surgery has accelerated centralisation to tertiary centres, where robotic radical prostatectomy (RP) is offered. The purpose of concentrating treatment in high-volume specialist centres is to improve the quality of care and patient outcomes. The aim of this study was to assess the impact of centralisation on clinical outcomes for locally diagnosed patients undergoing RP. Methods: Clinical outcomes for 169 consecutive laparoscopic and open RPs pre-centralisation were retrospectively compared with 50 consecutive robotic RPs conducted over a similar period post-centralisation. Preoperative risk stratification and time to surgery were collected. Perioperative outcomes, including length of stay (LOS) and complications, were collated. Postoperative outcomes, including erectile dysfunction (ED), biochemical recurrence (BCR), and urinary continence, were assessed. Results: Preoperative risk stratification showed no difference between the two groups. The median time from diagnosis to treatment was similar between the two groups (pre-centralisation, 121 days; post-centralisation, 117 days). The mean length of stay (pre-centralisation, 2.1 days; post-centralisation, 1.6 days) showed no significant difference (p=0.073). The proportion of overall complications (pre-centralisation, 11.4%; post-centralisation, 8.7%) and of complications above Clavien-Dindo grade 2 (pre-centralisation, 1.2%; post-centralisation, 2.2%) were similar between the two groups. Postoperative functional parameters, including continence and ED, were comparable. The five-year BCR-free rate was 78% for the pre-centralisation group and 79% for the post-centralisation group. Conclusion: For our cohort of patients, clinical outcomes have remained static during centralisation. It is imperative that centralisation is accompanied by increased capacity, streamlining of pathways, and training to ensure that improved quality of care is achieved.
Our institution has newly acquired a robot, and prospectively studying these data may support the reversal of centralisation for RP surgery.Keywords: prostate, cancer, prostatectomy, clinical
Procedia PDF Downloads 102