Search results for: geographic feature distribution
6046 An Approach for Vocal Register Recognition Based on Spectral Analysis of Singing
Authors: Aleksandra Zysk, Pawel Badura
Abstract:
Recognizing and controlling vocal registers during singing is a difficult task for beginner vocalists. Among other things, it requires identifying which part of the natural resonators is being used as a sound propagates through the body. Thus, an application has been designed allowing for sound recording, automatic vocal register recognition (VRR), and a graphical user interface providing real-time visualization of the signal and recognition results. Six spectral features are determined for each time frame and passed to a support vector machine classifier yielding a binary decision on the head or chest register assignment of the segment. The classification training and testing data have been recorded by ten professional female singers (soprano, aged 19-29) performing sounds in both chest and head registers. The classification accuracy exceeded 93% in each of various validation schemes. Apart from a hard two-class decision, the support vector classifier also returns information on the distance between a particular feature vector and the discrimination hyperplane in the feature space. Such information reflects the level of certainty of the vocal register classification in a fuzzy way. Thus, the designed recognition and training application is able to assess and visualize the continuous trend in singing in a user-friendly graphical mode, providing an easy way to control vocal emission.
Keywords: classification, singing, spectral analysis, vocal emission, vocal register
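As an illustration of the classification step described above, the sketch below shows how a support vector classifier can return both the hard chest/head decision and the signed distance to the discrimination hyperplane used as a fuzzy certainty measure. The six-column feature matrix, the linear kernel, and the certainty mapping are placeholders, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): an SVM that returns both
# the hard chest/head label and the signed distance to the hyperplane, which
# can be mapped to a fuzzy certainty value. The 6-column feature matrix is a
# placeholder for the six spectral features described in the abstract.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 6))           # 6 spectral features per frame (placeholder)
y_train = rng.integers(0, 2, size=200)        # 0 = chest register, 1 = head register

clf = SVC(kernel="linear").fit(X_train, y_train)

X_frame = rng.normal(size=(1, 6))             # features of a new time frame
label = clf.predict(X_frame)[0]               # hard two-class decision
distance = clf.decision_function(X_frame)[0]  # signed distance to the hyperplane
certainty = 1.0 / (1.0 + np.exp(-abs(distance)))  # simple fuzzy certainty mapping (assumption)
print(label, distance, certainty)
```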
Procedia PDF Downloads 305
6045 Advanced Hybrid Particle Swarm Optimization for Congestion and Power Loss Reduction in Distribution Networks with High Distributed Generation Penetration through Network Reconfiguration
Authors: C. Iraklis, G. Evmiridis, A. Iraklis
Abstract:
Renewable energy sources and distributed power generation units already play an important role in electrical power generation. A mixture of different technologies penetrating the electrical grid adds complexity to the management of distribution networks. High penetration of distributed power generation units creates node over-voltages, large power losses, unreliable power management, reverse power flow and congestion. This paper presents an optimization algorithm capable of reducing congestion and power losses, both described as a weighted-sum function. Two factors that describe congestion are proposed. An upgraded selective particle swarm optimization algorithm (SPSO) is used as a solution tool focusing on the technique of network reconfiguration. The upgraded SPSO algorithm is achieved with the addition of a heuristic algorithm specializing in the reduction of power losses, with several scenarios being tested. Results show significant improvement in the minimization of losses and congestion while achieving very small calculation times.
Keywords: congestion, distribution networks, loss reduction, particle swarm optimization, smart grid
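The abstract describes the objective only as a weighted sum of power losses and congestion without giving its exact form; a generic formulation under that assumption (the congestion index and weights below are illustrative, not the authors') is

\[
\min_{x}\; F(x) = w_1\,\frac{P_{loss}(x)}{P_{loss}^{\,0}} + w_2\,C(x), \qquad w_1 + w_2 = 1,
\]

where \(x\) is a candidate switch configuration produced by the SPSO, \(P_{loss}^{\,0}\) is the base-case loss, and \(C(x)\) aggregates the two proposed congestion factors (e.g., normalized line loadings), subject to radiality and voltage constraints.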
Procedia PDF Downloads 445
6044 Sensor Validation Using Bottleneck Neural Network and Variable Reconstruction
Authors: Somia Bouzid, Messaoud Ramdani
Abstract:
The success of any diagnosis strategy critically depends on the sensors measuring process variables. This paper presents a sensor fault detection and diagnosis method based on a Bottleneck Neural Network (BNN). The BNN approach is used as a statistical process control tool for drinking water distribution (DWD) systems to detect and isolate sensor faults. The variable reconstruction approach is very useful for sensor fault isolation; the method is validated in simulation on a nonlinear system: an actual drinking water distribution system. Several results are presented.
Keywords: fault detection, localization, PCA, NLPCA, auto-associative neural network
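A minimal sketch of the idea, assuming an auto-associative (bottleneck) network trained on normal-operation data; the architecture, the synthetic data, and the simplified isolation rule below are placeholders rather than the authors' design.

```python
# Minimal sketch (not the authors' architecture): an auto-associative bottleneck
# network trained to reproduce its inputs; the squared prediction error flags a
# fault and the per-variable residuals point to the faulty sensor.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 8))                    # normal-operation data from 8 sensors (placeholder)

bnn = MLPRegressor(hidden_layer_sizes=(6, 2, 6),  # 2-node bottleneck layer
                   activation="tanh", max_iter=3000, random_state=1)
bnn.fit(X, X)                                     # train the network to map X onto itself

X_test = X.copy()
X_test[:, 3] += 2.0                               # simulate a bias fault on sensor 3
residuals = X_test - bnn.predict(X_test)
spe = (residuals ** 2).sum(axis=1)                # squared prediction error per sample (detection)
faulty_sensor = (residuals ** 2).mean(axis=0).argmax()  # simplified isolation rule (assumption)
print(faulty_sensor)
```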
Procedia PDF Downloads 389
6043 Research on Modern Semiconductor Converters and the Usage of SiC Devices in the Technology Centre of Ostrava
Authors: P. Vaculík, P. Kaňovský
Abstract:
The following article presents the Technology Centre of Ostrava (TCO) in the Czech Republic. It describes the structure and main research areas realized by the project ENET - Energy Units for Utilization of Non-Traditional Energy Sources. More details are presented from the research program dealing with the transformation, accumulation, and distribution of electric energy. The Technology Centre has its own energy mix consisting of alternative fuel sources that use process gases from the storage part and also energy from the distribution network. The article focuses on the properties and application possibilities of SiC semiconductor devices for power semiconductor converters for photovoltaic systems.
Keywords: SiC, Si, technology centre of Ostrava, photovoltaic systems, DC/DC Converter, simulation
Procedia PDF Downloads 610
6042 Temperature-Dependent Barrier Characteristics of Inhomogeneous Pd/n-GaN Schottky Barrier Diodes Surface
Authors: K. Al-Heuseen, M. R. Hashim
Abstract:
The current-voltage (I-V) characteristics of Pd/n-GaN Schottky barriers were studied at temperatures above room temperature (300-470 K). The values of the ideality factor (n), zero-bias barrier height (φB0), flat barrier height (φBF) and series resistance (Rs) obtained from I-V-T measurements were found to be strongly temperature dependent; φB0 increases while n, φBF and Rs decrease with increasing temperature. The apparent Richardson constant was found to be 2.1x10^-9 Acm^-2K^-2 with a mean barrier height of 0.19 eV. After correction for barrier height inhomogeneities, by assuming a Gaussian distribution (GD) of the barrier heights, the Richardson constant and the mean barrier height were obtained as 23 Acm^-2K^-2 and 1.78 eV, respectively. The corrected Richardson constant was much closer to the theoretical value of 26 Acm^-2K^-2.
Keywords: electrical properties, Gaussian distribution, Pd-GaN Schottky diodes, thermionic emission
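For reference, the standard thermionic-emission relations behind the quantities reported above (not quoted from the paper) are

\[
I = AA^{*}T^{2}\exp\!\left(-\frac{q\varphi_{B0}}{kT}\right)\left[\exp\!\left(\frac{qV}{nkT}\right)-1\right],
\]

so the conventional Richardson plot is \(\ln(I_{0}/T^{2}) = \ln(AA^{*}) - q\varphi_{B0}/(kT)\). Assuming a Gaussian distribution of barrier heights with mean \(\bar{\varphi}_{B0}\) and standard deviation \(\sigma_{0}\), the apparent barrier height becomes

\[
\varphi_{ap} = \bar{\varphi}_{B0} - \frac{q\sigma_{0}^{2}}{2kT},
\]

and the modified Richardson plot

\[
\ln\!\left(\frac{I_{0}}{T^{2}}\right) - \frac{q^{2}\sigma_{0}^{2}}{2k^{2}T^{2}} = \ln(AA^{*}) - \frac{q\bar{\varphi}_{B0}}{kT}
\]

yields the corrected Richardson constant and mean barrier height.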
Procedia PDF Downloads 277
6041 Temporal Variation of Shorebirds Population in Two Different Mudflats Areas
Authors: N. Norazlimi, R. Ramli
Abstract:
A study was conducted to determine the diversity and abundance of shorebird species inhabiting the mudflat areas of Jeram Beach and Remis Beach, Selangor, Peninsular Malaysia. A direct observation technique (using binoculars and a video camera) was applied to record the presence of bird species at the sampling sites from August 2013 until July 2014. A total of 32 species of shorebird were recorded during both migratory and non-migratory seasons. Of these, eleven species (47.8%) are migrants, six species (26.1%) have both migrant and resident populations, four species (17.4%) are vagrants and two species (8.7%) are residents. The compositions of the birds differed significantly in all months (χ2=84.35, p<0.001). There is a significant difference in avian abundance between migratory and non-migratory seasons (Mann-Whitney, t=2.39, p=0.036). Avian abundance differed significantly between Jeram and Remis Beaches during migratory periods (t=4.39, p=0.001) but not during non-migratory periods (t=0.78, p=0.456). Shorebird diversity was also affected by the tidal cycle. There is a significant difference between high tide and low tide (Mann-Whitney, t=78.0, p<0.005). Frequency of disturbance also affected the shorebird distribution (Mann-Whitney, t=57.0, p=0.0134). Therefore, this study concluded that tides and disturbances are two factors affecting the temporal distribution of shorebirds in mudflat areas.
Keywords: biodiversity, distribution, migratory birds, direct observation
Procedia PDF Downloads 391
6040 Using Machine Learning to Classify Human Fetal Health and Analyze Feature Importance
Authors: Yash Bingi, Yiqiao Yin
Abstract:
Reduction of child mortality is an ongoing struggle and a commonly used factor in determining progress in the medical field. The under-5 mortality number is around 5 million worldwide, with many of the deaths being preventable. In light of this issue, Cardiotocograms (CTGs) have emerged as a leading tool to determine fetal health. By using ultrasound pulses and reading the responses, CTGs help healthcare professionals assess the overall health of the fetus to determine the risk of child mortality. However, interpreting the results of CTGs is time-consuming and inefficient, especially in underdeveloped areas where an expert obstetrician is hard to come by. Using a support vector machine (SVM) and oversampling, this paper proposes a model that classifies fetal health with an accuracy of 99.59%. To further explain the CTG measurements, an algorithm based on Randomized Input Sampling for Explanation (RISE) of Black-box Models was created, called Feature Alteration for explanation of Black Box Models (FAB), and the findings were compared to Shapley Additive Explanations (SHAP) and Local Interpretable Model Agnostic Explanations (LIME). This allows doctors and medical professionals to classify fetal health with high accuracy and determine which features were most influential in the process.
Keywords: machine learning, fetal health, gradient boosting, support vector machine, Shapley values, local interpretable model agnostic explanations
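A minimal sketch of the oversampling-plus-SVM step described above (placeholder data and default hyperparameters, not the paper's exact pipeline; the explanation algorithms are omitted):

```python
# Oversample the minority classes of an imbalanced CTG dataset, then train an SVM.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score
from sklearn.utils import resample

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 21))                            # placeholder CTG feature matrix
y = rng.choice([0, 1, 2], size=600, p=[0.8, 0.15, 0.05])  # imbalanced fetal-health classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

# Random oversampling: resample each class up to the size of the majority class
n_max = np.bincount(y_tr).max()
X_parts, y_parts = [], []
for c in np.unique(y_tr):
    Xc, yc = resample(X_tr[y_tr == c], y_tr[y_tr == c],
                      replace=True, n_samples=n_max, random_state=0)
    X_parts.append(Xc)
    y_parts.append(yc)
X_bal, y_bal = np.vstack(X_parts), np.concatenate(y_parts)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_bal, y_bal)
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
```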
Procedia PDF Downloads 144
6039 Verbal Prefix Selection in Old Japanese: A Corpus-Based Study
Authors: Zixi You
Abstract:
There are a number of verbal prefixes in Old Japanese. However, the selection or the compatibility of verbs and verbal prefixes is among the least investigated topics in the Old Japanese language. Unlike other types of prefixes, verbal prefixes in dictionaries are more often than not listed with very brief information such as 'unknown meaning' or 'rhythmic function only'. To fill in a part of this knowledge gap, this paper presents an exhaustive investigation based on the newly developed 'Oxford Corpus of Old Japanese' (OCOJ), which includes nearly all existing resources of the Old Japanese language, with detailed linguistic information in TEI-XML tags. In this paper, we propose the possibility that the following three prefixes, i-, sa-, ta- (with ta- being considered a variant of sa-), are relevant to split intransitivity in Old Japanese, with evidence that unaccusative verbs favor i- and that unergative verbs favor sa- (ta-). This might be undermined by the fact that transitives are also found to follow i-. However, with several manifestations of split intransitivity in Old Japanese discussed, the behavior of transitives in verbal prefix selection is no longer as surprising as it may seem when one looks at the selection of verbal prefixes in isolation. It is possible that there are one or more features that play essential roles in determining the selection of i-, and the attested transitive verbs happen to have these features. The data suggest that this feature is a sense of 'change' of location or state involved in the event denoted by the verb, which is a feature of typical unaccusatives. This is further discussed in relation to the 'affectedness' hierarchy. The presentation of this paper, which includes a brief demonstration of the OCOJ, is expected to be of interest to both specialists and general audiences.
Keywords: old Japanese, split intransitivity, unaccusatives, unergatives, verbal prefix selection
Procedia PDF Downloads 415
6038 Geospatial Analysis for Predicting Sinkhole Susceptibility in Greene County, Missouri
Authors: Shishay Kidanu, Abdullah Alhaj
Abstract:
Sinkholes in the karst terrain of Greene County, Missouri, pose significant geohazards, imposing challenges on construction and infrastructure development, with potential threats to lives and property. To address these issues, understanding the influencing factors and modeling sinkhole susceptibility is crucial for effective mitigation through strategic changes in land use planning and practices. This study utilizes geographic information system (GIS) software to collect and process diverse data, including topographic, geologic, hydrogeologic, and anthropogenic information. Nine key sinkhole-influencing factors, ranging from slope characteristics to proximity to geological structures, were carefully analyzed. The Frequency Ratio method establishes relationships between the attribute classes of these factors and sinkhole events, deriving class weights that indicate their relative importance. Weighted integration of these factors is accomplished using the Analytic Hierarchy Process (AHP) and the Weighted Linear Combination (WLC) method in a GIS environment, resulting in a comprehensive sinkhole susceptibility index (SSI) model for the study area. Employing the Jenks natural breaks classifier method, the SSI values are categorized into five distinct sinkhole susceptibility zones: very low, low, moderate, high, and very high. Validation of the model, conducted through the Area Under Curve (AUC) and Sinkhole Density Index (SDI) methods, demonstrates a robust correlation with the sinkhole inventory data. The prediction rate curve yields an AUC value of 74%, indicating a 74% validation accuracy. The SDI result further supports the success of the sinkhole susceptibility model. This model offers reliable predictions for the future distribution of sinkholes, providing valuable insights for planners and engineers in the formulation of development plans and land-use strategies. Its application extends to enhancing preparedness and minimizing the impact of sinkhole-related geohazards on both infrastructure and the community.
Keywords: sinkhole, GIS, analytical hierarchy process, frequency ratio, susceptibility, Missouri
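A minimal sketch of the frequency-ratio weighting and weighted-linear-combination steps (placeholder rasters and weights; the study's actual factor classes and AHP weights are not reproduced here):

```python
# Compute frequency-ratio weights for the classes of one factor raster and
# combine several factor rasters into a susceptibility index with AHP-style weights.
import numpy as np

def frequency_ratio(factor_classes, sinkhole_mask):
    """FR of each class = (% of sinkhole cells in class) / (% of all cells in class)."""
    fr = {}
    total_cells = factor_classes.size
    total_sinks = sinkhole_mask.sum()
    for c in np.unique(factor_classes):
        in_class = factor_classes == c
        pct_area = in_class.sum() / total_cells
        pct_sink = (sinkhole_mask & in_class).sum() / total_sinks
        fr[c] = pct_sink / pct_area if pct_area > 0 else 0.0
    return fr

rng = np.random.default_rng(0)
slope_classes = rng.integers(1, 5, size=(100, 100))        # placeholder classified raster
sinkholes = rng.random((100, 100)) < 0.02                   # placeholder sinkhole inventory
fr = frequency_ratio(slope_classes, sinkholes)
slope_fr = np.vectorize(fr.get)(slope_classes)              # map classes to their FR weights

# Weighted linear combination of the FR-weighted factor rasters;
# the AHP weights below are placeholders, not the paper's values.
factors = {"slope": slope_fr, "geology": slope_fr.copy()}   # add the other factors analogously
ahp_weights = {"slope": 0.6, "geology": 0.4}
ssi = sum(ahp_weights[name] * raster for name, raster in factors.items())
```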
Procedia PDF Downloads 74
6037 Methods of Livable Goal-Oriented Master Urban Design: A Case Study on Zibo City
Authors: Xiaoping Zhang, Fengying Yan
Abstract:
The implementation of the 'Urban Design Management Measures' requires that master urban design aim at creating livable urban space. However, to the best of our knowledge, existing research and practice in master urban design not only pay little attention to livable space but also face a number of problems, such as focusing on the image of the city, ignoring people-oriented design and lacking dynamic continuity. In order to make master urban design better guide the construction of the city, the paper firstly proposes a livable city hierarchy system to meet the needs of different groups of people and then constructs a framework of livable goal-oriented master urban design based on the theory of livable content and the ideological origin of people-oriented design. Secondly, the paper takes the master urban design practice of Zibo as a sample, puts forward design strategies of strengthening the pattern, improving the quality of space and shaping the features, and establishes a series of action plans based on the strategy of urban space development. Finally, the paper explores the method system of livable goal-oriented master urban design from the aspects of safety pattern, morphology pattern, neighborhood scale, open space, street space, public interface, style features, public participation and action plans.
Keywords: livable, master urban design, public participation, zibo city
Procedia PDF Downloads 317
6036 Laparoscopic Management of Cysts Mimicking Hepatic Cystic Echinococcosis in Children (A Case Series)
Authors: Assia Haif, Djelloul Achouri, Zineddine Soualili
Abstract:
Introduction: Laparoscopic treatment of hepatic echinococcosis cysts has become popular. In parallel, the diagnostic approach to cystic liver lesions is based on the number of lesions and their distribution. The etiologies of cystic masses in children are diverse, and the role of imaging in their characterization and pre-therapeutic evaluation is essential. The main differential diagnoses of hepatic hydatid cysts can be discovered intraoperatively by minimally invasive surgery. Methods: The clinical data covered seven patients with hepatic cysts who underwent laparoscopic surgery in the Department of Pediatric Surgery, Setif, Algeria, from 2015 to 2022. Results: Of the seven reported patients, five are male and the remaining two are female. Abdominal pain was the most frequent clinical sign. Biological parameters were within normal limits. Abdominal ultrasound, performed in all cases and completed by abdominal computed tomography (CT), showed a hydatid cyst. For all patients, surgical procedures were performed under laparoscopy: total cystectomy in four patients and fenestration or subtotal cystectomy in three patients. A histopathological examination confirmed the nature of the cysts. During the follow-up period, there was no recurrence. Conclusions: Laparoscopic liver surgery is a safe and effective approach; it is an alternative to conventional surgery and a reproducible method. The laparoscopic surgery approach should follow the same principles as open surgery. This surgical technique can rectify the diagnosis of hydatid cyst, and the histopathological examination confirms the nature of the cystic lesion.
Keywords: children, cyst, echinococcosis, laparoscopic, liver
Procedia PDF Downloads 138
6035 Assessment of Environmental Quality of an Urban Setting
Authors: Namrata Khatri
Abstract:
The rapid growth of cities is transforming the urban environment and posing significant challenges for environmental quality. This study examines the urban environment of Belagavi in Karnataka, India, using geostatistical methods to assess the spatial pattern and land use distribution of the city and to evaluate the quality of the urban environment. The study is driven by the necessity of assessing the environmental impact of urbanisation. Satellite data was utilised to derive information on land use and land cover. The investigation revealed that land use had changed significantly over time, with a drop in plant cover and an increase in built-up areas. High-resolution satellite data was also utilised to map the city's open areas and gardens. GIS-based analysis was used to assess public green space accessibility and to identify regions with inadequate waste management practices. The findings revealed that garbage collection and disposal techniques in specific areas of the city needed to be improved. Moreover, the study evaluated the city's thermal environment using Landsat 8 land surface temperature (LST) data. The investigation found that built-up regions had higher LST values than green areas, pointing to the city's urban heat island (UHI) effect. The study's conclusions have far-reaching ramifications for urban planners and politicians in Belgaum and other similar cities. The findings may be utilised to create sustainable urban planning strategies that address the environmental effect of urbanisation while also improving the quality of life for city dwellers. Satellite data and high-resolution satellite images were gathered for the study, and remote sensing and GIS tools were utilised to process and analyse the data. Ground truthing surveys were also carried out to confirm the accuracy of the remote sensing and GIS-based data. Overall, this study provides a complete assessment of the environmental quality of Belgaum and emphasises the potential of remote sensing and geographic information systems (GIS) approaches in environmental assessment and management.
Keywords: environmental quality, UEQ, remote sensing, GIS
Procedia PDF Downloads 80
6034 Repair Workshop Queue System Modification Using Priority Scheme
Authors: C. Okonkwo Ugochukwu, E. Sinebe Jude, N. Odoh Blessing, E. Okafor Christian
Abstract:
In this paper, a modification of a repair workshop queuing system using a multi-priority scheme was carried out. The chi-square goodness-of-fit test was used to determine the random distribution of the inter-arrival time and service time of crankshafts that come for maintenance in the workshop. The chi-square values obtained for all the prioritized classes show that the distribution conforms to the Poisson distribution. The mean waiting times in queue under non-preemptive priority for the 1st, 2nd and 3rd classes are 0.066, 0.09 and 0.224 day respectively, while under preemptive priority they are 0.007, 0.036 and 0.258 day. However, when no priority is used, which obviously has no class distinction, the mean waiting time amounts to 0.17 days. From the results, one can observe that the preemptive priority system provides a very dramatic improvement over the non-preemptive priority system as concerns arrivals of higher priority. However, the improvement has a detrimental effect on the low-priority class. The trend of the results is similar for the mean waiting time in the system, obtained by adding the actual service time. Even though the mean waiting times in the queue and in the system with no priority are the lowest when compared with the least priority class, urgent and semi-urgent jobs will suffer terribly, which will most likely result in reneging or balking of many urgent jobs. Hence, the adoption of a priority scheme in this type of scenario will result in huge profit to the company and more customer satisfaction.
Keywords: queue, priority class, preemptive, non-preemptive, mean waiting time
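For context, the textbook non-preemptive priority result underlying such comparisons (the workshop's actual arrival and service parameters are not restated here): for an M/G/1 queue with classes \(k = 1,\dots,N\) (class 1 highest priority), arrival rates \(\lambda_i\), service-time moments \(E[S_i]\), \(E[S_i^2]\), \(\rho_i = \lambda_i E[S_i]\) and \(\sigma_k = \sum_{i \le k} \rho_i\) with \(\sigma_0 = 0\),

\[
W_{q}^{(k)} = \frac{\tfrac{1}{2}\sum_{i=1}^{N}\lambda_{i}E[S_{i}^{2}]}{\left(1-\sigma_{k-1}\right)\left(1-\sigma_{k}\right)},
\qquad
W^{(k)} = W_{q}^{(k)} + E[S_{k}],
\]

with \(E[S_i^2] = 2/\mu_i^2\) for exponential service times, consistent with the Poisson fits reported above.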
Procedia PDF Downloads 396
6033 Features of Normative and Pathological Realizations of Sibilant Sounds for Computer-Aided Pronunciation Evaluation in Children
Authors: Zuzanna Miodonska, Michal Krecichwost, Pawel Badura
Abstract:
Sigmatism (lisping) is a speech disorder in which sibilant consonants are mispronounced. The diagnosis of this phenomenon is usually based on auditory assessment. However, progress in speech analysis techniques creates the possibility of developing computer-aided sigmatism diagnosis tools. The aim of the study is to statistically verify whether specific acoustic features of sibilant sounds may be related to pronunciation correctness. Such knowledge can be of great importance when implementing classifiers and designing novel tools for automatic evaluation of sibilant pronunciation. The study covers analysis of various speech signal measures, including features proposed in the literature for the description of normative sibilant realization. Amplitudes and frequencies of three fricative formants (FF) are extracted based on local spectral maxima of the friction noise. Skewness, kurtosis, four normalized spectral moments (SM) and 13 mel-frequency cepstral coefficients (MFCC) with their 1st and 2nd derivatives (13 Delta and 13 Delta-Delta MFCC) are included in the analysis as well. The resulting feature vector contains 51 measures. The experiments are performed on a speech corpus containing words with selected sibilant sounds (/ʃ, ʒ/) pronounced by 60 preschool children with proper pronunciation or with natural pathologies. In total, 224 /ʃ/ segments and 191 /ʒ/ segments are employed in the study. The Mann-Whitney U test is employed for the comparison of sigmatism and normative pronunciation. Statistically significant differences are obtained for most of the proposed features between the children divided into these two groups at p < 0.05. All spectral moments and fricative formants appear to be distinctive between pathology and proper pronunciation. These metrics describe the friction noise characteristic of sibilants, which makes them particularly promising for use in sibilant evaluation tools. Correspondences found between phoneme feature values and an expert evaluation of pronunciation correctness encourage involving speech analysis tools in the diagnosis and therapy of sigmatism. The proposed feature extraction methods could be used in computer-assisted sigmatism diagnosis or therapy systems.
Keywords: computer-aided pronunciation evaluation, sigmatism diagnosis, speech signal analysis, statistical verification
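A minimal sketch of the feature extraction and group comparison (not the authors' exact pipeline; librosa/scipy, the segment length, and the aggregation over frames are assumptions):

```python
# Extract spectral moments, skewness/kurtosis and MFCC-based features for one
# sibilant segment, then compare two groups with the Mann-Whitney U test.
import numpy as np
import librosa
from scipy import stats

def sibilant_features(segment, sr):
    spec = np.abs(np.fft.rfft(segment))
    freqs = np.fft.rfftfreq(len(segment), 1.0 / sr)
    p = spec / spec.sum()                              # spectrum treated as a distribution
    m1 = (freqs * p).sum()                             # four normalized spectral moments
    m2 = ((freqs - m1) ** 2 * p).sum()
    m3 = ((freqs - m1) ** 3 * p).sum() / m2 ** 1.5
    m4 = ((freqs - m1) ** 4 * p).sum() / m2 ** 2
    mfcc = librosa.feature.mfcc(y=segment, sr=sr, n_mfcc=13)   # needs >= 9 MFCC frames for deltas
    d1 = librosa.feature.delta(mfcc)
    d2 = librosa.feature.delta(mfcc, order=2)
    # 6 + 13 + 13 + 13 = 45 measures; the six fricative-formant amplitudes and
    # frequencies (local spectral maxima) would be appended to reach 51.
    return np.concatenate([[m1, m2, m3, m4, stats.skew(spec), stats.kurtosis(spec)],
                           mfcc.mean(axis=1), d1.mean(axis=1), d2.mean(axis=1)])

rng = np.random.default_rng(0)
segment = rng.normal(size=8000).astype(np.float32)     # placeholder 0.5 s segment at 16 kHz
fv = sibilant_features(segment, sr=16000)
print(fv.shape)

# Group comparison for one feature across normative vs. pathological segments:
# stats.mannwhitneyu(values_normative, values_pathological, alternative="two-sided")
```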
Procedia PDF Downloads 301
6032 Slip Limit Prediction of High-Strength Bolt Joints Based on Local Approach
Authors: Chang He, Hiroshi Tamura, Hiroshi Katsuchi, Jiaqi Wang
Abstract:
In this study, the aim is to infer the slip limit (static friction limit) of contact interfaces in bolted friction joints by analyzing other bolted friction joints with the same contact surface but of a different shape. By using the Weibull distribution to treat microelements on the contact surface statistically, the slip limit of a certain type of bolted joint was predicted from other types of bolted joints with the same contact surface. As a result, this research succeeded in predicting the slip limit of bolted joints with different numbers of contact surfaces and with different numbers of bolt rows.
Keywords: bolt joints, slip coefficient, finite element method, Weibull distribution
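The abstract does not give the exact statistical formulation; one common way to apply the Weibull distribution to surface microelements is a weakest-link form, shown here purely as an illustrative assumption rather than the authors' model:

\[
P_{slip}(\tau) = 1 - \exp\!\left[-\frac{A}{A_{0}}\left(\frac{\tau}{\tau_{0}}\right)^{m}\right],
\]

where \(A\) is the total contact area, \(A_{0}\) a reference area, and \(m\), \(\tau_{0}\) are Weibull parameters of the shared surface. Under this assumption the characteristic slip limit scales as \(\tau_{c} \propto (A_{0}/A)^{1/m}\), which is what allows a joint with a different number of contact surfaces or bolt rows to be predicted from another joint with the same surface.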
Procedia PDF Downloads 170
6031 Real-Time Monitoring of Drinking Water Quality Using Advanced Devices
Authors: Amani Abdallah, Isam Shahrour
Abstract:
The quality of drinking water is a major concern of public health. The control of this quality is generally performed in the laboratory, which requires a long time. This type of control is not adapted to accidental pollution from sudden events, which can have serious consequences on population health. Therefore, it is of major interest to develop real-time innovative solutions for the detection of accidental contamination in drinking water systems. This paper presents research conducted within the SunRise Demonstrator for 'Smart and Sustainable Cities' with a particular focus on the supervision of water quality. This work aims at (i) implementing a smart water system in a large water network (Campus of the University Lille1) including innovative equipment for real-time detection of abnormal events, such as those related to the contamination of drinking water, and (ii) developing a numerical model of contamination diffusion in the water distribution system. The first step included verification of the water quality sensors and their effectiveness on a network prototype of 50 m length. This part included the evaluation of the efficiency of these sensors in detecting both bacterial and chemical contamination events in drinking water distribution systems. An on-line optical sensor integrated with a laboratory-scale distribution system (LDS) was shown to respond rapidly to changes in refractive index induced by injected loads of chemical (cadmium, mercury) and biological contamination (Escherichia coli). All injected substances were detected by the sensor; the magnitude of the response depends on the type of contaminant introduced and is proportional to the injected substance concentration.
Keywords: distribution system, drinking water, refraction index, sensor, real-time
Procedia PDF Downloads 355
6030 Landslide Susceptibility Mapping Using Soft Computing in Amhara Saint
Authors: Semachew M. Kassa, Africa M Geremew, Tezera F. Azmatch, Nandyala Darga Kumar
Abstract:
Frequency ratio (FR) and analytical hierarchy process (AHP) methods are developed based on past landslide failure points to produce landslide susceptibility maps, because landslides can seriously harm both the environment and society. However, it is still difficult to select the most efficient method and correctly identify the main driving factors for particular regions. In this study, we used fourteen landslide conditioning factors (LCFs) and five soft computing algorithms, including Random Forest (RF), Support Vector Machine (SVM), Logistic Regression (LR), Artificial Neural Network (ANN), and Naïve Bayes (NB), to predict landslide susceptibility at a 12.5 m spatial scale. The performance of the RF (F1-score: 0.88, AUC: 0.94), ANN (F1-score: 0.85, AUC: 0.92), and SVM (F1-score: 0.82, AUC: 0.86) methods was significantly better than that of the LR (F1-score: 0.75, AUC: 0.76) and NB (F1-score: 0.73, AUC: 0.75) methods, according to the classification results based on inventory landslide points. The findings also showed that around 35% of the study region was made up of places with high and very high landslide risk (susceptibility greater than 0.5). The very high-risk locations were primarily found in the western and southeastern regions, and all five models showed good agreement and similar geographic distribution patterns in landslide susceptibility. The areas with the highest landslide risk include the western and northern parts of Amhara Saint Town and the St. Gebreal Church villages, with mean susceptibility values greater than 0.5. Rainfall, distance to road, and slope were typically among the top leading factors for most villages, although the primary contributing factors to landslide vulnerability varied slightly across the five models. Decision-makers and policy planners can use the information from our study to make informed decisions and establish policies. It also suggests that different places should take different safeguards to reduce or prevent serious damage from landslide events.
Keywords: artificial neural network, logistic regression, landslide susceptibility, naïve Bayes, random forest, support vector machine
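A minimal sketch of training the five named classifiers and comparing them with F1-score and AUC (placeholder data, default hyperparameters; the study's conditioning factors and tuning are not reproduced):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score, roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 14))            # 14 landslide conditioning factors (placeholder)
y = rng.integers(0, 2, size=1000)          # 1 = landslide point, 0 = non-landslide point

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
models = {
    "RF": RandomForestClassifier(random_state=0),
    "SVM": SVC(probability=True, random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "ANN": MLPClassifier(max_iter=1000, random_state=0),
    "NB": GaussianNB(),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]   # susceptibility score in [0, 1]
    print(name, f1_score(y_te, model.predict(X_te)), roc_auc_score(y_te, proba))
```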
Procedia PDF Downloads 82
6029 Reliability Based Topology Optimization: An Efficient Method for Material Uncertainty
Authors: Mehdi Jalalpour, Mazdak Tootkaboni
Abstract:
We present a computationally efficient method for reliability-based topology optimization under material property uncertainty, which is assumed to be lognormally distributed and correlated within the domain. Computational efficiency is achieved through estimating the response statistics with second-order stochastic perturbation, using these statistics to fit an appropriate distribution that follows the empirical distribution of the response, and employing an efficient gradient-based optimizer. The proposed algorithm is utilized for the design of new structures, and the changes in the optimized topology are discussed for various levels of target reliability and correlation strength. Predictions were verified through comparison with results obtained using Monte Carlo simulation.
Keywords: material uncertainty, stochastic perturbation, structural reliability, topology optimization
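A minimal sketch of the last two steps named above, under the assumption that a lognormal distribution is fitted to the perturbation-based response statistics (the perturbation analysis itself is problem-specific and omitted; all numbers are placeholders):

```python
# Given the estimated mean and standard deviation of a response, fit a lognormal
# distribution with matching moments and evaluate the probability of exceeding a limit.
import numpy as np
from scipy import stats

mu_R, sigma_R = 1.0, 0.15        # response statistics from second-order perturbation (placeholder)
limit = 1.4                      # allowable response value (placeholder)

# Lognormal parameters matching the estimated mean and variance
zeta = np.sqrt(np.log(1.0 + (sigma_R / mu_R) ** 2))
lam = np.log(mu_R) - 0.5 * zeta ** 2
p_fail = 1.0 - stats.lognorm.cdf(limit, s=zeta, scale=np.exp(lam))
print("probability of failure:", p_fail)
```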
Procedia PDF Downloads 605
6028 Optimal Capacitors Placement and Sizing Improvement Based on Voltage Reduction for Energy Efficiency
Authors: Zilaila Zakaria, Muhd Azri Abdul Razak, Muhammad Murtadha Othman, Mohd Ainor Yahya, Ismail Musirin, Mat Nasir Kari, Mohd Fazli Osman, Mohd Zaini Hassan, Baihaki Azraee
Abstract:
Energy efficiency can be realized by minimizing the power loss for a given amount of energy used in an electrical distribution system. In this report, a detailed analysis of the energy efficiency of an electrical distribution system was carried out with an implementation of optimal capacitor placement and sizing (OCPS). Particle swarm optimization (PSO) is used to determine the optimal location and sizing of the capacitors, whereas minimization of energy consumption and power losses improves the energy efficiency. In addition, a certain number of busbars or locations are identified in advance, before the PSO is performed, to solve the OCPS problem. In this case study, three techniques are performed for the pre-selection of busbars or locations, one of which is the power-loss-index (PLI). The particle swarm optimization (PSO) is designed to provide a new population with improved sizing and location of capacitors. The total cost of power losses, energy consumption and capacitor installation are the components considered in the objective and fitness functions of the proposed optimization technique. The voltage magnitude limit, total harmonic distortion (THD) limit, power factor limit and capacitor size limit are the parameters considered as constraints for the proposed optimization technique. In this research, the proposed methodologies implemented in the MATLAB® software transfer the information, execute the three-phase unbalanced load flow solution, and then retrieve and collect the results or data from the three-phase unbalanced electrical distribution systems modeled in the SIMULINK® software. The effectiveness of the proposed methods in improving the energy efficiency has been verified through several case studies, and the results are obtained from the test systems of the IEEE 13-bus unbalanced electrical distribution system and also the practical electrical distribution system model of the Sultan Salahuddin Abdul Aziz Shah (SSAAS) government building in Shah Alam, Selangor.
Keywords: particle swarm optimization, pre-determine of capacitor locations, optimal capacitors placement and sizing, unbalanced electrical distribution system
Procedia PDF Downloads 434
6027 Distribution and Historical Trends of PAHs Deposition in Recent Sediment Cores of the Imo River, SE Nigeria
Authors: Miranda I. Dosunmu, Orok E. Oyo-Ita, Inyang O. Oyo-Ita
Abstract:
Polycyclic aromatic hydrocarbons (PAHs) are a class of priority-listed organic pollutants due to their carcinogenicity, mutagenicity, acute toxicity and persistence in the environment. The distribution and historical changes of PAH contamination in recent sediment cores from the Imo River were investigated using gas chromatography coupled with mass spectrometry. The concentrations of total PAHs (TPAHs), ranging from 402.37 ng/g dry weight (dw) at the surface layer of the Estuary zone (ESC6; 0-5 cm) to 92,388.59 ng/g dw at the near-surface layer of the Afam zone (ASC5; 5-10 cm), indicate that PAH contamination was localized not only between sample sites but also within the same cores. Sediment-depth profiles for the four cores (Afam, Mangrove, Estuary and illegal petroleum refinery) revealed irregular distribution patterns in the TPAH concentrations, except that these levels became maximal at the near-surface layers (5-10 cm), corresponding to a geological time frame of about 1996-2004. This time scale coincided with the period of intensive bunkering and oil pipeline vandalization by the Niger Delta militant groups. A general slight decline was also found in the TPAH levels from the near-surface layers (5-10 cm) to the most recent top layers (0-5 cm) of the cores, attributable to the recent effort by the Nigerian government in clamping down on the illegal activity of the economic saboteurs. Therefore, the recent amnesty period granted to the militant groups should be extended. Although the mechanism of perylene formation still remains enigmatic, examination of its distribution down the cores indicates natural biogenic, pyrogenic and petrogenic origins for the compound at different zones. Thus, the characteristic features of the Imo River environment provide a means of tracing diverse origins for perylene.
Keywords: perylene, historical trend, distribution, origin, Imo River
Procedia PDF Downloads 251
6026 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia
Authors: Yuyun Wabula, B. J. Dewancker
Abstract:
In the past decade, social networking apps have been growing very rapidly. Geolocation data is one of the important features of social media that can attach a user's real-world location coordinates. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior. This paper aims to explore the relation between Twitter geolocation data and the presence of people in urban areas. Firstly, the study analyzes the spread of people within particular areas of the city using Twitter social media data. Secondly, we match and categorize the existing places based on the same individuals visiting them. Then, we combine the Twitter data from the tracking result and the questionnaire data to capture the Twitter user profile. To do that, we used frequency distribution analysis to determine the percentage of visitors. To validate the hypothesis, we compare it with the local population statistics and the land use mapping released by the city planning department of the Makassar local government. The results show that there is a correlation between the Twitter geolocation and questionnaire data. Thus, integrating the Twitter data and survey data can reveal the profile of the social media users.
Keywords: geolocation, Twitter, distribution analysis, human mobility
Procedia PDF Downloads 314
6025 Assessing Significance of Correlation with Binomial Distribution
Authors: Vijay Kumar Singh, Pooja Kushwaha, Prabhat Ranjan, Krishna Kumar Ojha, Jitendra Kumar
Abstract:
Present-day high-throughput genomic technologies, NGS/microarrays, are producing large volumes of data that require improved analysis methods to make sense of the data. The correlation between genes and samples has been regularly used to gain insight into many biological phenomena including, but not limited to, co-expression/co-regulation, gene regulatory networks, clustering and pattern identification. However, the presence of outliers and violation of the assumptions underlying Pearson correlation are frequent and may distort the actual correlation between the genes and lead to spurious conclusions. Here, we report a method to measure the strength of association between genes. The method assumes that the expression values of a gene are Bernoulli random variables whose outcome depends on the sample being probed. The method considers two genes as uncorrelated if the number of samples with the same outcome for both genes (Ns) is equal to the expected number (Es). The extent of correlation depends on how far Ns deviates from Es. The method does not assume normality for the parent population, is fairly unaffected by the presence of outliers, can be applied to qualitative data, and uses the binomial distribution to assess the significance of association. At this stage, we would not claim superiority of the method over other existing correlation methods, but our method could be another way of calculating correlation in addition to existing methods. The method uses the binomial distribution, which has not been used for this purpose before, to assess the significance of association between two variables. We are evaluating the performance of our method on NGS/microarray data, which is noisy and pierced by outliers, to see if our method can differentiate between spurious and actual correlation. While working with the method, it has not escaped our notice that the method could also be generalized to measure the association of more than two variables, which has proven difficult with existing methods.
Keywords: binomial distribution, correlation, microarray, outliers, transcriptome
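A minimal sketch of the described test (the median-based binarization rule and the two-sided alternative are assumptions made for illustration; requires scipy ≥ 1.7 for binomtest):

```python
# Binarize two expression profiles, count the samples where both genes show the
# same outcome (Ns), compare with the number expected under independence (Es),
# and assess significance with the binomial distribution.
import numpy as np
from scipy import stats

def binomial_association(x, y):
    a = x > np.median(x)                   # Bernoulli outcome per sample (assumed rule)
    b = y > np.median(y)
    n = len(a)
    ns = int((a == b).sum())               # samples with the same outcome for both genes
    p_a, p_b = a.mean(), b.mean()
    p_same = p_a * p_b + (1 - p_a) * (1 - p_b)   # P(same outcome) if the genes are independent
    es = n * p_same                        # expected number under no association
    pval = stats.binomtest(ns, n, p_same, alternative="two-sided").pvalue
    return ns, es, pval

rng = np.random.default_rng(0)
g1 = rng.normal(size=50)
g2 = g1 + rng.normal(scale=0.5, size=50)   # a gene correlated with g1
print(binomial_association(g1, g2))
```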
Procedia PDF Downloads 415
6024 From Shallow Semantic Representation to Deeper One: Verb Decomposition Approach
Authors: Aliaksandr Huminski
Abstract:
Semantic Role Labeling (SRL), as a shallow semantic parsing approach, includes recognizing and labeling the arguments of a verb in a sentence. Verb participants are linked with specific semantic roles (Agent, Patient, Instrument, Location, etc.). Thus, SRL can answer key questions such as 'Who', 'When', 'What', 'Where' in a text, and it is widely applied in dialog systems, question answering, named entity recognition, information retrieval, and other fields of NLP. However, SRL has the following flaw: two sentences with identical (or almost identical) meaning can have different semantic role structures. Let us consider two sentences: (1) John put butter on the bread. (2) John buttered the bread. SRL for (1) and (2) will be significantly different. For the verb put in (1) it is [Agent + Patient + Goal], but for the verb butter in (2) it is [Agent + Goal]. This happens because of one of the most interesting and intriguing features of a verb: its ability to capture participants, as in the case of the verb butter, or their features, as, say, in the case of the verb drink, where the participant's feature of being liquid is shared with the verb. This capture looks like a total fusion of meaning and cannot be decomposed in a direct way (in comparison with compound verbs like babysit or breastfeed). From this perspective, SRL looks too shallow to represent semantic structure. If the key point in semantic representation is the opportunity to use it for making inferences and finding hidden reasons, it assumes by default that two different but semantically identical sentences must have the same semantic structure. Otherwise we will have different inferences from the same meaning. To overcome the above-mentioned flaw, the following approach is suggested. Assume that: P is a participant of a relation; F is a feature of a participant; Vcp is a verb that captures a participant; Vcf is a verb that captures a feature of a participant; Vpr is a primitive verb, i.e., a verb that does not capture any participant and represents only a relation. In other words, a primitive verb is a verb whose meaning does not include meanings from its surroundings. Then Vcp and Vcf can be decomposed as: Vcp = Vpr + P; Vcf = Vpr + F. If all Vcp and Vcf are represented this way, then primitive verbs Vpr can be considered a canonical form for SRL. As a result, there will be no hidden participants caught by a verb, since all participants will be explicitly unfolded. An obvious example of Vpr is the verb go, which represents pure movement. In this case the verb drink can be represented as man-made movement of liquid in a specific direction. Extracting and using primitive verbs for SRL creates a canonical representation that is unique for semantically identical sentences. It leads to the unification of semantic representation. In this case, the critical flaw related to SRL will be resolved.
Keywords: decomposition, labeling, primitive verbs, semantic roles
Procedia PDF Downloads 367
6023 Undercooling of Refractory High-Entropy Alloy
Authors: Liang Hu
Abstract:
The concept of refractory high-entropy alloys (RHEAs) formed from the refractory metals W, Ta, Mo, Nb, Hf, V, and Zr was first implemented in 2010 to obtain better strength at high temperature than conventional HEAs based on Al, Co, Cr, Cu, Fe and Ni. Due to their refractory character and high chemical activity at elevated temperature, the electrostatic levitation (ESL) technique has been utilized to achieve the rapid solidification of RHEAs. Several RHEAs consisting of W, Ta, Mo, Nb and Zr have been selected to perform undercooling and rapid solidification by ESL. They are substantially undercooled by up to 0.2TL. The evolution of the as-solidified microstructure and component redistribution with undercooling has been investigated by SEM, EBSD, and EPMA analysis. Based on the EPMA results for the constituent elements at different undercooling levels, the chemical distribution relevant to undercooling was also analyzed.
Keywords: chemical distribution, high-entropy alloy, rapid solidification, undercooling
Procedia PDF Downloads 128
6022 Reliability Indices Evaluation of SEIG Rotor Core Magnetization with Minimum Capacitive Excitation for WECs
Authors: Lokesh Varshney, R. K. Saket
Abstract:
This paper presents the evaluation of reliability indices for the rotor core magnetization of an induction motor operated as a self-excited induction generator (SEIG), using a probability distribution approach and Monte Carlo simulation. Parallel capacitors with a calculated minimum capacitance value were connected across the terminals of the induction motor operating as a SEIG with unregulated shaft speed during the experimental study. A three-phase, 4-pole, 50 Hz, 5.5 hp, 12.3 A, 230 V induction motor coupled with a DC shunt motor was tested in the electrical machine laboratory with variable reactive loads. Based on this experimental study, it is possible to choose a reliable induction machine operating as a SEIG for unregulated renewable energy applications in remote areas or where the grid is not available. The failure density function, cumulative failure distribution function, survivor function, hazard model, probability of success and probability of failure for the reliability evaluation of the three-phase induction motor operating as a SEIG are presented graphically in this paper.
Keywords: residual magnetism, magnetization curve, induction motor, self excited induction generator, probability distribution, Monte Carlo simulation
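For reference, the standard definitions behind the quantities listed above (not quoted from the paper): with failure density \(f(t)\),

\[
F(t)=\int_{0}^{t} f(u)\,du, \qquad R(t)=1-F(t), \qquad h(t)=\frac{f(t)}{R(t)},
\]

where \(F(t)\) is the cumulative failure distribution (probability of failure by time \(t\)), \(R(t)\) the survivor function (probability of success), and \(h(t)\) the hazard rate used in the hazard model.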
Procedia PDF Downloads 558
6021 Optical and Double Folding Analysis for 6Li+16O Elastic Scattering
Authors: Abd Elrahman Elgamala, N. Darwish, I. Bondouk, Sh. Hamada
Abstract:
Available experimental angular distributions for 6Li elastically scattered from the 16O nucleus in the energy range 13.0-50.0 MeV are investigated and reanalyzed using the optical model with a conventional phenomenological potential and also using the double folding optical model with different interaction models: DDM3Y1, CDM3Y1, CDM3Y2, and CDM3Y3. All the interaction models involved are based on the M3Y Paris interaction except DDM3Y1, which is based on M3Y Reid, and the main difference between them lies in the different values of the parameters of the incorporated density-dependence function F(ρ). We have extracted the renormalization factor NR for the 6Li+16O nuclear system in the energy range 13.0-50.0 MeV using the aforementioned interaction models.
Keywords: elastic scattering, optical model, folding potential, density distribution
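For reference, the standard double-folding form on which such analyses are based (a sketch, not quoted from the paper) is

\[
V_{DF}(R) = N_{R}\int\!\!\int \rho_{P}(\mathbf{r}_{1})\,\rho_{T}(\mathbf{r}_{2})\,F(\rho)\,v_{NN}(E,s)\,d^{3}r_{1}\,d^{3}r_{2},
\qquad \mathbf{s}=\mathbf{R}+\mathbf{r}_{2}-\mathbf{r}_{1},
\]

with the CDM3Y-type density dependence usually written as \(F(\rho) = C\left[1 + \alpha\,e^{-\beta\rho} - \gamma\rho\right]\) (the DDM3Y1 form omits the linear \(-\gamma\rho\) term); the interaction models differ in the values of \(C\), \(\alpha\), \(\beta\) and \(\gamma\), and \(N_{R}\) is the renormalization factor extracted from the fits.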
Procedia PDF Downloads 141
6020 A Succinct Method for Allocation of Reactive Power Loss in Deregulated Scenario
Authors: J. S. Savier
Abstract:
Real power is the component of power which is converted into useful energy, whereas reactive power is the component which cannot be converted to useful energy but is required for the magnetization of various electrical machinery. If the reactive power is compensated at the consumer end, the need for reactive power flow from generators to the load can be avoided and hence the overall power loss can be reduced. In this scenario, this paper presents a succinct method, called the JSS method, for allocating reactive power losses to consumers connected to radial distribution networks in a deregulated environment. The proposed method has the advantage that no assumptions are made while deriving the reactive power loss allocation method.
Keywords: deregulation, reactive power loss allocation, radial distribution systems, succinct method
Procedia PDF Downloads 376
6019 A Prediction Method of Pollutants Distribution Pattern: Flare Motion Using Computational Fluid Dynamics (CFD) Fluent Model with Weather Research Forecast Input Model during Transition Season
Authors: Benedictus Asriparusa, Lathifah Al Hakimi, Aulia Husada
Abstract:
A large amount of energy is being wasted by the release of natural gas associated with the oil industry. This release harms the environment, particularly the condition of the atmospheric layers globally, and contributes to global warming. This research presents an overview of the methods employed by researchers at PT. Chevron Pacific Indonesia in the Minas area to determine a new prediction method for measuring and reducing gas flaring and its emissions. The method emphasizes advanced research involving analytical studies, numerical studies, modeling, and computer simulations, amongst other techniques. A flaring system is the controlled burning of natural gas in the course of routine oil and gas production operations. This burning occurs at the end of a flare stack or boom. The combustion process releases emissions of greenhouse gases such as NO2, CO2, SO2, etc. This condition affects the chemical composition of the air and the environment around the boundary layer, mainly during the transition season. The transition season in Indonesia is very difficult to predict because of the interaction of two different air masses. This research focused on the 2013 transition season. A simulation to create a new pattern of the pollutant distribution is needed. This paper outlines trends in gas flaring modeling and current developments to predict the dominant variables in the pollutant distribution. A Fluent model is used to simulate the distribution of pollutant gases coming out of the stack, whereas WRF model output is used to overcome the limitations of the analysis of meteorological data and atmospheric conditions in the study area. Based on the model runs, the most influential factor was wind speed. The goal of the simulation is to predict the new pattern based on the times at which the fastest and slowest winds occur for the pollutant distribution. According to the simulation results, the fastest wind (end of March) moves pollutants in a horizontal direction and the slowest wind (middle of May) moves pollutants vertically. Besides, a flare stack designed in compliance with the EPA Oil and Gas Facility Stack Parameters keeps pollutant concentrations under the NAAQS (National Ambient Air Quality Standards) threshold.
Keywords: flare motion, new prediction, pollutants distribution, transition season, WRF model
Procedia PDF Downloads 556
6018 Simulation Study on Particle Fluidization and Drying in a Spray Fluidized Bed
Authors: Jinnan Guo, Daoyin Liu
Abstract:
The quality of the final products in a coating process significantly depends on particle fluidization and drying in the spray fluidized bed. In this study, the fluidizing gas temperature and velocity are varied, and their effects on particle flow, moisture content, and heat transfer in a spray fluidized bed are investigated with a CFD-Discrete Element Method (DEM) model. The gas velocity distribution of the fluidized bed is symmetrical, with high velocity in the middle and low velocity on both sides. During the heating process, the particles inside the central tube and at the bottom of the bed are rapidly heated, while particles circulating in the annular area are heated slowly and their temperature is low. The inconsistency of particle circulation results in two peaks in the probability density distribution of the particle temperature during the heating process, after which the overall temperature of the particles increases uniformly. During the drying process, the distribution of particle moisture transitions from the initial uniform moisture to two peaks, and then the number of completely dried particles (moisture content of 0) gradually increases. Increasing the fluidizing gas temperature and velocity improves particle circulation, drying and heat transfer in the bed. The current study provides an effective method for studying the hydrodynamics of spray fluidized beds with simultaneous heating and particle fluidization.
Keywords: heat transfer, CFD-DEM, spray fluidized bed, drying
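A typical particle-scale energy balance used in CFD-DEM simulations of spray fluidized beds (an illustrative form; the paper's actual sub-models are not stated in the abstract) is

\[
m_{p}c_{p}\frac{dT_{p}}{dt} = h\,A_{p}\,(T_{g}-T_{p}) - \dot{m}_{ev}\,\Delta H_{vap},
\qquad
\mathrm{Nu}=\frac{h\,d_{p}}{\lambda_{g}} = 2 + 0.6\,\mathrm{Re}_{p}^{1/2}\,\mathrm{Pr}^{1/3},
\]

with convective heating from the gas, evaporative cooling of the particle moisture, and a Ranz-Marshall-type correlation for the heat transfer coefficient \(h\).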
Procedia PDF Downloads 71
6017 An Algebraic Geometric Imaging Approach for Automatic Dairy Cow Body Condition Scoring System
Authors: Thi Thi Zin, Pyke Tin, Ikuo Kobayashi, Yoichiro Horii
Abstract:
Today, dairy farm experts and farmers have well recognized the importance of the dairy cow Body Condition Score (BCS), since these scores can be used to optimize milk production, manage the feeding system, serve as an indicator of health abnormalities, and even help manage healthy calving times and processes. Traditionally, BCS measurement is done by animal experts or trained technicians based on visual observations focusing on the pin bones, pin, thurl and hook area, tail head shape, hook angles, and short and long ribs. Since the traditional technique is manual and subjective, it can lead to inconsistent scores and is not cost-effective. Thus this paper proposes an algebraic geometric imaging approach for an automatic dairy cow BCS system. The proposed system consists of three functional modules. In the first module, significant landmarks or anatomical points in the cow image region are automatically extracted by using image processing techniques. To be specific, there are 23 anatomical points in the regions of the ribs, hook bones, pin bone, thurl and tail head. These points are extracted by using block-region-based vertical and horizontal histogram methods. According to animal experts, the body condition scores depend mainly on the shape structure of these regions. Therefore the second module investigates some algebraic and geometric properties of the extracted anatomical points. Specifically, second-order polynomial regression is applied to a subset of anatomical points to produce the regression coefficients, which are utilized as part of the feature vector in the scoring process. In addition, the angles at the thurl, pin, tail head and hook bone areas are computed to extend the feature vector. Finally, in the third module, the extracted feature vectors are trained by using a Markov classification process to assign a BCS to each individual cow. The assigned BCS are then revised by using a multiple regression method to produce the final BCS for the dairy cows. In order to confirm the validity of the proposed method, a monitoring video camera was set up at the milk rotary parlor to take top-view images of cows. The proposed method extracts the key anatomical points and the corresponding feature vectors for each individual cow. Then the multiple regression calculator and Markov chain classification process are utilized to produce the estimated body condition score for each cow. The experimental results, tested on 100 dairy cows from a self-collected dataset and a public benchmark dataset, are very promising, with an accuracy of 98%.
Keywords: algebraic geometric imaging approach, body condition score, Markov classification, polynomial regression
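A minimal sketch of the second module's geometric feature construction (placeholder coordinates and indices; the actual point subsets and angle definitions are the authors'):

```python
# Fit a second-order polynomial to a subset of the 23 anatomical points and
# build a feature vector from the regression coefficients plus an area angle.
import numpy as np

points = np.random.default_rng(0).normal(size=(23, 2))   # (x, y) anatomical points (placeholder)
subset = points[:10]                                      # e.g., points along the hook/pin region (assumed)

# Second-order polynomial regression y = a*x^2 + b*x + c over the subset
coeffs = np.polyfit(subset[:, 0], subset[:, 1], deg=2)    # [a, b, c]

def angle_at(p_prev, p_vertex, p_next):
    """Angle (degrees) formed at p_vertex by its two neighbouring points."""
    v1, v2 = p_prev - p_vertex, p_next - p_vertex
    cosang = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

thurl_angle = angle_at(points[3], points[4], points[5])   # indices are illustrative
feature_vector = np.concatenate([coeffs, [thurl_angle]])  # extended with the other angles in practice
print(feature_vector)
```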
Procedia PDF Downloads 159