Search results for: evolved bat algorithm

2229 Local Directional Encoded Derivative Binary Pattern Based Coral Image Classification Using Weighted Distance Gray Wolf Optimization Algorithm

Authors: Annalakshmi G., Sakthivel Murugan S.

Abstract:

This paper presents a local directional encoded derivative binary pattern (LDEDBP) feature extraction method for the classification of submarine coral reef images. Classifying coral reef images from texture features is difficult due to the dissimilarities between class samples. The proposed LDEDBP extracts the complete structural arrangement of the local region using the local binary pattern (LBP) and also extracts edge information using the local directional pattern (LDP) from the edge responses of that region, thereby achieving an extra discriminative feature value. The LDP extracts edge details in all eight directions, and integrating these edge responses with the local binary pattern yields a more robust texture descriptor than other texture feature extraction methods. Finally, the extracted features are fed to an extreme learning machine (ELM), in which the input weights and biases of the single-hidden-layer feed-forward network (SLFN) are optimized by a meta-heuristic known as the weighted distance grey wolf optimizer (WDGWO). In the empirical results, ELM-WDGWO demonstrated better performance in terms of accuracy on all coral datasets, namely RSMAS, EILAT, EILAT2, and MLC, compared with other state-of-the-art algorithms, achieving the highest overall classification accuracy of 94%.
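
The LDEDBP descriptor builds on the standard LBP operator; as a point of reference, a minimal 3x3 LBP computation is sketched below (numpy only; the directional derivative encoding and the WDGWO-ELM stages of the paper are not reproduced here):

```python
import numpy as np

def lbp_3x3(img):
    """Basic 8-neighbour local binary pattern for a grayscale image.

    Each pixel is compared with its 8 neighbours; neighbours whose
    intensity is >= the centre contribute one bit of an 8-bit code.
    LDEDBP extends this idea with directional edge responses (LDP).
    """
    img = np.asarray(img, dtype=float)
    c = img[1:-1, 1:-1]                      # centre pixels
    # neighbour offsets in clockwise order starting top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy: img.shape[0] - 1 + dy,
                 1 + dx: img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << bit
    return code

# texture feature = normalised histogram of the 256 possible codes
patch = np.random.rand(64, 64)
hist, _ = np.histogram(lbp_3x3(patch), bins=256, range=(0, 256), density=True)
```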

Keywords: feature extraction, local directional pattern, ELM classifier, GWO optimization

Procedia PDF Downloads 167
2228 The Utility and the Consequences of Counter Terrorism Financing

Authors: Fatemah Alzubairi

Abstract:

Terrorism financing is a theme that has evolved dramatically post-9/11. Supra-national bodies, above all the UN Security Council and the Financial Action Task Force (FATF), have established an executive-like mechanism that allows blacklisting individuals and groups, freezing their funds, and restricting their travel, all of which have become part of states’ anti-terrorism frameworks. A number of problems arise from building counter-terrorism measures on the foundation of a vague definition of terrorism. This paper examines the utility and consequences of counter-terrorism financing in light of the lack of an international definition of terrorism. The main problem with national and international anti-terrorism legislation is the lack of a clear, objective definition of terrorism; most, if not all, national laws are broad and vague. Determining what terrorism is remains the crucial underpinning of any successful discussion of counter-terrorism, and of the future success of counter-terrorist measures. This paper focuses on the legal and political consequences of equalizing the treatment of violent terrorist crimes, such as bombing, with non-violent terrorism-related crimes, such as funding terrorist groups. While both sorts of acts require criminalization, treating them equally risks wrongfully or unfairly condemning innocent people who have associated with “terrorists” but are not involved in terrorist activities. This paper examines whether global obligations to counter terrorism financing focus on controlling terrorist groups more than terrorist activities. It also examines the utility of the obligations adopted by the UN Security Council and FATF, and whether they serve global security, or whether the utility is largely restricted to Western security, with little attention paid to the unique needs and demands of other regions.

Keywords: counter-terrorism, definition of terrorism, FATF, security, terrorism financing, UN Security Council

Procedia PDF Downloads 328
2227 The Design and Implementation of an Enhanced 2D Mesh Switch

Authors: Manel Langar, Riad Bourguiba, Jaouhar Mouine

Abstract:

In this paper, we propose the design and implementation of an enhanced wormhole virtual-channel on-chip router. It is the heart of a mesh NoC using the XY deterministic routing algorithm and is characterized by a simple virtual channel allocation strategy that reduces the area and complexity of connections without affecting performance. We implemented our router on a Tezzaron process to validate its performance. This router is a basic element that will later be used to design a 3D mesh NoC.
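
XY deterministic routing, which the router implements, moves a packet fully along the X dimension before turning onto Y; a minimal software model of the per-hop decision might look like the following (port names and coordinate conventions are illustrative, not taken from the paper):

```python
def xy_route(cur, dst):
    """Return the output port for one hop of XY deterministic routing.

    cur, dst: (x, y) mesh coordinates of the current and destination
    routers. The packet travels along X first, then along Y, which
    guarantees deadlock freedom in a 2D mesh.
    """
    (cx, cy), (dx, dy) = cur, dst
    if dx > cx:
        return "EAST"
    if dx < cx:
        return "WEST"
    if dy > cy:
        return "NORTH"
    if dy < cy:
        return "SOUTH"
    return "LOCAL"   # arrived: eject to the attached core

# trace a packet from router (0, 0) to router (2, 1)
pos, path = (0, 0), []
while (port := xy_route(pos, (2, 1))) != "LOCAL":
    path.append(port)
    step = {"EAST": (1, 0), "WEST": (-1, 0),
            "NORTH": (0, 1), "SOUTH": (0, -1)}[port]
    pos = (pos[0] + step[0], pos[1] + step[1])
print(path)  # ['EAST', 'EAST', 'NORTH']
```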

Keywords: NoC, mesh, router, 3D NoC

Procedia PDF Downloads 570
2226 Investigating the Algorithm to Maintain a Constant Speed in the Wankel Engine

Authors: Adam Majczak, Michał Bialy, Zbigniew Czyż, Zdzislaw Kaminski

Abstract:

Increasingly stringent emission standards for passenger cars require us to find alternative drives. The share of electric vehicles in new car sales increases every year; however, their performance and, above all, range cannot yet be successfully compared to those of cars with a traditional internal combustion engine. Battery recharging takes hours, which compares poorly with the few minutes needed to refill a fuel tank. Therefore, ways to mitigate the drawbacks of purely electric cars are being sought. One method is the combination of an electric motor as the main source of power with a small internal combustion engine acting as an electricity generator. This type of drive enables an electric vehicle to achieve a radically increased range and low emissions of toxic substances. For several years, leading automotive manufacturers such as Mazda and Audi, together with the best companies in the automotive industry, e.g., AVL, have developed electric drive systems capable of recharging themselves while driving, known as range extenders. Here the electricity generator is powered by a Wankel engine, which had seemed to pass into history. This lightweight, small engine with a rotating piston and a very low vibration level has turned out to be an excellent power source in such applications. Its operation as an energy source for a generator almost entirely eliminates its disadvantages, such as high fuel consumption, high emission of toxic substances, or the short lifetime typical of its traditional application. Operating the engine at a constant rotational speed significantly increases its lifetime, and its small external dimensions allow compact modules to drive even small urban cars like the Audi A1 or the Mazda 2. The algorithm to maintain a constant speed was investigated on an engine dynamometer with an eddy current brake and the necessary measuring apparatus. The research object was the Aixro XR50 rotary engine with an electronic power supply developed at the Lublin University of Technology. The load torque of the engine was altered during the research by means of the eddy current brake, which is capable of applying arbitrary load cycles. The parameters recorded included speed and torque as well as the throttle position in the inlet system. Increasing and decreasing the load did not significantly change engine speed, which indicates that the control algorithm parameters are correctly selected. This work has been financed by the Polish Ministry of Science and Higher Education.
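
The abstract does not name the control law used to hold the speed constant; a PID loop acting on the throttle is a common choice for a generator engine, so the sketch below is written under that assumption (gains, time step, and limits are placeholders):

```python
class ThrottlePID:
    """Toy PID speed governor: throttle position from speed error."""

    def __init__(self, kp=0.002, ki=0.0005, kd=0.0001, dt=0.01):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, target_rpm, measured_rpm):
        err = target_rpm - measured_rpm
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        u = self.kp * err + self.ki * self.integral + self.kd * deriv
        return min(max(u, 0.0), 1.0)   # clamp throttle to [0, 1]

pid = ThrottlePID()
# a load step from the brake pulls speed down; the loop opens the throttle
throttle = pid.update(target_rpm=6000, measured_rpm=5820)
```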

Keywords: electric vehicle, power generator, range extender, Wankel engine

Procedia PDF Downloads 157
2225 Ensemble Machine Learning Approach for Estimating Missing Data from CO₂ Time Series

Authors: Atbin Mahabbati, Jason Beringer, Matthias Leopold

Abstract:

To address the global challenges of climate and environmental change, there is a need to quantify and reduce uncertainties in environmental data, including observations of carbon, water, and energy. Global eddy covariance flux tower networks (FLUXNET) and their regional counterparts (i.e., OzFlux, AmeriFlux, China Flux, etc.) were established in the late 1990s and early 2000s to address this demand. Despite the capability of eddy covariance in validating process modelling analyses, field surveys, and remote sensing assessments, there are serious concerns regarding the challenges associated with the technique, e.g., data gaps and uncertainties. To address these concerns, this research developed an ensemble model to fill the data gaps of CO₂ flux, avoiding the limitations of a single algorithm and therefore providing lower error and reduced uncertainty in the gap-filling process. In this study, data from five towers in the OzFlux Network (Alice Springs Mulga, Calperum, Gingin, Howard Springs and Tumbarumba) during 2013 were used to develop an ensemble machine learning model, using five feedforward neural networks (FFNN) with different structures combined with an eXtreme Gradient Boosting (XGB) algorithm. The former, the FFNNs, provided the primary estimations in the first layer, while the latter, XGB, used the outputs of the first layer as its input to provide the final estimations of CO₂ flux. The introduced model showed slight superiority over each single FFNN and over XGB used individually, with an overall RMSE of 2.64 g C m⁻² yr⁻¹ for the ensemble versus 2.91 for XGB and 3.54 for the best FFNN. The most significant improvement occurred in the estimation of extreme diurnal values (during midday and sunrise) as well as nocturnal estimations, which are generally considered among the most challenging parts of CO₂ flux gap-filling. The towers, as well as the seasons, showed different levels of sensitivity to the improvements provided by the ensemble model. For instance, Tumbarumba showed more sensitivity than Calperum, where the differences between the ensemble model on the one hand and the FFNNs and XGB on the other were the smallest of all five sites. Besides, the performance difference between the ensemble model and its individual components was more pronounced during the warm season (Jan, Feb, Mar, Oct, Nov, and Dec) than the cold season (Apr, May, Jun, Jul, Aug, and Sep), due to the higher photosynthetic activity of plants, which leads to a larger range of CO₂ exchange. In conclusion, the introduced ensemble model slightly improved the accuracy and robustness of CO₂ flux gap-filling. Therefore, ensemble machine learning models are potentially capable of improving data estimation and regression outcomes when there seems to be no more room for improvement with a single algorithm.
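
A minimal sketch of the two-layer stacking structure described here, with scikit-learn MLPs standing in for the five FFNNs and XGBoost as the second layer (network layouts and all hyperparameters are placeholders):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from xgboost import XGBRegressor

def fit_ensemble(X_train, y_train):
    """First layer: five FFNNs with different structures.
    Second layer: XGB trained on the stacked FFNN outputs."""
    layouts = [(32,), (64,), (32, 16), (64, 32), (128, 64)]
    ffnns = [MLPRegressor(hidden_layer_sizes=h, max_iter=500).fit(X_train, y_train)
             for h in layouts]
    first_layer = np.column_stack([m.predict(X_train) for m in ffnns])
    xgb = XGBRegressor(n_estimators=200).fit(first_layer, y_train)
    return ffnns, xgb

def predict_ensemble(ffnns, xgb, X):
    """Final CO2-flux estimate from the two-layer stack."""
    first_layer = np.column_stack([m.predict(X) for m in ffnns])
    return xgb.predict(first_layer)
```

In practice the second layer is usually trained on out-of-fold predictions of the first layer to avoid leakage; the abstract does not say which scheme was used.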

Keywords: carbon flux, eddy covariance, extreme gradient boosting, gap-filling comparison, hybrid model, OzFlux network

Procedia PDF Downloads 145
2224 Feature Evaluation Based on Random Subspace and Multiple-K Ensemble

Authors: Jaehong Yu, Seoung Bum Kim

Abstract:

Clustering analysis can facilitate the extraction of intrinsic patterns in a dataset and reveal its natural groupings without requiring class information. For effective clustering analysis of high-dimensional datasets, unsupervised dimensionality reduction is an important task. Unsupervised dimensionality reduction can generally be achieved by feature extraction or feature selection. In many situations, feature selection methods are more appropriate than feature extraction methods because of their clear interpretation with respect to the original features. Unsupervised feature selection can be categorized into feature subset selection and feature ranking methods; we focus on unsupervised feature ranking methods, which evaluate features based on their importance scores. Recently, several unsupervised feature ranking methods have been developed based on ensemble approaches to achieve higher accuracy and stability. However, most ensemble-based feature ranking methods require the true number of clusters. Furthermore, these algorithms evaluate feature importance based on the ensemble clustering solution, and they produce undesirable evaluation results if the clustering solutions are inaccurate. To address these limitations, we propose an ensemble-based feature ranking method with random subspace and multiple-k ensemble (FRRM). The proposed FRRM algorithm evaluates the importance of each feature with the random subspace ensemble, and all evaluation results are combined into ensemble importance scores. Moreover, through the multiple-k ensemble idea, FRRM does not require the true number of clusters to be determined in advance. Experiments on various benchmark datasets were conducted to examine the properties of the proposed FRRM algorithm and to compare its performance with that of existing feature ranking methods. The experimental results demonstrated that the proposed FRRM outperformed the competitors.
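
The abstract does not spell out FRRM's scoring rule; the sketch below illustrates the general random-subspace, multiple-k idea: repeatedly cluster on a random feature subset with a random k, then credit each participating feature with a relevance score. The per-feature score used here (ANOVA F against the found cluster labels) is an assumption for illustration, not the paper's measure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import f_classif

def frrm_style_ranking(X, n_rounds=100, subspace=0.3, k_range=(2, 10), seed=0):
    """Ensemble importance scores via random subspaces and multiple k."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    scores, counts = np.zeros(p), np.zeros(p)
    for _ in range(n_rounds):
        feats = rng.choice(p, size=max(2, int(subspace * p)), replace=False)
        k = int(rng.integers(k_range[0], k_range[1] + 1))   # multiple-k idea
        labels = KMeans(n_clusters=k, n_init=10).fit_predict(X[:, feats])
        # assumed relevance score: ANOVA F of each subspace feature vs. clusters
        f_vals, _ = f_classif(X[:, feats], labels)
        scores[feats] += np.nan_to_num(f_vals)
        counts[feats] += 1
    return scores / np.maximum(counts, 1)   # ensemble importance per feature

# ranking = np.argsort(-frrm_style_ranking(X))  # highest score first
```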

Keywords: clustering analysis, multiple-k ensemble, random subspace-based feature evaluation, unsupervised feature ranking

Procedia PDF Downloads 342
2223 Irrigation Challenges, Climate Change Adaptation and Sustainable Water Usage in Developing Countries: A Case Study of Nigeria

Authors: Faith Eweluegim Enahoro-Ofagbe

Abstract:

Worldwide, every nation is experiencing the effects of global warming. Developing countries, due to their heavy reliance on agriculture for socioeconomic growth and security, among other things, are more affected by climate change, particularly with respect to the availability of water. Floods, droughts, rising temperatures, saltwater intrusion, groundwater depletion, and other severe environmental alterations are all brought on by climatic change. Life depends on water, a vital resource, and these ecological changes affect all water use, including agriculture and household water use. Adequate and adaptive water usage strategies are therefore essential for sustainability in developing countries. This paper investigates the challenges Nigeria faces due to climate change and the adaptive techniques that have evolved in response, to ensure water management and sustainability for irrigation and to provide quality water to residents. Questionnaires were distributed to respondents in the study area, central Nigeria, for quantitative evaluation of sustainable water resource management techniques. Physicochemical analysis was performed on soil and water samples collected from several locations under investigation. Findings show that farmers use different methods for water resource management, ranging from smart technologies to traditional strategies, and that they need to learn better water resource management techniques for sustainability. Since most residents obtain their water from privately held sources, the government should enforce legislation to ensure that private borehole construction businesses treat water sources of poor quality before the general public uses them.

Keywords: developing countries, irrigation, strategies, sustainability, water resource management, water usage

Procedia PDF Downloads 119
2222 Modified Weibull Approach for Bridge Deterioration Modelling

Authors: Niroshan K. Walgama Wellalage, Tieling Zhang, Richard Dwight

Abstract:

State-based Markov deterioration models (SMDM) sometimes fail to find accurate transition probability matrix (TPM) values, and hence lead to invalid future condition predictions or incorrect average deterioration rates, mainly due to drawbacks of existing nonlinear optimization-based algorithms and/or the subjective function types used for regression analysis. Furthermore, a set of separate functions of age for each condition state cannot be directly derived from a Markov model for a given bridge element group, although this is of interest to industrial partners. This paper presents a new approach for generating homogeneous SMDM output, namely the Modified Weibull approach, which consists of a set of appropriate functions to describe the percentage condition prediction of bridge elements in each state. These functions are combined with a Bayesian approach and a Metropolis-Hastings algorithm (MHA) based Markov Chain Monte Carlo (MCMC) simulation technique for quantifying the uncertainty in model parameter estimates. In this study, factors contributing to rail bridge deterioration were identified. The inspection data for 1,000 Australian railway bridges over 15 years were reviewed and filtered accordingly based on real operational experience. A network-level deterioration model for a typical bridge element group was developed using the proposed Modified Weibull approach. The condition state predictions obtained from this method were validated using statistical hypothesis tests with a test data set. Results show that the proposed model is able not only to predict network-level conditions accurately but also to capture the model uncertainties within a given confidence interval.
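
A generic random-walk Metropolis-Hastings sampler of the kind used here for quantifying parameter uncertainty can be written in a few lines; the Weibull likelihood below is a placeholder example, not the paper's condition-prediction functions:

```python
import numpy as np

def metropolis_hastings(log_post, theta0, n_samples=10000, step=0.1, seed=0):
    """Random-walk MH: draws samples from a posterior given its log density."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = []
    for _ in range(n_samples):
        prop = theta + step * rng.standard_normal(theta.shape)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject step
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)

# placeholder log-posterior: Weibull(shape k, scale lam) likelihood, flat prior
ages = np.array([3.0, 7.0, 12.0, 15.0])   # e.g. years to a state change
def log_post(th):
    k, lam = th
    if k <= 0 or lam <= 0:
        return -np.inf                      # reject invalid parameters
    return np.sum(np.log(k / lam) + (k - 1) * np.log(ages / lam)
                  - (ages / lam) ** k)

chain = metropolis_hastings(log_post, theta0=[1.0, 10.0])
```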

Keywords: bridge deterioration modelling, Modified Weibull approach, MCMC, Metropolis-Hastings algorithm, Bayesian approach, Markov deterioration models

Procedia PDF Downloads 732
2221 The International Constitutional Order and Elements of Human Rights

Authors: Girma Y. Iyassu Menelik

Abstract:

“The world is now like a global village!” So goes the saying that reflects how development and technology have closely linked the countries of the world. In the field of human rights, there is a close relationship in the way that rights are recognised and enforced. This paper will show that human rights have evolved from ancient times through important landmarks such as the Magna Carta, the French Declaration of the Rights of Man and of the Citizen, and the American Bill of Rights. The formation of the United Nations after the Second World War resulted in the need to codify and protect human rights. Some rights are so fundamental that they are found in international and continental instruments, national constitutions and domestic legislation. In the civil and political sphere, they include the right to vote, freedom of association, speech and assembly, and the rights to life, privacy and a fair trial. In the economic and social sphere, there are the right to work, protection of the family, social security and the rights to education, health and shelter. In some instances, rights can be suspended in times of public emergency, but such derogations must be circumscribed by law, and in most constitutions such limitations are subject to judicial review. However, some rights are so crucial that they cannot be derogated from under any circumstances; these include the right to life, recognition before the law, freedom from torture and slavery, and freedom of thought, conscience and religion. International jurisprudence has been developed to protect fundamental rights and avoid discrimination on the grounds of race, colour, sex, language or social origin. This elaborate protection system goes to show that these rights have become part of the international order and have universal application. We have now reached a stage where the UDHR, ICCPR and ICESCR have come to be regarded as part of an international bill of rights, with horizontal and vertical enforcement mechanisms involving state parties, NGOs, international bodies and other organs.

Keywords: rights, international, constitutional, state, judiciary

Procedia PDF Downloads 457
2220 Graph-Theoretical Construction of Discrete-Time Share Price Paths from Matroids

Authors: Min Wang, Sergey Utev

Abstract:

The lessons of the 2007-09 global financial crisis have driven scientific research into the design of new methodologies and financial models for the global market. The quantum mechanics approach has been introduced into the modeling of unpredictable stock markets. One famous quantum tool is the Feynman path integral method, which was used to model insurance risk by Tamturk and Utev and adapted to formalize path-dependent option pricing by Hao and Utev. This research is based on the path-dependent calculation method motivated by the Feynman path integral method. Path calculation can be studied in two ways: one is labeling, and the other is computational. Labeling is part of the representation of objects, and generating functions can provide many different ways of representing share price paths. In this paper, recent work on the graph-theoretical construction of individual share price paths via matroids is presented. Firstly, the theory of matroids is reviewed, the relationship between lattice path matroids and Tutte polynomials is studied, and ways to connect points in lattice path matroids and Tutte polynomials are suggested. Secondly, it is found that a general binary tree can be validly constructed from a connected lattice path matroid, rather than from a general lattice path matroid. Lastly, it is shown that share price paths can be represented via general binary trees, and an algorithm is developed to construct share price paths from them. A relationship is also provided between lattice integer points and Tutte polynomials of a transversal matroid. Using this connection together with the algorithm, a share price path can be constructed from a given connected lattice path matroid.
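
The matroid-based construction itself is not detailed in the abstract; as a simple illustration of the final step, reading a share price path off a binary structure, the sketch below decodes a left/right move sequence (such as a root-to-leaf label in a binary tree) into a discrete-time binomial price path. The up/down factors and starting price are placeholders.

```python
def price_path(moves, s0=100.0, u=1.02, d=0.98):
    """Decode a binary move sequence into a discrete-time price path.

    moves: iterable of bits, 1 = up tick (multiply by u),
           0 = down tick (multiply by d), as on a binomial lattice.
    """
    path = [s0]
    for m in moves:
        path.append(path[-1] * (u if m else d))
    return path

# the root-to-leaf labels of a binary tree give one path each
print(price_path([1, 1, 0, 1]))  # 100 -> 102 -> 104.04 -> 101.96 -> 104.00 (approx.)
```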

Keywords: combinatorial construction, graphical representation, matroid, path calculation, share price, Tutte polynomial

Procedia PDF Downloads 143
2219 The Development Stages of Transformation of Water Policy Management in Victoria

Authors: Ratri Werdiningtyas, Yongping Wei, Andrew Western

Abstract:

The status quo of social-ecological systems is the result not only of natural processes but also of the accumulated consequences of policies applied in the past. Often, water management objectives are challenging and are only achieved to a limited degree on the ground. In choosing water management approaches, it is important to account for current conditions and for important differences due to varied histories. Since the mid-nineteenth century, Victorian water management has evolved through a series of policy regime shifts. The main goal of this research is to explore and identify the stages of the evolution of water policy instruments as practiced in Victoria from 1890 to 2016. This comparative historical analysis identified four stages in Victorian policy instrument development. In the first stage, policy instruments were created to match the demand and supply of the resource (reserve condition). The second stage began after the natural system alone failed to balance supply and demand; the focus of the policy instruments shifted to an authority perspective. Later, the increasing number of actors interested in water led to another change in policy instruments, and this third stage focused on the significant role of information from different relevant actors. The fourth and current stage is the most advanced, in that it involves policy instruments that synergize the previous three focal factors: reserve, authority, and information. When considering policy in other jurisdictions, these findings suggest that a key priority should be to reflect on the jurisdiction's current position among these four evolutionary stages and to improve progressively, rather than directly adopting approaches from elsewhere without understanding the current position.

Keywords: policy instrument, policy transformation, socio-ecological system, water management

Procedia PDF Downloads 148
2218 Trade Openness, Productivity Growth and Economic Growth: Nigeria’s Experience

Authors: S. O. Okoro

Abstract:

Some words become the catchphrase of a particular decade. Globalization, openness, and privatization are certainly among the most frequent encapsulations of the 1990s: the market is ‘in’, the state is ‘out’. In the 1970s, there were many political economists who spoke of autarky as one possible response to global economic forces: be self-contained, go it alone, put up barriers to transnational forces, put in place an import-substitution industrialization policy and grow domestic industries. In the 1990s, the emasculation of the state is by no means complete, but there is an acceptance that the state’s power is circumscribed by forces beyond its control and potential leverage; autarky is no longer a policy option. Nigeria, since its emergence as an independent nation, has evolved two macroeconomic management regimes: the interventionist and the market-friendly styles. This paper investigates Nigeria’s growth performance over the periods encompassing these two regimes and finds that there is no structural break in Total Factor Productivity (TFP) growth; moreover, TFP growth over the entire study period, 1970-2012, is negligible, and hence growth can only be achieved by unsustainable factor accumulation. Another important finding of this work is that the openness-human capital interaction term has a significant impact on TFP growth, but the sign of the estimated coefficient does not meet its theoretical expectation, because the negative coefficient on human capital outweighs the positive openness effect. The poor quality of human capital is considered to have given rise to this. Given these results, massive investment in the education sector is required, targeted at reforms that go beyond mere structural reforms to an agenda that will improve the quality of human capital in Nigeria.

Keywords: globalization, emasculation, openness and privatization, total factor productivity

Procedia PDF Downloads 245
2217 A Case-Study Analysis on the Necessity of Testing for Cyber Risk Mitigation on Maritime Transport

Authors: Polychronis Kapalidis

Abstract:

In recent years, researchers have started to turn their attention to cyber security and maritime security independently, neglecting, in most cases, to examine the areas where these two critical issues are intertwined. The impact of cybersecurity issues on the maritime economy is emerging dramatically. Maritime transport and all related activities are conducted by technology-intensive platforms, which today rely heavily on information systems. The paper’s argument is that since no defense is completely effective against cyber attacks, it is vital to test responses to the inevitable incursions. Hence, preparedness, in the form of testing the existing cybersecurity structure with different tools against potential attacks, is vital for minimizing risks. Traditional criminal activities may be further facilitated and evolved through the misuse of cyberspace. Kidnapping, piracy, fraud, theft of cargo and the imposition of ransomware are the major such activities, mainly targeting the industry’s most valuable asset: the ship. Adopting a case-study analysis based on stakeholder consultation and secondary data analysis, namely policy and strategy-related documentation, the paper presents the importance of holistic testing in the sector. Arguing that poor understanding of the issue leads to the adoption of ineffective policies, the paper presents the level of awareness within the industry and assesses the risks and vulnerabilities of ships to these cybercriminal activities. It concludes by suggesting that testing procedures must focus on three main pillars within the maritime transport sector: the human factor, the infrastructure, and the procedures.

Keywords: cybercrime, cybersecurity, organized crime, risk mitigation

Procedia PDF Downloads 164
2216 A Preparatory Method for Building Construction Implemented in a Case Study in Brazil

Authors: Aline Valverde Arroteia, Tatiana Gondim do Amaral, Silvio Burrattino Melhado

Abstract:

During the last twenty years, the construction field in Brazil has evolved significantly in response to market growth and competitiveness. However, this evolution has faced many obstacles, such as cultural barriers and the lack of effort to achieve quality at the construction site. At the same time, much of the information generated in the design or construction phases is lost due to the lack of effective coordination of these activities. Facing this problem, the aim of this research was to implement a French method named PEO (its acronym in Portuguese, meaning preparation for building construction), seeking to understand the design management process and its interface with the building construction phase. The research method applied was qualitative and was carried out through two case studies in the city of Goiania, in Goias, Brazil. The research was divided into two stages: a pilot study at Company A and the implementation of PEO at Company B. After the implementation, the results demonstrated the PEO method's effectiveness and feasibility as a booster of quality improvement in design management. The analysis showed that the method aims to improve the design and reduce the failures, errors and rework commonly found in the production of buildings. Therefore, it can be concluded that PEO is feasible to apply in real estate and building companies, but companies need to believe in the contribution they can make to the discovery of design failures in conjunction with the other stakeholders forming a construction team. The results of PEO can be maximized by adopting the principles of simultaneous engineering and introducing new computer technologies that use a three-dimensional model of the building within a BIM process.

Keywords: communication, design and construction interface management, preparation for building construction (PEO), proactive coordination (CPA)

Procedia PDF Downloads 165
2215 Building Information Modelling Based Value for Money Assessment in Public-Private Partnership

Authors: Guoqian Ren, Haijiang Li, Jisong Zhang

Abstract:

Over the past 40 years, urban development has undergone large-scale, high-speed expansion, beyond what was previously considered normal and in a manner not proportionally related to population growth or physical considerations. With more scientific and refined decision-making in the urban construction process, new urbanization approaches aligned with public-private partnerships (PPPs), which evolved in the early 1990s, have become acceptable and, in some situations, even better solutions to outstanding urban municipal construction projects, especially in developing countries. However, as the main driving force for delivering urban public services, PPPs are still problematic regarding the value for money (VFM) process in most large-scale construction projects. This paper therefore reviews recent PPP articles in popular project management journals and relevant toolkits published in the last 10 years to identify the indicators that influence VFM within PPPs across regions. With increasing concerns about profitability and environmental and social impacts, the current PPP structure requires a more integrated platform to manage multi-performance project life cycles. Building information modelling (BIM), a popular approach to the procurement process in the AEC sectors, provides the potential to ensure VFM while also working in tandem with a semantic approach to holistically measure life cycle costs (LCC) and achieve better sustainability. This paper suggests that BIM applied to the entire PPP life cycle could support holistic decision-making regarding VFM processes and thus meet service targets.

Keywords: public-private partnership, value for money, building information modelling, semantic approach

Procedia PDF Downloads 215
2214 Web Data Scraping Technology Using Term Frequency Inverse Document Frequency to Enhance the Big Data Quality on Sentiment Analysis

Authors: Sangita Pokhrel, Nalinda Somasiri, Rebecca Jeyavadhanam, Swathi Ganesan

Abstract:

Tourism is a booming industry with huge future potential for global wealth and employment. Countless data are generated over social media sites every day, creating numerous opportunities to bring more insights to decision-makers. The integration of Big Data technology into the tourism industry will allow companies to learn where their customers have been and what they like. This information can then be used by businesses, such as those managing visitor centers or hotels, and tourists can get a clear idea of places before visiting. On the technical side, natural language is processed by analysing the sentiment features of online reviews from tourists, and we supply an enhanced long short-term memory (LSTM) framework for sentiment feature extraction of travel reviews. We constructed a web review database using a crawler and web scraping techniques for experimental validation, to evaluate the effectiveness of our methodology. The sentences were first classified through the VADER and RoBERTa models to get the polarity of the reviews. In this paper, we studied feature extraction methods, such as count vectorization and TF-IDF vectorization, and implemented a Convolutional Neural Network (CNN) classifier for sentiment analysis to decide whether a tourist's attitude towards a destination is positive, negative, or simply neutral, based on the review text posted online. The results demonstrated that, after pre-processing and cleaning the dataset, the CNN algorithm achieved an accuracy of 96.12% for positive and negative sentiment analysis.
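
The count and TF-IDF vectorization steps are standard; with scikit-learn they reduce to a few lines (the review texts below are illustrative):

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

reviews = ["the beach was beautiful and clean",
           "terrible service, would not visit again",
           "beautiful views but crowded beach"]

# raw term counts (count vectorization) versus TF-IDF weighting,
# which down-weights terms that appear in many reviews
counts = CountVectorizer().fit_transform(reviews)
tfidf = TfidfVectorizer().fit_transform(reviews)

print(counts.shape, tfidf.shape)  # both: (3, n_vocabulary_terms)
# `tfidf` (a sparse matrix) is the kind of feature matrix that
# would then feed a downstream classifier such as the CNN
```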

Keywords: count vectorization, convolutional neural network, crawler, data technology, long short-term memory, web scraping, sentiment analysis

Procedia PDF Downloads 93
2213 Probabilistic Graphical Model for the Web

Authors: M. Nekri, A. Khelladi

Abstract:

The world wide web is a network with a complex topology, whose main properties are a power-law degree distribution, a low clustering coefficient, and a small average distance. Modeling the web as a graph allows information to be located quickly and consequently helps in the construction of search engines. Here, we present a model based on existing probabilistic graphs with all the aforesaid characteristics. This work consists in studying the web in order to understand its structure, which will enable us to model it more easily and to propose a possible algorithm for its exploration.
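
Power-law degree distributions of the kind described here are classically generated by preferential attachment, where each new page links to existing pages with probability proportional to their current degree; a minimal sketch of that growth process follows (the number of links per new node, m, is a placeholder):

```python
import random

def preferential_attachment(n_nodes, m=2, seed=0):
    """Grow a graph in which new nodes attach to high-degree nodes,
    yielding an (approximate) power-law degree distribution."""
    random.seed(seed)
    # start from a small complete core of m + 1 nodes
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # sampling uniformly from edge endpoints = sampling prop. to degree
    endpoints = [v for e in edges for v in e]
    for new in range(m + 1, n_nodes):
        targets = set()
        while len(targets) < m:
            targets.add(random.choice(endpoints))
        for t in targets:
            edges.append((new, t))
            endpoints += [new, t]
    return edges

edges = preferential_attachment(1000)
```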

Keywords: clustering coefficient, preferential attachment, small world, web community

Procedia PDF Downloads 272
2212 Exploring the Intrinsic Ecology and Suitable Density of Historic Districts Through a Comparative Analysis of Ancient and Modern Ecological Smart Practices

Authors: Hu Changjuan, Gong Cong, Long Hao

Abstract:

Although urban ecological policies and the public's aspiration for livable environments have expedited the pace of ecological revitalization, historic districts that have evolved through natural ecological processes often become obsolete and less habitable amid rapid urbanization. This raises a critical question: are historic districts inherently incapable of being ecological and livable? The thriving concept of ‘intrinsic ecology’, characterized by its ability to transform city-district systems into healthy ecosystems with diverse environments, stable functions, and rapid restoration capabilities, holds potential for guiding the integration of ancient and modern ecological wisdom while supporting the dynamic involvement of cultures. This study explores the intrinsic ecology of historic districts from three aspects. 1) Population density: by comparing the population density before urban population expansion with that of the present day, a reasonable population density for historic districts is determined. 2) Building density: using the ‘Space-mate’ tool for comparative analysis, a spatial matrix is formed to explore the intrinsic ecology of building density in Chinese historic districts. 3) Green capacity ratio: using ecological districts as control samples, dual comparative analyses (related comparison and upgraded comparison) determine the intrinsic ecological advantages of the two-dimensional and three-dimensional green volume of historic districts. The study informs a density optimization strategy that supports cultural, social, natural, and economic ecology, contributing to the creation of eco-historic districts.

Keywords: eco-historic districts, intrinsic ecology, suitable density, green capacity ratio

Procedia PDF Downloads 30
2211 Neural Network Based Control Algorithm for Inhabitable Spaces Applying Emotional Domotics

Authors: Sergio A. Navarro Tuch, Martin Rogelio Bustamante Bello, Leopoldo Julian Lechuga Lopez

Abstract:

In recent years, Mexico’s population has seen a rise in negative physiological and mental states. Two main consequences of this problem are deficient work performance and high levels of stress, which have an important impact on a person’s physical, mental and emotional health. Several approaches, such as the use of audiovisual stimuli to induce emotions and modify a person’s emotional state, can be applied in an effort to decrease these negative effects. Using different non-invasive physiological sensors, such as EEG, luminosity and face recognition, we gather information on the subject’s current emotional state. In a controlled environment, a subject is shown a series of selected images from the International Affective Picture System (IAPS) in order to induce a specific set of emotions and obtain information from the sensors. The raw data obtained are statistically analyzed in order to filter only the specific groups of information that relate to the subject’s emotions and the current values of the physical variables in the controlled environment, such as luminosity, RGB light color, temperature, oxygen level and noise. Finally, a neural network based control algorithm is given the data obtained, in order to provide feedback to the system and automate the modification of the environment variables and the audiovisual content shown, so that these changes can positively alter the subject’s emotional state. During the research, it was found that the light color was directly related to the type of impact generated by the audiovisual content on the subject’s emotional state: red illumination increased the impact of violent images, while green illumination along with relaxing images decreased the subject’s levels of anxiety. Specific differences were found between men and women as to which types of images generated a greater impact in either gender. The population sample was mainly constituted of college students, whose data showed a decreased sensibility to violence towards humans. Despite the early stage of the control algorithm, the results obtained from the population sample give a better insight into the possibilities of emotional domotics and the applications that can be created to improve performance in people’s lives. The objective of this research is to create a positive impact with the application of technology to everyday activities; nonetheless, an ethical problem arises, since this could also be applied to control a person’s emotions and shift their decision making.
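
The abstract describes the controller only at a high level; the heavily simplified sketch below illustrates the general idea of a small network mapping sensed emotional-state features to environment settings. The feature and output choices, the training data, and the network size are assumptions for illustration, not the authors' design.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# inputs: assumed features from EEG / face recognition / luminosity sensors
# outputs: assumed targets [R, G, B light colour, temperature setpoint]
X = np.random.rand(200, 6)   # stand-in for logged sensor features
y = np.random.rand(200, 4)   # stand-in for expert-chosen environment settings

controller = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=1000).fit(X, y)

def adjust_environment(sensor_features):
    """One feedback step: sensed state in, new environment settings out."""
    return controller.predict(np.atleast_2d(sensor_features))[0]

settings = adjust_environment(np.random.rand(6))
```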

Keywords: data analysis, emotional domotics, performance improvement, neural network

Procedia PDF Downloads 145
2210 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression

Authors: Anne M. Denton, Rahul Gomes, David W. Franzen

Abstract:

High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example, of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest. Any higher resolution is lost in this resampling. When the topographic features are computed through regression that is performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point. The number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance. Any doubling of window size in each direction only takes a single pass over the data, corresponding to a logarithmic scaling of the resulting algorithm as a function of the window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic of the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope. The relevant length scale is taken to be half of the window size of the window over which the minimum variance was achieved. The resulting process was evaluated for 1-meter DEM data and for artificial data that was constructed to have defined length scales and added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm. The resolution of the resulting output is much higher, and the slope and aspect are much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within the region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 images in ESRI ArcMap for each resolution. In summary, the proposed approach extracts slope and aspect of DEMs at the length scales that are characteristic locally. The result is of higher resolution and less affected by noise than existing techniques.
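
Because the plane-fit sufficient statistics are additive, each doubling of the window size costs a single pass of 2x2 block sums; the sketch below implements that aggregation and keeps, per pixel, the slope from the window size with minimal residual variance. It is a simplified sketch (non-overlapping windows only) that assumes a square DEM whose side is a power of two.

```python
import numpy as np

def block_sum(a):
    """Merge non-overlapping 2x2 windows by addition (one aggregation step)."""
    return a[0::2, 0::2] + a[0::2, 1::2] + a[1::2, 0::2] + a[1::2, 1::2]

def multiscale_slope(z, n_levels=4):
    """Per-pixel slope chosen at the window size with minimal residual variance.

    Fits z ~ a*x + b*y + c per window from additive sufficient statistics,
    so each level costs one pass regardless of window size.
    """
    h, w = z.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    s = {"n": np.ones_like(z), "x": xs, "y": ys, "z": z,
         "xx": xs * xs, "yy": ys * ys, "xy": xs * ys,
         "xz": xs * z, "yz": ys * z, "zz": z * z}
    best_var = np.full(z.shape, np.inf)
    best_slope = np.zeros(z.shape)
    for level in range(1, n_levels + 1):
        s = {k: block_sum(v) for k, v in s.items()}
        # normal equations of the plane fit, one 3x3 system per window
        A = np.stack([np.stack([s["xx"], s["xy"], s["x"]], -1),
                      np.stack([s["xy"], s["yy"], s["y"]], -1),
                      np.stack([s["x"], s["y"], s["n"]], -1)], -2)
        rhs = np.stack([s["xz"], s["yz"], s["z"]], -1)
        abc = np.linalg.solve(A, rhs[..., None])[..., 0]
        rss = s["zz"] - (abc * rhs).sum(-1)   # residual sum of squares
        var = rss / s["n"]
        slope = np.hypot(abc[..., 0], abc[..., 1])
        f = 2 ** level                         # upsample back to full resolution
        var_f = var.repeat(f, 0).repeat(f, 1)
        slope_f = slope.repeat(f, 0).repeat(f, 1)
        better = var_f < best_var
        best_var[better] = var_f[better]
        best_slope[better] = slope_f[better]
    return best_slope, best_var

slope, var = multiscale_slope(np.random.rand(64, 64))
```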

Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression

Procedia PDF Downloads 132
2209 Applying Multiplicative Weight Update to Skin Cancer Classifiers

Authors: Animish Jain

Abstract:

This study uses Multiplicative Weight Update within artificial intelligence and machine learning to create models that can diagnose skin cancer from microscopic images of cancer samples. The multiplicative weight update method combines the predictions of multiple models in order to acquire more accurate results. Logistic Regression, Convolutional Neural Network (CNN), and Support Vector Machine Classifier (SVMC) models are employed within the Multiplicative Weight Update system. These models are trained on pictures of skin cancer from the ISIC-Archive to look for patterns that label unseen scans as either benign or malignant. The models are combined by a multiplicative weight update algorithm, which takes into account the precision and accuracy of each model on each successive guess to assign weights to their predictions; the weighted guesses are then analyzed together to obtain the final prediction. The research hypothesis stated that there would be a significant difference in accuracy between the three models and the Multiplicative Weight Update system. The SVMC model had an accuracy of 77.88%, the CNN model 85.30%, and the Logistic Regression model 79.09%, while the Multiplicative Weight Update algorithm achieved an accuracy of 72.27%. The conclusion drawn was that there was indeed a significant difference in accuracy, and that a CNN model would be the better option for this problem rather than a Multiplicative Weight Update system. This may be because Multiplicative Weight Update is less effective in a binary setting with only two possible classifications. In a categorical setting with multiple classes and groupings, a Multiplicative Weight Update system might become more proficient, as it takes into account the strengths of multiple different models to classify images into many categories rather than only two, as shown in this study. This experimentation and computer science project can help to create better algorithms and models for the future of artificial intelligence in medical imaging.
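
The classical multiplicative weight update rule for combining expert predictions, of the kind applied here to the three classifiers, can be stated in a few lines (the penalty rate eta is a placeholder):

```python
import numpy as np

def mwu_predict(votes, weights):
    """Weighted-majority vote over expert predictions in {0, 1}."""
    yes = weights[votes == 1].sum()
    no = weights[votes == 0].sum()
    return int(yes >= no)

def mwu_run(expert_preds, labels, eta=0.3):
    """expert_preds: (n_experts, n_samples) 0/1 guesses, e.g. from SVMC,
    CNN and logistic regression; weights shrink multiplicatively on each
    mistake, so persistently wrong experts lose influence."""
    n_experts, n_samples = expert_preds.shape
    w = np.ones(n_experts)
    out = []
    for t in range(n_samples):
        out.append(mwu_predict(expert_preds[:, t], w))
        wrong = expert_preds[:, t] != labels[t]
        w[wrong] *= (1.0 - eta)        # penalise mistaken experts
    return np.array(out), w

preds = np.array([[1, 0, 1, 1], [1, 1, 0, 1], [0, 1, 1, 1]])  # 3 toy experts
labels = np.array([1, 1, 1, 1])
combined, final_w = mwu_run(preds, labels)
```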

Keywords: artificial intelligence, machine learning, multiplicative weight update, skin cancer

Procedia PDF Downloads 83
2208 Local Availability Influences Choice of Radical Treatment for Prostate Cancer

Authors: Jemini Vyas, Oluwatobi Adeyoe, Jenny Branagan, Chandran Tanabalan, Aakash Pai

Abstract:

Introduction: Radical prostatectomy and radiotherapy are both viable options for the treatment of localised prostate cancer. Over the years, medicine has evolved towards a patient-centred approach, and patient decision-making is not motivated by clinical outcomes alone: geographical location and ease of access to the treating clinician are contributory factors. With the development of robotic surgery, prostatectomy has been centralised into tertiary centres, which has affected the distances that patients and their families are expected to travel. Methods: A single-centre retrospective study was undertaken over a five-year period. All patients with localised prostate cancer undergoing radical radiotherapy or prostatectomy were identified pre-centralisation, and compared with the total number undergoing these treatments post-centralisation. Results: Pre-centralisation, both the radiotherapy and prostatectomy groups had to travel a median of less than five miles for treatment. Post-centralisation of pelvic surgery, prostatectomy patients had to travel a median of more than 40 miles, whilst the travel distance for the radiotherapy group was unchanged. In the post-centralisation cohort, there was a 63% decline in the number of patients undergoing radical prostatectomy per month, from a mean of 5.1 to 1.9, while the radical radiotherapy group saw a concurrent 41% increase in patient numbers, from a mean of 13.3 to 18.8 patients per month. Conclusion: The choice of radical treatment for localised prostate cancer is based on multiple factors. This study suggests that local availability can influence the choice of radical treatment. It is imperative that efforts are made to maintain accessibility to all viable options for prostate cancer patients, so that patient choice is not compromised.

Keywords: prostate, prostatectomy, radiotherapy, centralisation

Procedia PDF Downloads 101
2207 The Effect of Technology and Artificial Intelligence on Legal Securities and Privacy Issues

Authors: Kerolis Samoul Zaghloul Noaman

Abstract:

Space law is the newest entry in the basket of international law, arriving in the latter half of the 20th century. Over the last hundred and fifty years, courts and scholars have developed a consensus that custom is an important source of international law. Article 38(1)(b) of the Statute of the International Court of Justice recognizes international custom as a source of international law. State practices and usages have a large role to play in formulating customary international law. This paper examines those state practices which may qualify to become international customary law, since, after 1979 (the Moon Treaty), no hard law has been developed in the area of space exploration. It attempts to link state practices and custom in space exploration to the development of customary international law on space activities. The paper uses the doctrinal approach of legal research to examine current questions of international law, exploring different international legal documents, including General Assembly resolutions, treaty principles, working papers of the UN, cases relating to customary international law, and the writings of jurists on space law and customary international law. It is argued that principles such as the common heritage of mankind, the non-military zone, sovereign equality, nuclear-weapon-free space, and the protection of the outer space environment, etc., evolved as state practices among the international community and are qualified to become international customary law.

Keywords: social networks privacy issues, social networks security issues, social networks privacy precautions measures, social networks security precautions measures

Procedia PDF Downloads 33
2206 Development of Wave-Dissipating Block Installation Simulation for Inexperienced Worker Training

Authors: Hao Min Chuah, Tatsuya Yamazaki, Ryosui Iwasawa, Tatsumi Suto

Abstract:

In recent years, with the advancement of digital technology, the movement to introduce so-called ICT (Information and Communication Technology), such as computer technology and network technology, to civil engineering and construction sites is accelerating. As part of this movement, attempts are being made in various situations to reproduce actual sites inside computers and use them for design and construction planning, as well as for training inexperienced engineers. The installation of wave-dissipating blocks on coasts is a type of work that has been carried out by skilled workers based on years of experience, and it is one of the tasks that is difficult for inexperienced workers to carry out on site. Wave-dissipating blocks are structures designed to protect coasts, beaches, and so on from erosion by reducing the energy of ocean waves. They usually weigh more than 1 t and are installed while suspended from a crane, so it would be time-consuming and costly for inexperienced workers to train on site. In this paper, therefore, a block installation simulator is developed based on Unity 3D, a game development engine. The simulator computes porosity, defined as the ratio of the total volume of the wave-dissipating blocks inside the structure to the volume of the ideal final structure. Using this evaluation of porosity, the simulator can determine how well the user is able to install the blocks. The voxelization technique is used to calculate the porosity of the structure, simplifying the calculations. Other techniques, such as raycasting and box overlapping, are employed for accurate simulation. In the near future, the simulator will incorporate an automatic block installation algorithm based on combinatorial optimization and compare the user's block installation with the appropriate installation found by the algorithm.
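
A minimal version of the voxel-based porosity computation, following the abstract's definition (block volume over ideal-structure volume), with blocks simplified to axis-aligned boxes; real block geometry would need a mesh-voxel overlap test such as the box-overlapping technique mentioned above:

```python
import numpy as np

def porosity(blocks, envelope, res=0.25):
    """Ratio of block-occupied volume to envelope volume (the abstract's
    definition), estimated by testing voxel centres.

    blocks:   list of ((xmin, ymin, zmin), (xmax, ymax, zmax)) boxes
    envelope: one such box describing the ideal final structure
    res:      voxel edge length
    """
    (ex0, ey0, ez0), (ex1, ey1, ez1) = envelope
    xs = np.arange(ex0 + res / 2, ex1, res)
    ys = np.arange(ey0 + res / 2, ey1, res)
    zs = np.arange(ez0 + res / 2, ez1, res)
    X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")   # voxel centres
    occupied = np.zeros(X.shape, dtype=bool)
    for (x0, y0, z0), (x1, y1, z1) in blocks:
        occupied |= ((X >= x0) & (X <= x1) &
                     (Y >= y0) & (Y <= y1) &
                     (Z >= z0) & (Z <= z1))
    return occupied.mean()

blocks = [((0, 0, 0), (1, 1, 1)), ((1, 0, 0), (2, 1, 0.8))]
print(porosity(blocks, envelope=((0, 0, 0), (2, 2, 2))))  # ~0.22
```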

Keywords: 3D simulator, porosity, user interface, voxelization, wave-dissipating blocks

Procedia PDF Downloads 109
2205 Enhancing Traditional Saudi Designs' Pattern Cutting to Integrate Them into Current Clothing Offers

Authors: Faizah Almalki, Simeon Gill, Steve G. Hayes, Lisa Taylor

Abstract:

A core element of cultural identity is traditional costume, which provides insight into the heritage acquired over time. This heritage is apparent in the use of colour and in the styles and functions of the clothing; it also reflects the skills of those who created the items and the time taken to produce them. Modern flat pattern drafting methods for making garment patterns are simple in comparison to the relatively laborious traditional approaches, which required personal interaction with the wearer throughout the production process. The current study reflects on the main elements of the pattern cutting system and how it has evolved in Saudi Arabia to affect the design of the Sawan garment. The traditional methods for constructing Sawan garments were analysed through observation of the practice and the garments and by consulting documented guidance. This provided a foundation from which to explore how modern technology can be applied to improve the process. In this research, modern methods are proposed for producing traditional Saudi garments more efficiently while retaining elements of the conventional style and design. The study has documented the vital aspects of the Sawan garment style. The results showed that the methods used to take body measurements and make patterns were elementary, yielding simple geometric shapes, and that the Sawan garment is composed of four pieces. Consequently, this research allows classical pattern shapes to be embedded in garments now worn in Saudi Arabia and supports the continuation of cultural heritage.

Keywords: traditional Sawan garment technique, modern pattern cutting technique, the shape of the garment and software, Lectra Modaris

Procedia PDF Downloads 137
2204 Traits and Dilemma: Feminism and Multiple Demands in Young Chinese Female-Directed Films

Authors: Deng Qiaoshan

Abstract:

With the rise of feminism in the global film industry, feminist expression in Chinese films has also evolved, reflecting societal focus on gender issues. This article focuses on young Chinese female directors such as Yang Lina, Teng Congcong, and Yang Mingming, whose films present richer female perspectives and consciously incorporate unique female life experiences. They highlight women's real-life struggles, portraying ‘struggling’ female identities: characters facing professional failures and issues of desire and identity, ultimately returning to family roles. These films commonly explore the mother-daughter relationship, with some using genre storytelling for commercial appeal and others deconstructing the ‘myth of motherhood’ to reflect reality, rewriting traditional maternal roles. The ‘struggling’ female identity in these directors' films shows an aesthetic of ‘pseudo-reality’, blending realistic situations with poetic, lyrical elements and reflecting the directors' creative traits and internal conflicts. These contradictions are closely related to the unique creative context of Chinese cinema in which they operate. Emerging under China's strict film censorship system, film industrialization, consumerist culture, and the internet environment, new-generation directors face multiple demands. How to ‘survive’ amidst complex commercial requirements while creating films with a clear feminist consciousness is the fundamental dilemma faced by young Chinese female directors.

Keywords: female directors, feminist film, female dilemma, film censorship system

Procedia PDF Downloads 49
2203 Analysis of an IncResU-Net Model for R-Peak Detection in ECG Signals

Authors: Beatriz Lafuente Alcázar, Yash Wani, Amit J. Nimunkar

Abstract:

Cardiovascular diseases (CVDs) are the leading cause of death globally, and around 80% of sudden cardiac deaths are due to arrhythmias, or irregular heartbeats. The majority of these pathologies are revealed by either short-term or long-term alterations in the electrocardiogram (ECG) morphology. The ECG is the main diagnostic tool in cardiology: a non-invasive, pain-free procedure that measures the heart’s electrical activity and allows the detection of abnormal rhythms and underlying conditions. A cardiologist can diagnose a wide range of pathologies based on alterations in the ECG’s form, but human interpretation is subjective and prone to error. Moreover, ECG records can be quite prolonged, which can further complicate visual diagnosis and considerably delay disease detection. In this context, deep learning methods have risen as a promising strategy to extract relevant features and eliminate individual subjectivity in ECG analysis. They facilitate the computation of large sets of data and can provide early and precise diagnoses. Therefore, the cardiology field is one of the areas that can benefit most from the implementation of deep learning algorithms. In the present study, a deep learning algorithm is trained following a novel approach, using a combination of different databases as the training set. The goal of the algorithm is the detection of R-peaks in ECG signals. Its performance is further evaluated on ECG signals with different origins and features to test the model’s ability to generalize its outcomes. The performance of the model for the detection of R-peaks in clean and noisy ECGs is presented. The model is able to detect R-peaks in the presence of various types of noise and when presented with data it has not been trained on. It is expected that this approach will increase the effectiveness and capacity of cardiologists to detect divergences in the normal cardiac activity of their patients.
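
As a classical point of comparison for the deep model, R-peak detection can be approximated with simple peak picking; the scipy baseline below is illustrative only (the threshold and refractory spacing are placeholders tied to the sampling rate) and is not the IncResU-Net approach:

```python
import numpy as np
from scipy.signal import find_peaks

def detect_r_peaks(ecg, fs=360):
    """Naive R-peak detector: amplitude threshold + refractory spacing.

    ecg: 1-D ECG samples; fs: sampling frequency in Hz. Real detectors
    (e.g. Pan-Tompkins) add band-pass filtering and adaptive thresholds.
    """
    ecg = np.asarray(ecg, dtype=float)
    height = ecg.mean() + 2.0 * ecg.std()   # crude amplitude threshold
    min_gap = int(0.25 * fs)                # ~250 ms refractory period
    peaks, _ = find_peaks(ecg, height=height, distance=min_gap)
    return peaks

# synthetic test: 5 sharp "beats" at 1 Hz on a 360 Hz grid
t = np.arange(5 * 360) / 360.0
ecg = np.sin(2 * np.pi * t) ** 63           # odd power keeps narrow spikes
print(detect_r_peaks(ecg))                  # ~one index per beat
```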

Keywords: arrhythmia, deep learning, electrocardiogram, machine learning, R-peaks

Procedia PDF Downloads 193
2202 The Professor’s Bayonet: An Educational Podcast Splicing the Literary with Social Commentary and Theology

Authors: Jason Dew

Abstract:

Podcasts are increasingly a source of intellectual content for many who wish to broaden their worldview. Topics range from sports to folklore, entertainment to spirituality, and the list from which to choose is large, demonstrating the public's interest in this medium. While traditional classrooms continue to serve the curious and upward bound, podcasts also satisfy intellectual cravings, especially for those on the go. This paper explores how the podcast The Professor's Bayonet attempts to meet these needs by offering 4-5 minute commentaries on literary works, both classic and contemporary, through the dual lenses of current social trends and theology. The approach is born of the direction many students take in exchanges of ideas: they have a sincere interest in how the books covered are relevant to their lives, and their questions are probing to the extent that excursions into theology are helpful. Cursory examinations of a topic simply will not suffice; members of Generation Z, especially, are parched for real and true answers. The paper therefore shares excerpts from a selection of episodes and explains why certain works were showcased. In an episode entitled “The Possibility of Evil,” for example, Shirley Jackson's 1965 short story of the same name is explored, focusing on why the protagonist, Adela Strangeworth, leaves nasty little notes in the mailboxes of those in her small community whom she deems deserving of a good tongue-lashing. Her behavior has negative consequences, and it opens the opportunity to connect the story to social media, where millions of individuals are guilty of the very thing Adela Strangeworth is guilty of, making Jackson's work somewhat prophetic. Reasons for this behavior are explored, namely what it says about how we as a society have evolved, both interpersonally and spiritually.

Keywords: podcast, social commentary, theology, literary

Procedia PDF Downloads 58
2201 Numerical Simulation of Filtration Gas Combustion: Front Propagation Velocity

Authors: Yuri Laevsky, Tatyana Nosova

Abstract:

The phenomenon of filtration gas combustion (FGC) was discovered experimentally in the early 1980s. It has a number of important applications in areas such as chemical technology, fire and explosion safety, energy-saving technologies, and oil production. From the physical point of view, FGC may be defined as the propagation of a region of gaseous exothermic reaction through a chemically inert porous medium, as the gaseous reactants seep into the region of chemical transformation. The movement of the combustion front exhibits different modes, and this investigation focuses on the low-velocity regime. The main characteristic of the process is the velocity of combustion front propagation, and computing this characteristic encounters substantial difficulties because of the strong heterogeneity of the process. The mathematical model of FGC consists of energy conservation laws for the temperature of the porous medium and the temperature of the gas, together with a mass conservation law for the relative concentration of the reacting component of the gas mixture. The homogenization of the model is performed with the two-temperature approach, in which at each point of the continuous medium we specify solid and gas phases with Newtonian heat exchange between them. The computational scheme is constructed on the principles of the mixed finite element method on a regular mesh, with the approximation in time performed by an explicit-implicit difference scheme. Special attention was given to determining the combustion front propagation velocity: straightforward computation of the velocity as a grid derivative leads to an extremely unstable algorithm. It is worth noting that the term ‘front propagation velocity’ is meaningful for settled motion, for which analytical formulae linking the velocity and the equilibrium temperature are valid. A numerical implementation of one such formula, leading to stable computation of the instantaneous front velocity, has been proposed. The resulting algorithm has been applied in a subsequent numerical investigation of the FGC process, studying the dependence of the main characteristics of the process on various physical parameters. In particular, the influence of combustible gas mixture consumption on the front propagation velocity has been investigated. It has also been reaffirmed numerically that there is an interval of critical values of the interfacial heat transfer coefficient at which a transition occurs from slow combustion front propagation to rapid propagation; approximate boundaries of this interval have been calculated for specific parameters. All results obtained are in full agreement with both experimental and theoretical data, confirming the adequacy of the model and of the algorithm constructed. The availability of stable techniques to calculate the instantaneous velocity of the combustion wave makes it possible to consider a semi-Lagrangian approach to the solution of the problem.
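The abstract does not reproduce the authors' formula linking front velocity to equilibrium temperature, so the sketch below shows only the generic surrounding machinery, under stated assumptions: the front is located by sub-grid interpolation of a temperature threshold crossing, and the position history is smoothed by a least-squares fit before differentiating, which is one common way to avoid the instability of a one-step grid derivative. It is not the authors' method, and all names are illustrative.

```python
import numpy as np

def front_position(x, T, T_star):
    # Locate the combustion front as the point where the solid-phase
    # temperature profile T(x) first crosses the threshold T_star,
    # with linear interpolation between grid nodes for sub-grid accuracy.
    above = T >= T_star
    k = np.argmax(above[:-1] != above[1:])    # first crossing interval
    x0, x1 = x[k], x[k + 1]
    t0, t1 = T[k], T[k + 1]
    return x0 + (T_star - t0) * (x1 - x0) / (t1 - t0)

def front_velocity(times, positions, window=20):
    # Estimate the instantaneous front velocity from the last `window`
    # recorded positions by a least-squares linear fit; the fit smooths
    # the grid noise that makes a one-step finite difference unstable.
    t = np.asarray(times[-window:])
    s = np.asarray(positions[-window:])
    slope, _ = np.polyfit(t, s, 1)
    return slope
```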

Keywords: filtration gas combustion, low-velocity regime, mixed finite element method, numerical simulation

Procedia PDF Downloads 304
2200 The Location-Routing Problem with Pickup Facilities and Heterogeneous Demand: Formulation and Heuristics Approach

Authors: Mao Zhaofang, Xu Yida, Fang Kan, Fu Enyuan, Zhao Zhao

Abstract:

Nowadays, last-mile distribution plays an increasingly important role in the industrial delivery chain and accounts for a large proportion of total distribution cost. Upgrading logistics networks and improving the layout of final distribution points has become one of the trends in the development of modern logistics. Because customer demand is discrete and heterogeneous in both its needs and its spatial distribution, which leads to higher delivery failure rates and lower vehicle utilization, last-mile delivery has become a time-consuming and uncertain process. As a result, courier companies have introduced a range of innovative parcel storage facilities, including pick-up points and lockers. Their introduction has not only improved the user experience but has also helped logistics and courier companies achieve economies of scale. Against the backdrop of the COVID-19 pandemic, contactless delivery became a new hotspot, creating further opportunities for the development of collection services. A key issue for logistics companies is therefore how to design or redesign their last-mile distribution networks as integrated systems that incorporate pick-up points and lockers. This paper focuses on the introduction of self-pickup facilities in new logistics and distribution scenarios with heterogeneous customer demand. We consider two types of demand, ordinary products and refrigerated products, together with the corresponding transportation vehicles and the constraints associated with self-pickup points and lockers, and we address the location-routing problem with self-pickup facilities and heterogeneous demands (LRP-PFHD). To solve this challenging problem, we propose a mixed integer linear programming (MILP) model that minimizes the total cost, comprising the facility opening cost, the variable transport cost, and the fixed transport cost. Given the NP-hardness of the problem, we propose a hybrid adaptive large-neighbourhood search algorithm to solve LRP-PFHD. We evaluate the effectiveness and efficiency of the proposed algorithm on instances generated from benchmark instances. The results demonstrate that the hybrid adaptive large-neighbourhood search algorithm is more efficient than MILP solvers such as Gurobi for LRP-PFHD, especially on large-scale instances. In addition, we provide a comprehensive analysis of important parameters (e.g., facility opening cost and transportation cost) to explore their impact on the results and offer managerial insights for courier companies.
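As an illustration of the adaptive large-neighbourhood search idea, and not of the paper's specific hybrid algorithm, the sketch below shows the standard skeleton in Python: destroy and repair operators chosen with adaptively updated weights, plus a simulated-annealing acceptance rule. The operator interfaces, reward values, and cooling schedule are hypothetical placeholders.

```python
import math
import random

def alns(initial, cost, destroyers, repairers,
         iters=10_000, T0=100.0, alpha=0.999):
    # Generic ALNS skeleton: `destroyers` remove part of a solution,
    # `repairers` rebuild it; operators that produce accepted or improving
    # solutions are rewarded and hence selected more often.
    best = cur = initial
    w_d = [1.0] * len(destroyers)
    w_r = [1.0] * len(repairers)
    T = T0
    for _ in range(iters):
        i = random.choices(range(len(destroyers)), weights=w_d)[0]
        j = random.choices(range(len(repairers)), weights=w_r)[0]
        cand = repairers[j](destroyers[i](cur))
        delta = cost(cand) - cost(cur)
        if delta < 0 or random.random() < math.exp(-delta / T):
            cur = cand
            reward = 3.0 if cost(cur) < cost(best) else 1.0
            w_d[i] += reward          # adaptive weight update
            w_r[j] += reward
            if cost(cur) < cost(best):
                best = cur
        T *= alpha                    # cool the acceptance criterion
    return best
```

A hybrid variant along the lines described in the abstract would plug problem-specific moves into this loop, for example closing a pick-up facility and reinserting its customers into vehicle routes.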

Keywords: city logistics, last-mile delivery, location-routing, adaptive large neighborhood search

Procedia PDF Downloads 88