Search results for: fast Fourier algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4632

1392 Border Trade Policy to Promote Thailand-Myanmar Trade: Mae Sai, Chiang Rai Province

Authors: Sakapas Saengchai, Pichamon Chansuchai

Abstract:

This research examines the Thai-Myanmar border trade promotion policy in Mae Sai district, Chiang Rai province. Its objectives were to study that policy and to identify suitable models for the development of border trade in Mae Sai. The research uses a qualitative methodology: data were collected from research papers, participatory observation, and in-depth interviews with key informants selected by purposive sampling, including the governor of Chiang Rai, executives of the Chiang Rai Customs Office, the Mae Sai Immigration Bureau, the Mae Sai Chamber of Commerce, and private entrepreneurs. Data were analysed using content analysis. The study indicates that the government's border trade promotion policy focuses on three areas of development. 1. Security: further reducing crime, smuggling, and human trafficking, preparing to protect people from terrorism and natural disasters, and cooperating with Myanmar on border security. 2. Wealth: promoting investment; linking transport, logistics, and value chains for products and services across the Thai-Myanmar border; and improving regulations and laws to make trade fair, convenient, and fast. 3. Sustainable development: enabling the incomes and quality of life of people on the Thai border to rise continuously, with balanced use of natural resources and environmentally friendly production and consumption. Achieving this requires the participation of all public and private sectors in the region to drive the development of Thailand's border in Chiang Rai province and make it more competitive.

Keywords: Border, Trade, Policy, Promote

Procedia PDF Downloads 177
1391 Process Optimization and Automation of Information Technology Services in a Heterogenic Digital Environment

Authors: Tasneem Halawani, Yamen Khateeb

Abstract:

With customers’ ever-increasing expectations for fast services provisioning for all their business needs, information technology (IT) organizations, as business partners, have to cope with this demanding environment and deliver their services in the most effective and efficient way. The purpose of this paper is to identify optimization and automation opportunities for the top requested IT services in a heterogenic digital environment and widely spread customer base. In collaboration with systems, processes, and subject matter experts (SMEs), the processes in scope were approached by analyzing four-year related historical data, identifying and surveying stakeholders, modeling the as-is processes, and studying systems integration/automation capabilities. This effort resulted in identifying several pain areas, including standardization, unnecessary customer and IT involvement, manual steps, systems integration, and performance measurement. These pain areas were addressed by standardizing the top five requested IT services, eliminating/automating 43 steps, and utilizing a single platform for end-to-end process execution. In conclusion, the optimization of IT service request processes in a heterogenic digital environment and widely spread customer base is challenging, yet achievable without compromising the service quality and customers’ added value. Further studies can focus on measuring the value of the eliminated/automated process steps to quantify the enhancement impact. Moreover, a similar approach can be utilized to optimize other IT service requests, with a focus on business criticality.

Keywords: automation, customer value, heterogenic, integration, IT services, optimization, processes

Procedia PDF Downloads 111
1390 Traffic Forecasting for Open Radio Access Networks Virtualized Network Functions in 5G Networks

Authors: Khalid Ali, Manar Jammal

Abstract:

In order to meet the stringent latency and reliability requirements of the upcoming 5G networks, Open Radio Access Networks (O-RAN) have been proposed. The virtualization of O-RAN has allowed it to be treated as a Network Function Virtualization (NFV) architecture, while its components are considered Virtualized Network Functions (VNFs). Hence, intelligent Machine Learning (ML) based solutions can be utilized to apply different resource management and allocation techniques on O-RAN. However, intelligently allocating resources for O-RAN VNFs can prove challenging due to the dynamicity of traffic in mobile networks. Network providers need to dynamically scale the allocated resources in response to the incoming traffic. Elastically allocating resources can provide a higher level of flexibility in the network, in addition to reducing the OPerational EXpenditure (OPEX) and increasing resource utilization. Most existing elastic solutions are reactive in nature, even though proactive approaches are more agile, since they scale instances ahead of time by predicting the incoming traffic. In this work, we propose and evaluate traffic forecasting models based on ML algorithms that predict future O-RAN traffic from previous traffic data. A detailed analysis of the traffic data was carried out to validate the quality and applicability of the dataset. Two ML models were then proposed and evaluated based on their prediction capabilities.
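The abstract does not give its model details here, but the proactive-scaling idea can be sketched with a toy autoregressive forecaster, a pure-Python stand-in for the ARIMA/LSTM models named in the keywords; the traffic series below is invented for illustration.

```python
# Toy autoregressive forecaster: fit y[t] = a*y[t-1] + b by least squares,
# then roll the model forward so capacity can be scaled before the
# traffic actually arrives (proactive rather than reactive elasticity).

def fit_ar1(series):
    """Least-squares fit of y[t] = a * y[t-1] + b."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

def forecast(series, steps, a, b):
    """Iterate the fitted recurrence to predict the next `steps` values."""
    preds, last = [], series[-1]
    for _ in range(steps):
        last = a * last + b
        preds.append(last)
    return preds

if __name__ == "__main__":
    traffic = [100, 110, 121, 133.1, 146.41]   # ~10% growth per interval
    a, b = fit_ar1(traffic)
    print([round(p, 1) for p in forecast(traffic, 2, a, b)])
```

A real deployment would fit on measured per-VNF traffic and scale instances whenever the forecast crosses a capacity threshold.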

Keywords: O-RAN, traffic forecasting, NFV, ARIMA, LSTM, elasticity

Procedia PDF Downloads 236
1389 Evolving Convolutional Filter Using Genetic Algorithm for Image Classification

Authors: Rujia Chen, Ajit Narayanan

Abstract:

Convolutional neural networks (CNN), as typically applied in deep learning, use layer-wise backpropagation (BP) to construct filters and kernels for feature extraction. Such filters are 2D or 3D groups of weights for constructing feature maps at subsequent layers of the CNN and are shared across the entire input. BP, as a gradient descent algorithm, has well-known problems of getting stuck at local optima. The use of genetic algorithms (GAs) for evolving weights between layers of standard artificial neural networks (ANNs) is a well-established area of neuroevolution; in particular, the use of crossover techniques when optimizing weights can help to overcome problems of local optima. However, the application of GAs to evolving the weights of filters and kernels in CNNs is not yet an established area of neuroevolution. In this paper, a GA-based filter development algorithm is proposed. The results of the proof-of-concept experiments described in this paper show that the proposed GA can find filter weights through evolutionary techniques rather than BP learning. For some simple classification tasks, such as geometric shape recognition, the proposed algorithm achieves 100% accuracy. The results for MNIST classification, while not as good as those obtainable through standard BP filter learning, show that filter and kernel evolution warrants further investigation as a new subarea of neuroevolution for deep architectures.
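As an illustration of the idea, the following minimal sketch evolves a flattened 3x3 filter with selection, uniform crossover, and Gaussian mutation. The fitness function (closeness to a Sobel-like target kernel) and all GA parameters are invented for this sketch; they are not the authors' setup.

```python
import random

# Minimal GA that evolves a flattened 3x3 convolution filter through
# selection, uniform crossover, and point mutation instead of BP.

TARGET = [-1, 0, 1, -2, 0, 2, -1, 0, 1]   # Sobel-x kernel, flattened

def fitness(filt):
    # Negative L1 distance to the target kernel: higher is better.
    return -sum(abs(w - t) for w, t in zip(filt, TARGET))

def evolve(pop_size=30, generations=200, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-2.0, 2.0) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            p1, p2 = rng.sample(parents, 2)
            child = [w1 if rng.random() < 0.5 else w2   # uniform crossover
                     for w1, w2 in zip(p1, p2)]
            if rng.random() < 0.3:                      # point mutation
                child[rng.randrange(9)] += rng.gauss(0.0, 0.3)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print(round(fitness(best), 3))
```

Because the top half of each population is carried over unchanged, the best fitness never degrades across generations, one of the properties that makes such schemes usable without gradients.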

Keywords: neuroevolution, convolutional neural network, genetic algorithm, filters, kernels

Procedia PDF Downloads 192
1388 Radial Basis Surrogate Model Integrated to Evolutionary Algorithm for Solving Computation Intensive Black-Box Problems

Authors: Abdulbaset Saad, Adel Younis, Zuomin Dong

Abstract:

For design optimization with high-dimensional expensive problems, an effective and efficient optimization methodology is desired. This work proposes a series of modifications to the Differential Evolution (DE) algorithm for solving computation-intensive black-box problems. The proposed methodology, called Radial Basis Meta-Model Assisted Differential Evolution (RBF-DE), is a global optimization algorithm based on meta-modeling techniques. A meta-modeling-assisted DE is proposed to solve computationally expensive optimization problems. The Radial Basis Function (RBF) model is used as a surrogate to approximate the expensive objective function, while DE employs a mechanism to dynamically select the best-performing combination of parameters such as differential rate, crossover probability, and population size. The proposed algorithm is tested on benchmark functions and real-life practical applications. The test results demonstrate that the proposed algorithm is promising and performs well compared to other optimization algorithms, converging to acceptable and good solutions in terms of accuracy, number of evaluations, and time needed to converge.
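The DE machinery that surrogate-assisted methods like RBF-DE build on can be sketched as the classic DE/rand/1/bin loop. Here a cheap sphere function stands in for the expensive objective, and a comment marks where a surrogate would pre-screen candidates; the structure and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import random

# DE/rand/1/bin sketch: mutation, binomial crossover, greedy selection.

def sphere(x):
    return sum(xi * xi for xi in x)

def differential_evolution(obj, dim=3, pop_size=20, gens=100,
                           f=0.8, cr=0.9, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.0, 5.0) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            # DE/rand/1 mutation combined with binomial crossover.
            trial = [a[k] + f * (b[k] - c[k]) if rng.random() < cr
                     else pop[i][k] for k in range(dim)]
            # (An RBF surrogate would estimate obj(trial) here and skip
            #  unpromising candidates before any expensive evaluation.)
            if obj(trial) <= obj(pop[i]):     # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=obj)

if __name__ == "__main__":
    best = differential_evolution(sphere)
    print(round(sphere(best), 8))
```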

Keywords: differential evolution, engineering design, expensive computations, meta-modeling, radial basis function, optimization

Procedia PDF Downloads 402
1387 Nano-Sized Iron Oxides/ZnMe Layered Double Hydroxides as Highly Efficient Fenton-Like Catalysts for Degrading Specific Pharmaceutical Agents

Authors: Marius Sebastian Secula, Mihaela Darie, Gabriela Carja

Abstract:

Persistent organic pollutants discharged by various industries or urban regions into aquatic ecosystems represent a serious threat to fauna and human health. Endocrine disrupting compounds are known to have toxic effects even at very low concentrations. The anti-inflammatory agent ibuprofen is an endocrine disrupting compound and is considered the model pollutant in the present study. The use of light energy to meet the latest requirements concerning wastewater discharge demands highly performant and robust photo-catalysts, and many efforts have been made to obtain efficient photo-responsive materials. Among the promising photo-catalysts, layered double hydroxides (LDHs) have attracted significant consideration, especially due to their composition flexibility, high surface area, and tailored redox features. This work presents Fe(II) self-supported on ZnMeLDHs (Me = Al3+, Fe3+) as novel efficient photo-catalysts for Fenton-like catalysis. The co-precipitation method was used to prepare ZnAlLDH, ZnFeAlLDH, and ZnCrLDH (Zn2+/Me3+ = 2 molar ratio). Fe(II) was self-supported on the LDH matrices by the reconstruction method, at two different weight concentrations. X-ray diffraction (XRD), thermogravimetric analysis (TG/DTG), Fourier transform infrared spectroscopy (FTIR), and transmission electron microscopy (TEM) were used to investigate the structural, textural, and micromorphological features of the catalysts. The Fe(II)/ZnMeLDHs nano-hybrids were tested for the degradation of a model pharmaceutical agent, the anti-inflammatory ibuprofen, by photocatalysis and photo-Fenton catalysis, respectively. The results point out that embedding Fe(II) into ZnFeAlLDH and ZnCrLDH leads to a slight enhancement of ibuprofen degradation under light irradiation, whereas in the case of ZnAlLDH the degradation is relatively low. A remarkable enhancement of ibuprofen degradation was found for Fe(II)/ZnMeLDHs in the photo-Fenton process.
Acknowledgements: This work was supported by a grant of the Romanian National Authority for Scientific Research and Innovation, CNCS - UEFISCDI, project number PN-II-RU-TE-2014-4-0405.

Keywords: layered double hydroxide, heterogeneous Fenton, micropollutant, photocatalysis

Procedia PDF Downloads 299
1386 Performance Optimization on Waiting Time Using Queuing Theory in an Advanced Manufacturing Environment: Robotics to Enhance Productivity

Authors: Ganiyat Soliu, Glen Bright, Chiemela Onunka

Abstract:

Performance optimization plays a key role in controlling waiting time during manufacturing in an advanced manufacturing environment and thereby improving productivity. Queuing theory was used to examine the performance of a multi-stage production line. Robotics, as a disruptive technology, was implemented in a virtual manufacturing scenario during the packaging process to study the effect of waiting time on productivity. The queuing model was used to determine the optimum service rate required by robots during the packaging stage of manufacturing to yield an optimum production cost. Different production rates were assumed in the virtual manufacturing environment, and the cost of packaging was estimated together with the optimum production cost. An equation was generated using queuing theory, and the Newton-Raphson method was adopted for the analysis of the scenario. The queuing analysis presented here adequately determines the number of robots required to regulate waiting time and thereby increase output. The arrival rate of the product was fast, which shows that the queuing model was effective in minimizing service cost and waiting time during manufacturing. At a reduced waiting time, the number of products obtained per hour improved. Overall productivity improved based on the assumptions used in the queuing model implemented in the virtual manufacturing scenario.
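A concrete instance of this kind of queuing calculation is the M/M/c (Erlang C) mean waiting time, which lets one compare how many packaging robots keep the queue delay acceptable; the arrival and service rates below are hypothetical, not the paper's data.

```python
from math import factorial

# Erlang C sketch: mean waiting time Wq in an M/M/c queue, where c robots
# each serve at rate mu and jobs arrive at rate lambda.

def erlang_c(c, lam, mu):
    """Probability that an arriving job has to wait in an M/M/c queue."""
    a = lam / mu                   # offered load in Erlangs
    rho = a / c                    # per-server utilization
    assert rho < 1, "queue must be stable (lambda < c*mu)"
    tail = a ** c / (factorial(c) * (1 - rho))
    return tail / (sum(a ** k / factorial(k) for k in range(c)) + tail)

def mean_wait(c, lam, mu):
    """Wq = C(c, a) / (c*mu - lambda)."""
    return erlang_c(c, lam, mu) / (c * mu - lam)

if __name__ == "__main__":
    lam, mu = 8.0, 3.0             # jobs/hour in, jobs/hour per robot
    for c in (3, 4, 5):
        print(c, round(mean_wait(c, lam, mu), 4))
```

Sweeping c and picking the smallest value whose Wq meets the target delay is the essence of the robot-sizing question the abstract describes.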

Keywords: performance optimization, productivity, queuing theory, robotics

Procedia PDF Downloads 156
1385 Comparison Study of Machine Learning Classifiers for Speech Emotion Recognition

Authors: Aishwarya Ravindra Fursule, Shruti Kshirsagar

Abstract:

In the intersection of artificial intelligence and human-centered computing, this paper delves into speech emotion recognition (SER). It presents a comparative analysis of machine learning models such as K-Nearest Neighbors (KNN), logistic regression, support vector machines (SVM), decision trees, ensemble classifiers, and random forests, applied to SER. The research employs four datasets: CREMA-D, SAVEE, TESS, and RAVDESS. It focuses on extracting salient audio signal features like zero crossing rate (ZCR), chroma_stft, Mel frequency cepstral coefficients (MFCC), root mean square (RMS) value, and Mel spectrogram. These features are used to train and evaluate the models' ability to recognize eight types of emotion from speech: happy, sad, neutral, angry, calm, disgust, fear, and surprise. Among the models, the random forest algorithm demonstrated superior performance, achieving approximately 79% accuracy, which suggests its suitability for SER within the parameters of this study. The research contributes to SER by showcasing the effectiveness of various machine learning algorithms and feature extraction techniques, and the findings hold promise for the development of more precise emotion recognition systems.
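Two of the listed features are simple enough to compute directly; the sketch below shows ZCR and RMS on a synthetic frame (real SER pipelines typically rely on a library such as librosa for MFCCs, chroma, and Mel spectrograms).

```python
import math

# Direct computation of two frame-level features named in the abstract:
# zero crossing rate (ZCR) and root mean square (RMS) energy.

def zero_crossing_rate(frame):
    """Fraction of consecutive sample pairs whose signs differ."""
    crossings = sum(1 for a, b in zip(frame, frame[1:])
                    if (a >= 0) != (b >= 0))
    return crossings / (len(frame) - 1)

def rms(frame):
    """Root mean square amplitude of the frame."""
    return math.sqrt(sum(x * x for x in frame) / len(frame))

if __name__ == "__main__":
    # One period of a sine wave sampled at 8 points.
    frame = [math.sin(2 * math.pi * k / 8) for k in range(8)]
    print(round(zero_crossing_rate(frame), 3), round(rms(frame), 4))
```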

Keywords: comparison, ML classifiers, KNN, decision tree, SVM, random forest, logistic regression, ensemble classifiers

Procedia PDF Downloads 48
1384 Amperometric Biosensor for Glucose Determination Based on a Recombinant Mn Peroxidase from Corn Cross-linked to a Gold Electrode

Authors: Anahita Izadyar, My Ni Van, Kayleigh Amber Rodriguez, Ilwoo Seok, Elizabeth E. Hood

Abstract:

Using a recombinant enzyme derived from corn and a simple modification, we fabricated a facile, fast, and cost-beneficial biosensor to measure glucose. The Nafion/plant-produced Mn peroxidase (PPMP)-glucose oxidase (GOx)-bovine serum albumin (BSA)/Au electrode showed an excellent amperometric response to glucose. The biosensor responds over a wide glucose range, 20.0 µM to 15.0 mM, with a limit of detection (LOD) of 2.90 µM. The reproducibility of the response, assessed using six electrodes, is also very good and indicates the capability of this biosensor across a wide concentration range of 3.10 ± 0.19 µM to 13.2 ± 1.8 mM glucose. The selectivity of the electrode was investigated in an optimized experimental solution containing 10% diet green tea with citrus, which contains ascorbic acid (AA) and citric acid (CA), over a wide glucose concentration range of 0.02 to 14.0 mM, with an LOD of 3.10 µM. Reproducibility in this sample was also investigated using four electrodes and shows notable results over the wide concentration range of 3.35 ± 0.45 µM to 13.0 ± 0.81 mM. Other voltammetry methods were used to evaluate the biosensor as well: linear sweep voltammetry (LSV) showed a wide linear range of 0.10 to 15.0 mM for glucose detection, with an LOD of 19.5 µM. The strengths of this enzyme biosensor are its simplicity, wide linear ranges, sensitivity, selectivity, and low limits of detection. We expect the modified biosensor to have potential for monitoring various biofluids.

Keywords: plant-produced manganese peroxidase, enzyme-based biosensors, glucose, modified gold electrode, glucose oxidase

Procedia PDF Downloads 145
1383 Eugenol Effects on Metabolic Syndrome Induced Liver Damages

Authors: Fatemeh Kourkinejad Gharaei, Tahereh Safari, Zahra Saebinasab

Abstract:

Metabolic syndrome (MetS) is a set of risk factors associated with cardiovascular diseases, atherosclerosis, and type 2 diabetes. Nonalcoholic fatty liver disease (NAFLD) is the most important liver disorder in metabolic syndrome, and high fructose consumption increases the risk of NAFLD. Eugenol shows anti-thrombotic, insulin-sensitizing, and fat-reducing effects. This study was designed to investigate the protective role of eugenol in NAFLD caused by metabolic syndrome. Methods: Thirty male Wistar rats were randomly divided into five groups: group 1 received drinking water; group 2, fructose; group 3, fructose + eugenol solvent; group 4, fructose + eugenol 50 mg/kg; and group 5, fructose + eugenol 100 mg/kg. At the end of the experiment, after 12 hours of fasting and under anesthesia, blood samples were taken for measurement of fasting blood glucose (FBG), SGOT, SGPT, LDL, HDL, cholesterol, and triglycerides. Results: FBG significantly increased in group 2 compared to group 1 (p < 0.001); however, it significantly decreased in groups 4 and 5 compared to group 2 (p < 0.05). SGOT and SGPT levels significantly increased in group 2 compared to group 1 (p < 0.001) but significantly decreased in groups 4 and 5. MDA and LTDS significantly increased in group 2 compared with group 1 (p < 0.01), while they decreased in groups 4 and 5 compared to group 2 (p < 0.05), which confirms the pathology results related to the liver damage. Conclusion: Eugenol has protective effects on the liver and against fat accumulation in liver cells.

Keywords: eugenol, fructose, metabolic syndrome, nonalcoholic fatty liver disease

Procedia PDF Downloads 128
1382 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution

Authors: Najrullah Khan, Athar Ali Khan

Abstract:

The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone generalized exponential (TPGE) distribution. A real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS code has been provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data, with emphasis on the modeling of data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete, or 'censored', data encountered. Analytic approximation and simulation tools are both covered, but most of the emphasis is on Markov chain Monte Carlo methods, including the independence Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among the various optimization algorithms, the trust region method is found to be the best. The TPGE model is also used to analyze the lifetime data in the Bayesian paradigm, with results evaluated on the above-mentioned real survival data set. The analytic approximation and simulation methods are implemented using several software packages. It is clear from our findings that simulation tools provide better results than those obtained by asymptotic approximation.
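The independence Metropolis algorithm mentioned above can be sketched in a few lines. Here the target is a standard normal and the proposal a wider zero-mean normal, purely for illustration; the paper's actual posterior is the TPGE survival model fitted in JAGS.

```python
import math
import random

# Independence Metropolis: candidates are drawn from a fixed proposal
# q (not from the current state), and the acceptance ratio weights the
# target density against q.

def target_logpdf(x):
    return -0.5 * x * x             # N(0, 1) up to an additive constant

def proposal_logpdf(x, scale):
    return -0.5 * (x / scale) ** 2  # N(0, scale^2) up to a constant

def independence_metropolis(n, scale=3.0, seed=42):
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n):
        cand = rng.gauss(0.0, scale)
        log_alpha = (target_logpdf(cand) - target_logpdf(x)
                     + proposal_logpdf(x, scale)
                     - proposal_logpdf(cand, scale))
        if math.log(rng.random() + 1e-300) < log_alpha:
            x = cand
        samples.append(x)
    return samples

if __name__ == "__main__":
    draws = independence_metropolis(20000)
    print(round(sum(draws) / len(draws), 3))
```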

Keywords: Bayesian Inference, JAGS, Laplace Approximation, LaplacesDemon, posterior, R Software, simulation

Procedia PDF Downloads 538
1381 Error Detection and Correction for Onboard Satellite Computers Using Hamming Code

Authors: Rafsan Al Mamun, Md. Motaharul Islam, Rabana Tajrin, Nabiha Noor, Shafinaz Qader

Abstract:

In an attempt to enrich the lives of billions of people by providing proper information, security, and a way of communicating with others, the need for efficient and improved satellites is constantly growing. Thus, there is an increasing demand for better error detection and correction (EDAC) schemes capable of protecting the data onboard satellites. This paper is aimed at detecting and correcting such errors using the Hamming Code, which uses the concept of parity and parity bits to prevent single-bit errors onboard a satellite in Low Earth Orbit. The paper focuses on the study of Low Earth Orbit satellites and the process of generating the Hamming Code matrix to be used for EDAC using computer programs. The most effective version of the Hamming Code generated was the Hamming (16, 11, 4) version using MATLAB, and the paper compares this particular scheme with other EDAC mechanisms, including other versions of Hamming Codes and Cyclic Redundancy Check (CRC), and discusses its limitations. This version of the Hamming Code guarantees single-bit error correction as well as double-bit error detection. Furthermore, it has proved to be fast, with a checking time of 5.669 nanoseconds, has a relatively higher code rate and lower bit overhead than the other versions, and can detect a greater percentage of errors per length of code than other EDAC schemes with similar capabilities. In conclusion, with proper implementation of the system, it is quite possible to ensure a relatively uncorrupted satellite storage system.
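The parity-bit mechanism is easiest to see on the small Hamming(7,4) code, which the paper's Hamming(16, 11, 4) scheme generalizes; this illustrative sketch (not the authors' MATLAB implementation) encodes 4 data bits, injects a single bit flip, and corrects it via the syndrome.

```python
# Hamming(7,4): 4 data bits protected by 3 parity bits. The syndrome of
# the received word directly gives the 1-based position of a flipped bit.

def encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4              # covers codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4              # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4              # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]   # positions 1..7

def correct(code):
    c = code[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3       # 1-based index of the flipped bit
    if syndrome:
        c[syndrome - 1] ^= 1              # single-bit correction
    return [c[2], c[4], c[5], c[6]]       # recovered data bits

if __name__ == "__main__":
    word = [1, 0, 1, 1]
    received = encode(word)
    received[4] ^= 1                      # inject a single-event upset
    print(correct(received) == word)
```

Adding one overall parity bit to this construction yields an extended code that, like the paper's scheme, also detects (but cannot correct) double-bit errors.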

Keywords: bit-flips, Hamming code, low earth orbit, parity bits, satellite, single error upset

Procedia PDF Downloads 133
1380 Three Tier Indoor Localization System for Digital Forensics

Authors: Dennis L. Owuor, Okuthe P. Kogeda, Johnson I. Agbinya

Abstract:

Mobile localization has attracted a great deal of attention recently due to the introduction of wireless networks. Although several localization algorithms and systems have been implemented and discussed in the literature, very few researchers have exploited the gap that exists between indoor localization, tracking, external storage of location information, and outdoor localization for the purpose of digital forensics during and after a disaster. The contribution of this paper lies in the implementation of a robust system that is capable of locating and tracking mobile device users and storing location information in the cloud for both indoor and partially outdoor environments. The system can be used during a disaster to track and locate mobile phone users. The developed system is a mobile application built on Android, Hypertext Preprocessor (PHP), Cascading Style Sheets (CSS), JavaScript, and MATLAB for Android mobile users. Using the Waterfall model of software development, we have implemented a three-level system that is able to track, locate, and store mobile device information in a secure cloud database on an almost real-time basis. The outcome of the study showed that the developed system is efficient with regard to tracking and locating mobile devices. The system is also flexible, i.e., it can be used in any building with few adjustments. Finally, the system is accurate for both indoor and outdoor locating and tracking of mobile devices.
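The 'fingerprinting' keyword suggests the standard RSSI-fingerprint matching step, which can be sketched as a k-nearest-neighbour lookup in signal space; the survey database, coordinates, and readings below are invented, not taken from the deployed system.

```python
# Minimal RSSI-fingerprinting lookup: match a live reading against a
# database of surveyed reference points and return the centroid of the
# k closest fingerprints.

FINGERPRINTS = {                  # (x, y) in metres -> RSSI (dBm) per AP
    (0, 0): [-40, -70, -80],
    (0, 5): [-55, -60, -75],
    (5, 0): [-60, -75, -55],
    (5, 5): [-70, -62, -50],
}

def locate(reading, k=2):
    """Centroid of the k reference points closest in signal space."""
    ranked = sorted(
        (sum((a - b) ** 2 for a, b in zip(rssi, reading)), pos)
        for pos, rssi in FINGERPRINTS.items()
    )
    nearest = [pos for _, pos in ranked[:k]]
    return (sum(p[0] for p in nearest) / k,
            sum(p[1] for p in nearest) / k)

if __name__ == "__main__":
    print(locate([-42, -68, -79], k=1))   # reading taken near (0, 0)
```

Re-surveying the fingerprint database is the main "adjustment" needed to move such a system to a new building, which is why fingerprint approaches are considered flexible.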

Keywords: indoor localization, digital forensics, fingerprinting, tracking and cloud

Procedia PDF Downloads 343
1379 A Study on the Importance and Contributions of Transforming from OEM to ODM in Malaysian Furniture Industry

Authors: Nurul Ain Haron, Saiful Hazmi Bachek, Hafez Zainudin

Abstract:

This study aims to establish the importance and contribution of Original Design Manufacturing (ODM) in the Malaysian furniture industry and to close the gap between the players in the industry. The study confirms that today's furniture industry favors an Original Equipment Manufacturing (OEM) basis, leaving the local industry so lacking in furniture design expertise that it cannot compete. The study's methods consist of literature review, observation, a questionnaire, and interview sessions. The results show that the public has minimal to almost no knowledge of the ODM concept and its impact on the industry's future. Manufacturers, however, prefer the OEM concept due to its easy and fast investment returns without the need for a product design process, while interviews with the authorized council revealed convincing commitments to promoting awareness through training and seminars. The findings show that the push for ODM status requires extensive cooperation from many parties: the authorized council, furniture manufacturers, designers, and also the public, whose perception of locally made goods as inferior must change. The current mindset of OEM manufacturers needs to change for the industry's future. As Malaysia's standard of living constantly increases, so will the demand for better salaries. If these issues are not resolved, international OEM buyers will have to shift to other lower-cost manufacturers such as China or Taiwan, and when vendors stop ordering, today's OEM manufacturers will no longer exist.

Keywords: design manufacturing, furniture design, original design manufacturing, original equipment manufacturing

Procedia PDF Downloads 449
1378 A Review on Cloud Computing and Internet of Things

Authors: Sahar S. Tabrizi, Dogan Ibrahim

Abstract:

Cloud Computing is a convenient model for on-demand networks that uses shared pools of virtual configurable computing resources, such as servers, networks, storage devices, applications, etc. The cloud serves as an environment for companies and organizations to use infrastructure resources without making any purchases and they can access such resources wherever and whenever they need. Cloud computing is useful to overcome a number of problems in various Information Technology (IT) domains such as Geographical Information Systems (GIS), Scientific Research, e-Governance Systems, Decision Support Systems, ERP, Web Application Development, Mobile Technology, etc. Companies can use Cloud Computing services to store large amounts of data that can be accessed from anywhere on Earth and also at any time. Such services are rented by the client companies where the actual rent depends upon the amount of data stored on the cloud and also the amount of processing power used in a given time period. The resources offered by the cloud service companies are flexible in the sense that the user companies can increase or decrease their storage requirements or the processing power requirements at any time, thus minimizing the overall rental cost of the service they receive. In addition, the Cloud Computing service providers offer fast processors and applications software that can be shared by their clients. This is especially important for small companies with limited budgets which cannot afford to purchase their own expensive hardware and software. This paper is an overview of the Cloud Computing, giving its types, principles, advantages, and disadvantages. In addition, the paper gives some example engineering applications of Cloud Computing and makes suggestions for possible future applications in the field of engineering.

Keywords: cloud computing, cloud systems, cloud services, IaaS, PaaS, SaaS

Procedia PDF Downloads 235
1377 Machine Learning for Targeting of Conditional Cash Transfers: Improving the Effectiveness of Proxy Means Tests to Identify Future School Dropouts and the Poor

Authors: Cristian Crespo

Abstract:

Conditional cash transfers (CCTs) have been targeted towards the poor, so their targeting assessments check whether these schemes have been allocated to low-income households or individuals. However, CCTs have more than one goal and target group: an additional goal of CCTs is to increase school enrolment, so students at risk of dropping out of school are also a target group. This paper analyses whether one of the most common targeting mechanisms of CCTs, a proxy means test (PMT), is suitable to identify both the poor and future school dropouts. The PMT is compared with alternative approaches that use the outputs of a predictive model of school dropout. This model was built using machine learning algorithms and rich administrative datasets from Chile. The paper shows that using machine learning outputs in conjunction with the PMT increases targeting effectiveness by identifying more students who are either poor or future dropouts. This joint targeting approach increases effectiveness in different scenarios except when the social valuation of the two target groups largely differs. In these cases, the most likely optimal approach is to solely adopt the targeting mechanism designed to find the highly valued group.
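The joint targeting idea can be illustrated as a simple union of the two eligibility criteria; all scores, cut-offs, and household records below are hypothetical, not the Chilean administrative data.

```python
# Toy joint targeting rule: a household is reached if it is PMT-eligible
# OR flagged by the dropout-prediction model.

def select(pmt_score, dropout_prob, pmt_cutoff=40.0, risk_cutoff=0.6):
    """Eligible if PMT-poor or predicted likely dropout."""
    return pmt_score <= pmt_cutoff or dropout_prob >= risk_cutoff

def coverage(rows, rule):
    """Share of households in either target group that the rule reaches."""
    targets = [r for r in rows if r["poor"] or r["dropout"]]
    hits = [r for r in targets if rule(r["pmt"], r["risk"])]
    return len(hits) / len(targets)

if __name__ == "__main__":
    rows = [
        {"pmt": 35.0, "risk": 0.1, "poor": True,  "dropout": False},
        {"pmt": 55.0, "risk": 0.7, "poor": False, "dropout": True},
        {"pmt": 45.0, "risk": 0.2, "poor": True,  "dropout": False},
    ]
    # The second household would be missed by the PMT alone.
    print(coverage(rows, select))
```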

Keywords: conditional cash transfers, machine learning, poverty, proxy means tests, school dropout prediction, targeting

Procedia PDF Downloads 206
1376 The Impact of Technology on Media Content Regulation

Authors: Eugene Mashapa

Abstract:

The age of information has witnessed countless unprecedented technological developments, which demand regulatory capabilities that can match these cutting-edge trends. These changes have impacted patterns in the production, distribution, and consumption of media content, a space that the Film and Publication Board (FPB) is concerned with. Consequently, the FPB is keen to understand the nature and impact of these technological changes on media content regulation. This exploratory study investigated how content regulators in high- and middle-income economies have adapted to the changes in this space, seeking insights into technological and operational innovations that facilitate continued relevance in this fast-changing environment. The study is aimed at developing recommendations that could assist and inform the organisation in regulating media content as it evolves. The overall research strategy is applied research, and the analytical model adopted is a mixed research design guided by both qualitative and quantitative instruments. The study revealed that the FPB was significantly impacted by the unprecedented technological advancements in the media regulation space. Additionally, the FPB needs to understand the current and future penetration of Fourth Industrial Revolution (4IR) technology in the industry and its impact on media governance and policy implementation. Responses will range from reskilling officials to align with the required technological skills, to developing technological innovations, to adopting co-regulatory or self-regulatory arrangements with content distributors, as more content is distributed in higher volumes and with increased frequency. Importantly, initiating an interactive learning process for both FPB employees and the general public can improve the FPB's operational efficiency and effectiveness.

Keywords: media, regulation, technology, film and publications board

Procedia PDF Downloads 116
1375 Pushing the Boundary of Parallel Tractability for Ontology Materialization via Boolean Circuits

Authors: Zhangquan Zhou, Guilin Qi

Abstract:

Materialization is an important reasoning service for applications built on the Web Ontology Language (OWL). To make materialization efficient in practice, current research focuses on deciding the tractability of an ontology language and designing parallel reasoning algorithms. However, some well-known large-scale ontologies, such as YAGO, have been shown to have good performance for parallel reasoning even though they are expressed in ontology languages that are not parallelly tractable, i.e., the reasoning is inherently sequential in the worst case. This motivates us to study the problem of parallel tractability of ontology materialization from a theoretical perspective. That is, we aim to identify the ontologies for which materialization is parallelly tractable, i.e., in the complexity class NC. Since NC is defined in terms of Boolean circuits, which are widely used to investigate parallel computing problems, we first transform the problem of materialization into the evaluation of Boolean circuits, and then study the problem of parallel tractability based on circuits. In this work, we focus on datalog rewritable ontology languages. We use Boolean circuits to identify two classes of datalog rewritable ontologies (called parallelly tractable classes) such that materialization over them is parallelly tractable. We further investigate the parallel tractability of materialization for DHL (Description Horn Logic), a datalog rewritable OWL fragment. Based on the above results, we analyze real-world datasets and show that many ontologies expressed in DHL belong to the parallelly tractable classes.
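The core reasoning task studied here, datalog-style materialization, can be sketched as a fixpoint computation. The rule and facts below are hypothetical illustrations, not taken from the paper, and the naive evaluation shown is the sequential baseline whose parallelizability the authors analyze.

```python
# Illustrative sketch (not the paper's algorithm): naive fixpoint
# materialization of datalog-style rules over a set of facts.

def materialize(facts, rules):
    """Apply rules until no new facts can be derived (naive evaluation)."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for rule in rules:
            for fact in rule(known):
                if fact not in known:
                    known.add(fact)
                    changed = True
    return known

# Hypothetical ontology fragment: subClassOf is transitive.
def transitivity(known):
    return {("subClassOf", a, c)
            for (p1, a, b) in known if p1 == "subClassOf"
            for (p2, b2, c) in known if p2 == "subClassOf" and b2 == b}

facts = {("subClassOf", "Dog", "Mammal"), ("subClassOf", "Mammal", "Animal")}
closure = materialize(facts, [transitivity])
# ("subClassOf", "Dog", "Animal") is derived in the closure
```

The sequential `while` loop is exactly what parallel tractability asks about: for parallelly tractable ontologies, the corresponding Boolean circuit has polylogarithmic depth, so the derivation does not have to proceed one round after another in the worst case.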

Keywords: ontology materialization, parallel reasoning, datalog, Boolean circuit

Procedia PDF Downloads 274
1374 Identification and Isolation of E. coli O157:H7 From Water and Wastewater of Shahrood and Neka Cities by PCR Technique

Authors: Aliasghar Golmohammadian, Sona Rostampour Yasouri

Abstract:

One of the most important intestinal pathogenic strains of E. coli is O157:H7. This pathogenic bacterium is transmitted to humans through water and food. E. coli O157:H7 is the main cause of hemorrhagic colitis (HC), hemolytic uremic syndrome (HUS), thrombotic thrombocytopenic purpura (TTP) and, in some cases, death. Since E. coli O157:H7 can be transmitted through the consumption of various foods, including vegetables, agricultural products, and fresh dairy products, this study aims to identify and isolate E. coli O157:H7 from water and wastewater by the PCR technique. One hundred twenty samples of water and wastewater were collected in sterile Falcon tubes from the cities of Shahrood and Neka. After appropriate centrifugation, the samples were cultivated on the selective medium Sorbitol MacConkey Agar (SMAC) and other diagnostic media for E. coli O157:H7 and checked for colony formation. The plates were also examined macroscopically and microscopically. The necessary phenotypic tests were then performed on the colonies and, finally, after DNA extraction, PCR was performed with specific primers for the rfbE and stx2 genes. Five samples (6%) of all those examined were determined positive by PCR, with the bands for the mentioned genes observed on agarose gel electrophoresis. PCR is a fast and accurate method for identifying E. coli O157:H7. Given that E. coli is a hardy bacterium that survives in water and food for weeks or months, the PCR technique enables rapid detection of contaminated water. Moreover, it helps communities control and prevent the transfer of the bacterium to clean and underground water and to agricultural and even dairy products.

Keywords: E. coli O157:H7, PCR, water, wastewater

Procedia PDF Downloads 69
1373 Teaching Tools for Web Processing Services

Authors: Rashid Javed, Hardy Lehmkuehler, Franz Josef-Behr

Abstract:

Web Processing Services (WPS) are of growing interest in geoinformation research. However, teaching about them is difficult because of the generally complex circumstances of their use, which limit the possibilities for hands-on exercises on Web Processing Services. To support understanding, a Training Tools Collection was developed at the University of Applied Sciences Stuttgart (HFT). It is limited to the scope of geostatistical interpolation of sample point data, where different algorithms such as IDW or Nearest Neighbour can be used. The Tools Collection aims to support understanding of the scope, definition, and deployment of Web Processing Services. For example, it is necessary to characterize the input of an interpolation by the data set, the parameters for the algorithm, and the interpolation results (here a grid of interpolated values is assumed). This paper reports on first experiences using a pilot installation, which was intended to find suitable software interfaces for later full implementations and to draw conclusions about potential user interface characteristics. Experiences were made with the Deegree software, one of several service suites. Written entirely in Java, Deegree offers several OGC-compliant service implementations that also promise to be of benefit for the project. The mentioned parameters for a WPS were formalized following the paradigm that any meaningful component should be defined in terms of suitable standards, e.g. the data output can be defined as a GML file. However, the choice of meaningful information pieces and user interactions is not free but is partially determined by the selected WPS processing suite.
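As a rough illustration of the interpolation the tools expose, a minimal Inverse Distance Weighting (IDW) sketch follows; the function and parameter names are ours, not the Deegree or WPS API.

```python
# Sketch of Inverse Distance Weighting (IDW), one of the interpolation
# algorithms offered by the teaching tools as a WPS process.
import math

def idw(samples, x, y, power=2.0):
    """Interpolate a value at (x, y) from (xi, yi, value) sample points."""
    num, den = 0.0, 0.0
    for sx, sy, val in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0.0:          # query point coincides with a sample
            return val
        w = 1.0 / d ** power  # closer points weigh more
        num += w * val
        den += w
    return num / den

points = [(0, 0, 10.0), (10, 0, 20.0), (0, 10, 30.0)]
print(idw(points, 5, 5))  # all three samples are equidistant here
```

In a WPS setting, `points` would arrive as the input data set (e.g. as GML), `power` as an algorithm parameter, and the function would be evaluated over every cell of the output grid.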

Keywords: deegree, interpolation, IDW, web processing service (WPS)

Procedia PDF Downloads 358
1372 Multi-Objective Evolutionary Computation Based Feature Selection Applied to Behaviour Assessment of Children

Authors: F. Jiménez, R. Jódar, M. Martín, G. Sánchez, G. Sciavicco

Abstract:

Attribute or feature selection is one of the basic strategies for improving the performance of data classification tasks and, at the same time, reducing the complexity of classifiers, and it is particularly fundamental when the number of attributes is relatively high. Its application to unsupervised classification is restricted to a limited number of experiments in the literature. Evolutionary computation has already proven to be a very effective choice for consistently reducing the number of attributes towards a better classification rate and a simpler semantic interpretation of the inferred classifiers. We present a feature selection wrapper model composed of a multi-objective evolutionary algorithm, the clustering method Expectation-Maximization (EM), and the classifier C4.5 for the unsupervised classification of data extracted from a psychological test named BASC-II (Behavior Assessment System for Children, 2nd ed.), with two objectives: maximizing the likelihood of the clustering model and maximizing the accuracy of the obtained classifier. We present a methodology to integrate feature selection for unsupervised classification, model evaluation, decision making (choosing the most satisfactory model according to an a posteriori process in a multi-objective context), and testing. We compare the performance of the classifiers obtained by the multi-objective evolutionary algorithms ENORA and NSGA-II, and the best solution is then validated by the psychologists who collected the data.

Keywords: evolutionary computation, feature selection, classification, clustering

Procedia PDF Downloads 375
1371 A Conceptual Approach for Evaluating the Urban Renewal Process

Authors: Muge Unal, Ahmet Cilek

Abstract:

Urban identity, with its dynamic spatial and semantic aspects, is an ever-changing phenomenon. Its formation involves not only physical processes but also development and change in political, economic, social, and cultural values, at both the national and international levels. Although urban transformation is basically regarded as spatial transformation, it in fact reveals a holistic perspective grounded in the dialectical relationship between the spatial and the social. For this reason, urban renewal needs to be addressed not only spatially but also in terms of the impact of spatial transformation on social, cultural, and economic life. The instruments through which urban transformation is realized go by varied names, such as urban renewal, urban resettlement, urban rehabilitation, urban redevelopment, and urban revitalization. The phenomenon of urban transformation begins with the Industrial Revolution. Until the 1980s, it was interpreted as a reconsideration of the physical fabric of the urban environment in response to rapid urbanization, changes in the spatial structure of the city, and the concentration of population in urban areas. After the 1980s, however, it came to require an integrated treatment of physical, economic, social, technological, and informational dimensions. Since entering the literature as a planning practice, urban transformation has remained current in its conceptual structure and content and has not stopped converting itself: it retains its simplest expression even as it rapidly transforms its contents. In this study, the relationship between urban design and the components of urban transformation is discussed, together with the strategies used throughout the historical process of urban transformation, alongside a general evaluation of the concept of urban renewal.

Keywords: conceptual approach, urban identity, urban regeneration, urban renewal

Procedia PDF Downloads 439
1370 Discriminant Analysis of Pacing Behavior on Mass Start Speed Skating

Authors: Feng Li, Qian Peng

Abstract:

The mass start speed skating (MSSS) was a new event at the 2018 PyeongChang Winter Olympics and will be an official race at the 2022 Beijing Winter Olympics. Considering that event rankings are based on points gained on laps, it is worthwhile to investigate the pacing behavior on each lap that directly influences the ranking of the race. The aim of this study was to detect pacing behavior and performance in MSSS with respect to skaters' level (SL), competition stage (semi-final/final) (CS), and gender (G). All the men's and women's races in the World Cup and World Championships in the 2018-2019 and 2019-2020 seasons were analyzed. In total, 601 skaters from 36 races were observed. ANOVA for repeated measures was applied to compare the pacing behavior on each lap, and three-way ANOVA for repeated measures was used to identify the influence of SL, CS, and G on pacing behavior and total time spent. In general, the results showed that the laps clustered from fast to slow as cluster 1—laps 4, 8, 12, 15, 16; cluster 2—laps 5, 9, 13, 14; cluster 3—laps 3, 6, 7, 10, 11; and cluster 4—laps 1 and 2 (p=0.000). For CS, the total time spent in the final was less than in the semi-final (p=0.000). For SL, top-level skaters spent less total time than middle-level and low-level skaters (p≤0.002), while there was no significant difference between the middle and low levels (p=0.214). For G, men's skaters spent less total time than women on all laps (p≤0.048). This study could help coaching staff better understand pacing behavior with respect to SL, CS, and G, and provides references for improving pacing strategy and decision-making before and during the race.

Keywords: performance analysis, pacing strategy, winning strategy, winter Olympics

Procedia PDF Downloads 196
1369 Time Series Forecasting (TSF) Using Various Deep Learning Models

Authors: Jimeng Shi, Mahek Jain, Giri Narasimhan

Abstract:

Time Series Forecasting (TSF) is used to predict the target variables at a future time point based on the learning from previous time points. To keep the problem tractable, learning methods use data from a fixed-length window in the past as an explicit input. In this paper, we study how the performance of predictive models changes as a function of different look-back window sizes and different amounts of time to predict into the future. We also consider the performance of the recent attention-based Transformer models, which have had good success in the image processing and natural language processing domains. In all, we compare four different deep learning methods (RNN, LSTM, GRU, and Transformer) along with a baseline method. The dataset (hourly) we used is the Beijing Air Quality Dataset from the UCI website, which includes a multivariate time series of many factors measured on an hourly basis for a period of 5 years (2010-14). For each model, we also report on the relationship between the performance and the look-back window sizes and the number of predicted time points into the future. Our experiments suggest that Transformer models have the best performance, with the lowest Mean Absolute Errors (MAE = 14.599, 23.273) and Root Mean Square Errors (RMSE = 23.573, 38.131) for most of our single-step and multi-step predictions. The best size for the look-back window to predict 1 hour into the future appears to be one day, while 2 or 4 days perform the best to predict 3 hours into the future.
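The fixed-length look-back windowing described above can be sketched as follows; the window size and horizon are illustrative, not the paper's exact configuration.

```python
# Sketch: turn a series into (window, target) pairs for supervised
# training, as in fixed-length look-back forecasting.
import numpy as np

def make_windows(series, look_back, horizon):
    """Return X of shape (n, look_back) and y of shape (n, horizon)."""
    X, y = [], []
    for i in range(len(series) - look_back - horizon + 1):
        X.append(series[i : i + look_back])                    # past window
        y.append(series[i + look_back : i + look_back + horizon])  # future targets
    return np.array(X), np.array(y)

hourly = np.arange(48.0)  # stand-in for two days of hourly air-quality data
X, y = make_windows(hourly, look_back=24, horizon=3)  # 1-day window, 3-step target
print(X.shape, y.shape)   # (22, 24) (22, 3)
```

Each model in the comparison then maps a row of `X` to a row of `y`; varying `look_back` and `horizon` reproduces the window-size study the paper reports.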

Keywords: air quality prediction, deep learning algorithms, time series forecasting, look-back window

Procedia PDF Downloads 159
1368 Photophysics and Rotational Relaxation Dynamics of 6-Methoxyquinoline Fluorophore in Cationic Alkyltrimethylammonium Bromide Micelles

Authors: Tej Varma Y, Debi D. Pant

Abstract:

Photophysics and rotational dynamics of the fluorescent probe 6-methoxyquinoline (6MQ) in cationic surfactant alkyltrimethylammonium bromide (nTAB) micelle solutions have been investigated (n = 12, 14 and 16). The absorption and emission peaks of the dye shift at concentrations around the critical micellar concentration (cmc) of nTAB compared to bulk solutions, suggesting the probe is in a less polar environment. The probe senses changes in polarity (ET(30)) brought about by variation of surfactant chain length and concentration and is invariably solubilized in the aqueous interface or palisade layer. The order of change in polarity observed was DTAB > CTAB > TTAB. The binding constant study shows that the probe binds most strongly with TTAB (the order is TTAB > CTAB > DTAB) due to deeper penetration into the micelle. The anisotropy decays of the probe in all the nTAB micelles studied have been rationalized based on a two-step model consisting of fast restricted rotation of the probe and slow lateral diffusion of the probe in the micelle, coupled to the overall rotation of the micelle. Fluorescence lifetime measurements of the probe in the cationic micelles demonstrate the close proximity of 6MQ to the Br⁻ counterions. The fluorescence lifetimes in TTAB and DTAB are much shorter than in CTAB. These results indicate that 6MQ resides to a substantial degree in the head group region of the micelles. All the changes observed in the steady-state fluorescence, microenvironment, fluorescence lifetimes, fluorescence anisotropy, and other calculations are in agreement with each other, suggesting binding of the cationic surfactant with the neutral dye molecule.

Keywords: photophysics, chain length, nTAB, micelles

Procedia PDF Downloads 640
1367 Unveiling Vegetation Composition and Dynamics Along Urbanization Gradient in Ranchi, Eastern India

Authors: Purabi Saikia

Abstract:

The present study was carried out across 84 vegetated grids (>10% vegetation cover) along an urbanization gradient, ranging from the urban core through peri-urban to natural vegetation in and around Ranchi, Eastern India, aiming to examine phytosociological attributes by the belt transect method (167 transects, each of 0.5 ha). Overall plant species richness was highest in natural vegetation (242 spp.), followed by peri-urban (198 spp.) and urban (182 spp.) vegetation. Similarly, H', CD, E, Dmg, Dmn, and ENS showed significant differences in the tree layer (H': 0.45-3.36; CD: 0.04-1.00; E: 0.25-0.96; Dmg: 0.18-7.15; Dmn: 0.03-4.24; ENS: 1-29) across the entire urbanization gradient. Various α-diversity indices of the adult trees (H': 3.98, Dmg: 14.32, Dmn: 2.38, ENS: 54) were comparatively better in urban vegetation than in peri-urban (H': 2.49, Dmg: 10.37, Dmn: 0.81, ENS: 12) and natural vegetation (H': 2.89, Dmg: 13.46, Dmn: 0.85, ENS: 18). Tree communities have shown better response and adaptability in urban vegetation than shrubs and herbs. The prevalence of rare (41%), very rare (29%), and exotic species (39%) in urban vegetation may be due to the intentional introduction of a number of fast-growing exotic tree species in different social forestry plantations, which has created a diverse and heterogeneous habitat. Despite their contagious distribution, the majority of trees (36.14%) showed no regeneration across the entire urbanization gradient. Additionally, a high percentage of IUCN red-listed plant species (51%, 178 spp.), including endangered (1 sp.), vulnerable (3 spp.), near threatened (4 spp.), least concern (163 spp.), and data deficient (7 spp.) taxa, warrants immediate policy interventions. Overall, the study documented marked transformations in floristic composition and structure from urban to natural vegetation in Eastern India. The outcomes are crucial for fostering resilient ecosystems, biodiversity conservation, and sustainable development in a region that supports diverse plant communities.
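The diversity indices reported above can be sketched from species abundance counts; the counts below are hypothetical, and ENS is assumed here to be the exponential of Shannon H' (one common definition).

```python
# Sketch of common alpha-diversity indices: Shannon H', Pielou evenness E,
# Margalef Dmg, Menhinick Dmn, and ENS (assumed to be exp(H')).
import math

def diversity(counts):
    """Compute indices from a list of per-species abundance counts."""
    n = sum(counts)                     # total individuals
    s = len(counts)                     # species richness
    h = -sum((c / n) * math.log(c / n) for c in counts if c > 0)  # Shannon H'
    e = h / math.log(s)                 # Pielou evenness E
    dmg = (s - 1) / math.log(n)         # Margalef richness Dmg
    dmn = s / math.sqrt(n)              # Menhinick richness Dmn
    ens = math.exp(h)                   # effective number of species
    return h, e, dmg, dmn, ens

# Hypothetical abundances for four species in one grid
h, e, dmg, dmn, ens = diversity([50, 30, 15, 5])
print(round(h, 3), round(ens, 2))
```

With perfectly even abundances, ENS equals the species count, which is why ENS values far below richness (as in the natural vegetation here) indicate strong dominance by a few species.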

Keywords: floristic communities, urbanization gradients, exotic species, regeneration

Procedia PDF Downloads 27
1366 Interval Bilevel Linear Fractional Programming

Authors: F. Hamidi, N. Amiri, H. Mishmast Nehi

Abstract:

The Bilevel Programming (BP) model has been presented for a decision-making process that involves two decision makers in a hierarchical structure. In fact, BP models a static two-person game (the leader in the upper level and the follower in the lower level) wherein each player tries to optimize his/her personal objective function under interdependent constraints; this game is sequential and non-cooperative. The decision variables are divided between the two players, and each one's choices affect the other's benefit and choices. In other words, BP consists of two nested optimization problems with two objective functions (upper and lower), where the constraint region of the upper level problem is implicitly determined by the lower level problem. In real cases, the coefficients of an optimization problem may not be precise, i.e., they may be intervals. In this paper, we develop an algorithm for solving interval bilevel linear fractional programming problems, that is, bilevel problems in which both objective functions are linear fractional, the coefficients are intervals, and the common constraint region is a polyhedron. From the original problem, the best and worst bilevel linear fractional problems are derived; then, using the extended Charnes and Cooper transformation, each fractional problem can be reduced to a linear problem. We can then find the best and worst optimal values of the leader's objective function by two algorithms.
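For reference, the classical Charnes and Cooper substitution (the paper uses an extended variant) turns a linear fractional objective into a linear one; a sketch, assuming a positive denominator on the feasible region:

```latex
% Linear fractional program:
%   \min_x \; (c^{T}x + \alpha)/(d^{T}x + \beta)
%   \text{s.t.} \; Ax \le b,\; x \ge 0,\; d^{T}x + \beta > 0.
% Substituting t = 1/(d^{T}x + \beta) and y = t\,x gives the linear program
\min_{y,\,t} \; c^{T}y + \alpha t
\quad \text{s.t.} \quad
A y - b t \le 0, \qquad
d^{T}y + \beta t = 1, \qquad
y \ge 0, \; t > 0 .
```

Applying this to the best and worst realizations of the interval coefficients is what reduces each bilevel fractional problem to a bilevel linear one.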

Keywords: best and worst optimal solutions, bilevel programming, fractional, interval coefficients

Procedia PDF Downloads 449
1365 Synthesis and Characterization of AFe₂O₄ (A=Ca, Co, Cu) Nano-Spinels: Application to Hydrogen Photochemical Production under Visible Light Irradiation

Authors: H. Medjadji, A. Boulahouache, N. Salhi, A. Boudjemaa, M. Trari

Abstract:

Hydrogen from renewable sources, such as solar, is referred to as green hydrogen. The water-splitting process using semiconductors as photocatalysts has attracted significant attention due to its potential for addressing the energy crisis and environmental pollution. Spinel ferrites of the MFe₂O₄ type have attracted broad interest in diverse energy conversion processes, including fuel cells and photoelectrocatalytic water splitting. This work focuses on preparing iron-based nano-spinels AFe₂O₄ (A = Ca, Co, and Cu) as photocatalysts using the nitrate method. These materials were characterized both physically and optically and subsequently tested for hydrogen generation under visible light irradiation. Various techniques were used to investigate the properties of the materials, including TGA-DTA, X-ray diffraction (XRD), Fourier Transform Infrared Spectroscopy (FTIR), UV-visible spectroscopy, Scanning Electron Microscopy with Energy Dispersive X-ray Spectroscopy (SEM-EDX), and X-ray Photoelectron Spectroscopy (XPS). XRD analysis confirmed the formation of pure phases at 850°C, with crystallite sizes of 31 nm for CaFe₂O₄, 27 nm for CoFe₂O₄, and 40 nm for CuFe₂O₄. The energy gaps, calculated from the recorded diffuse reflectance data, are 1.85 eV for CaFe₂O₄, 1.27 eV for CoFe₂O₄, and 1.64 eV for CuFe₂O₄. SEM micrographs showed homogeneous grains with uniform shapes and medium porosity in all samples. EDX elemental analysis confirmed the absence of contaminating elements, highlighting the high purity of the materials prepared via the nitrate route. XPS spectra revealed the presence of Fe³⁺ and O in all samples, as well as Ca²⁺, Co²⁺, and Cu²⁺ on the surfaces of the CaFe₂O₄, CoFe₂O₄, and CuFe₂O₄ spinels, respectively. The photocatalytic activity was evaluated by measuring H₂ evolution through the water-splitting process. The best performance was achieved with CaFe₂O₄ in a neutral medium (pH ~ 7), yielding 189 µmol at an optimal temperature of ~50°C. The highest hydrogen production rates for CoFe₂O₄ and CuFe₂O₄ were obtained at pH ~ 12, with release rates of 65 and 85 µmol, respectively, under visible light irradiation at the same optimal temperature. Various conditions were investigated, including the pH of the solution, the use of hole scavengers, and recyclability.
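Crystallite sizes of this kind are typically obtained from XRD peak broadening via the Scherrer equation; a sketch follows, with an illustrative peak rather than the paper's raw data.

```python
# Sketch: Scherrer estimate of crystallite size from XRD peak broadening,
# the kind of calculation behind reported sizes in the 27-40 nm range.
import math

def scherrer_size_nm(wavelength_nm, fwhm_deg, two_theta_deg, k=0.9):
    """D = K * lambda / (beta * cos(theta)), with beta (FWHM) in radians."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return k * wavelength_nm / (beta * math.cos(theta))

# Cu K-alpha radiation; hypothetical peak at 2θ = 35.5° with FWHM 0.30°
print(round(scherrer_size_nm(0.15406, 0.30, 35.5), 1))
```

In practice, instrumental broadening is subtracted from the measured FWHM before applying the formula; the sketch omits that correction.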

Keywords: hydrogen, MFe₂O₄, nitrate route, spinel ferrite

Procedia PDF Downloads 44
1364 Vehicular Speed Detection Camera System Using Video Stream

Authors: C. A. Anser Pasha

Abstract:

In this paper, a new Vehicular Speed Detection Camera System (SDCS) is presented that is applicable as an alternative to traditional radars, with the same accuracy or even better. The real-time measurement and analysis of various traffic parameters, such as speed and number of vehicles, are increasingly required in traffic control and management. Image processing techniques are now considered an attractive and flexible method for automatic analysis and data collection in traffic engineering, and various algorithms based on them have been applied to detect and track multiple vehicles. The SDCS process can be divided into three successive phases. The first is the object detection phase, which uses a hybrid algorithm combining an adaptive background subtraction technique with a three-frame differencing algorithm, rectifying the major drawback of using adaptive background subtraction alone. The second is the object tracking phase, which consists of three successive operations: object segmentation, object labeling, and object center extraction. Tracking takes into consideration the different possible scenarios for a moving object: simple tracking, the object leaving the scene, the object entering the scene, the object being crossed by another object, and one object leaving as another enters the scene. The third is the speed calculation phase, in which speed is calculated from the number of frames the object takes to pass through the scene.
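The speed calculation phase described above reduces to distance over elapsed frames; a minimal sketch, with an assumed calibrated scene length and frame rate:

```python
# Sketch of the speed-calculation phase: with a calibrated scene length
# and a known frame rate, speed follows from the number of frames an
# object needs to cross the scene. The values below are illustrative.

def speed_kmh(scene_length_m, frames_in_scene, fps):
    """Speed = distance / time, with time = frames / fps; result in km/h."""
    seconds = frames_in_scene / fps
    return scene_length_m / seconds * 3.6  # m/s -> km/h

# A 20 m scene crossed in 36 frames at 25 fps -> 1.44 s -> 50 km/h
print(speed_kmh(20.0, 36, 25))
```

Accuracy therefore depends on the scene-length calibration and on how precisely the detection phase pins down the entry and exit frames.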

Keywords: radar, image processing, detection, tracking, segmentation

Procedia PDF Downloads 471
1363 Continuous Measurement of Spatial Exposure Based on Visual Perception in Three-Dimensional Space

Authors: Nanjiang Chen

Abstract:

In the backdrop of expanding urban landscapes, accurately assessing spatial openness is critical. Traditional visibility analysis methods grapple with discretization errors and inefficiencies, creating a gap in truly capturing the human experience of space. Addressing these gaps, this paper introduces a distinct continuous visibility algorithm, a leap in measuring urban spaces from a human-centric perspective. This study presents a methodological breakthrough by applying this algorithm to urban visibility analysis. Unlike conventional approaches, this technique allows for a continuous range of visibility assessment, closely mirroring human visual perception. By eliminating the need for predefined subdivisions in ray casting, it offers a more accurate and efficient tool for urban planners and architects. The proposed algorithm not only reduces computational errors but also demonstrates faster processing capabilities, validated through a case study in Beijing's urban setting. Its key distinction lies in its potential to benefit a broad spectrum of stakeholders, ranging from urban developers to public policymakers, aiding in the creation of urban spaces that prioritize visual openness and quality of life. This advancement in urban analysis methods could lead to more inclusive, comfortable, and well-integrated urban environments, enhancing the spatial experience for communities worldwide.
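For contrast, the conventional discretized visibility measure that the continuous algorithm improves upon can be sketched as fixed-step ray sampling; the circular obstacles and sampling density below are hypothetical, and the discretization error it exhibits is exactly what the paper's method avoids.

```python
# Sketch of conventional discretized openness: sample rays at fixed
# angular steps and count how many escape the scene unobstructed.
import math

def openness(viewpoint, obstacles, max_dist=100.0, n_rays=360):
    """Fraction of sampled directions with an unobstructed ray (2D)."""
    px, py = viewpoint
    clear = 0
    for i in range(n_rays):
        a = 2 * math.pi * i / n_rays
        dx, dy = math.cos(a), math.sin(a)
        blocked = False
        for (cx, cy, r) in obstacles:   # ray-circle intersection test
            ox, oy = cx - px, cy - py
            t = ox * dx + oy * dy       # projection of center onto the ray
            if 0 <= t <= max_dist and (ox - t * dx) ** 2 + (oy - t * dy) ** 2 <= r * r:
                blocked = True
                break
        if not blocked:
            clear += 1
    return clear / n_rays

print(openness((0, 0), [(10, 0, 3)]))  # one circular obstacle east of the viewer
```

The result depends on `n_rays`: a thin obstacle can fall between two sampled directions and be missed entirely, which is the discretization error that a continuous angular-interval formulation eliminates.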

Keywords: visual openness, spatial continuity, ray-tracing algorithms, urban computation

Procedia PDF Downloads 52