Search results for: image-based techniques
6429 An Application of Modified M-out-of-N Bootstrap Method to Heavy-Tailed Distributions
Authors: Hannah F. Opayinka, Adedayo A. Adepoju
Abstract:
This study is an extension of a prior study on the modification of the existing m-out-of-n (moon) bootstrap method for heavy-tailed distributions, in which the modified m-out-of-n (mmoon) bootstrap was proposed as an alternative to the existing moon technique. In this study, both the moon and mmoon techniques were applied to two real income datasets, which followed Lognormal and Pareto distributions respectively, each with finite variance. The performances of the two techniques were compared using the Standard Error (SE) and the Root Mean Square Error (RMSE). The findings showed that mmoon outperformed the moon bootstrap, giving smaller SEs and RMSEs for all the sample sizes considered in the two datasets.
Keywords: bootstrap, income data, lognormal distribution, Pareto distribution
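The core of the m-out-of-n idea is to resample only m < n observations, so that extreme values dominate each resample less often. The sketch below (pure Python, not the authors' code) estimates the bootstrap standard error of the median for a heavy-tailed sample; the choice m = √n is only an illustrative convention.

```python
import random
import statistics

def moon_bootstrap_se(data, m, B=2000, stat=statistics.median, seed=0):
    """m-out-of-n bootstrap: draw B resamples of size m (with replacement)
    from the n observations and return the standard error of the statistic."""
    rng = random.Random(seed)
    estimates = [stat([rng.choice(data) for _ in range(m)]) for _ in range(B)]
    return statistics.stdev(estimates)

# Heavy-tailed sample (Pareto-like, shape 3.0, so the variance is finite).
rng = random.Random(1)
data = [rng.paretovariate(3.0) for _ in range(400)]

n = len(data)
m = int(n ** 0.5)  # illustrative choice of m << n
se = moon_bootstrap_se(data, m)
print(round(se, 4))
```

The same function with m = n reduces to the ordinary bootstrap, which makes the comparison between the two schemes straightforward.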
Procedia PDF Downloads 187
6428 Innovative Acoustic Emission Techniques for Concrete Health Monitoring
Authors: Rahmat Ali, Beenish Khan, Aftabullah, Abid A. Shah
Abstract:
This research investigates the wide range of events captured by acoustic emission (AE) sensors on concrete cubes subjected to different stress conditions during loading and unloading. A total of 27 specimens were prepared and tested, comprising 18 cubic (6”x6”x6”) and nine cylindrical (4”x8”) specimens molded from three batches of concrete with w/c ratios of 0.40, 0.50, and 0.60. The compressive strength of the concrete was determined from the cylindrical specimens. The deterioration of the concrete was evaluated using the occurrence of the Felicity and Kaiser effects at each stress condition. It was found that the number of acoustic emission hits usually increased as damage increased. Additionally, the correlation between the AE measurements and the applied load was determined by plotting the normalized values. The influence of the w/c ratio on the sensitivity of the AE technique in detecting concrete damage was also investigated.
Keywords: acoustic emission, concrete, felicity ratio, sensors
Procedia PDF Downloads 362
6427 Recommender Systems Using Ensemble Techniques
Authors: Yeonjeong Lee, Kyoung-jae Kim, Youngtae Kim
Abstract:
This study proposes a novel recommender system that uses data mining and multi-model ensemble techniques to enhance recommendation performance by precisely reflecting the user's preferences. The proposed model consists of two steps. In the first step, the study uses logistic regression, decision trees, and artificial neural networks to predict customers who have a high likelihood of purchasing products in each product group. The results of the individual predictors are then combined using multi-model ensemble techniques such as bagging and bumping. In the second step, the study uses market basket analysis to extract association rules for co-purchased products. Finally, through the two steps above, the system selects customers who have a high likelihood of purchasing products in each product group and recommends suitable products from the same or different product groups to them. We test the usability of the proposed system using a prototype and real-world transaction and profile data. In addition, we survey user satisfaction with the product list recommended by the proposed system against randomly selected product lists. The results show that the proposed system may be useful in a real-world online shopping store.
Keywords: product recommender system, ensemble technique, association rules, decision tree, artificial neural networks
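Bagging, one of the ensemble techniques mentioned, trains each base model on a bootstrap resample of the data and combines their predictions by majority vote. The following sketch illustrates the idea with a toy one-feature threshold classifier; the data and classifier are illustrative stand-ins, not the study's actual models.

```python
import random

def train_stump(sample):
    """Fit a one-feature threshold 'stump': predict 1 if x >= t.
    The threshold is the sample value that maximizes training accuracy."""
    best_t, best_acc = 0.0, -1.0
    for x, _ in sample:
        acc = sum((xi >= x) == yi for xi, yi in sample) / len(sample)
        if acc > best_acc:
            best_t, best_acc = x, acc
    return best_t

def bagging_predict(data, x, n_models=25, seed=0):
    """Bagging: train each stump on a bootstrap resample, then majority-vote."""
    rng = random.Random(seed)
    votes = 0
    for _ in range(n_models):
        sample = [rng.choice(data) for _ in data]   # bootstrap resample
        votes += x >= train_stump(sample)
    return int(votes * 2 > n_models)

# Toy purchase data: (customer score, purchased?)
data = [(0.1, 0), (0.2, 0), (0.35, 0), (0.6, 1), (0.7, 1), (0.9, 1)]
print(bagging_predict(data, 0.8), bagging_predict(data, 0.15))
```

Bumping differs only in the combination rule: instead of voting, it keeps the single resampled model that fits the original data best.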
Procedia PDF Downloads 295
6426 An Energy Transfer Fluorescent Probe System for Glucose Sensor at Biomimetic Membrane Surface
Authors: Hoa Thi Hoang, Stephan Sass, Michael U. Kumke
Abstract:
Concanavalin A (conA) is a protein that has been widely used in sensor systems because of its specific binding to α-D-glucose or α-D-mannose. For glucose sensors using conA, either intensity-based or lifetime-based fluorescence techniques are used. In this research, liposomes made from phospholipids were used as a biomimetic membrane system. In a first step, novel building blocks containing perylene-labeled glucose units were added to the system and used to decorate the surface of the liposomes. Upon binding of rhodamine-labeled conA to the glucose units at the biomimetic membrane surface, a Förster resonance energy transfer (FRET) system can be formed, which combines the unique fluorescence properties of perylene (e.g., high fluorescence quantum yield, no triplet formation) and its high hydrophobicity for efficient anchoring in membranes to form a novel probe for the investigation of sugar-driven binding reactions at biomimetic surfaces. Two glucose-labeled perylene derivatives were synthesized with different spacer lengths between the perylene and glucose units in order to probe the binding of conA. The binding interaction was fully characterized using high-end fluorescence techniques. Steady-state and time-resolved fluorescence techniques (e.g., fluorescence depolarization) in combination with single-molecule fluorescence spectroscopy techniques (fluorescence correlation spectroscopy, FCS) were used to monitor the interaction with conA. Based on the fluorescence depolarization, the rotational correlation times, and the alteration in the diffusion coefficient (determined by FCS), the binding of conA to the liposomes carrying the probe was studied. Moreover, single-pair FRET experiments using pulsed interleaved excitation are used to characterize the binding of conA to the liposome in detail on a single-molecule level, avoiding averaging-out effects.
Keywords: concanavalin A, FRET, sensor, biomimetic membrane
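The distance sensitivity that makes FRET useful as a binding probe follows from the Förster relation E = 1 / (1 + (r/R0)^6), where r is the donor-acceptor distance and R0 the Förster radius of the pair. A short illustrative calculation; the R0 value here is an assumption for the example, not a measured value from the study.

```python
def fret_efficiency(r_nm, r0_nm):
    """Förster resonance energy transfer efficiency:
    E = 1 / (1 + (r / R0)^6), with R0 the Förster radius of the dye pair."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

# Illustrative donor/acceptor pair with an assumed R0 of 5 nm.
r0 = 5.0
print(round(fret_efficiency(5.0, r0), 3))  # at r = R0, E = 0.5 by definition
print(round(fret_efficiency(2.5, r0), 3))  # close pair: near-complete transfer
```

The sixth-power dependence is why binding events that pull donor and acceptor within ~R0 of each other produce such a sharp change in signal.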
Procedia PDF Downloads 307
6425 Role of Natural Language Processing in Information Retrieval; Challenges and Opportunities
Authors: Khaled M. Alhawiti
Abstract:
This paper analyzes the role of natural language processing (NLP) in the context of automated data retrieval, automated question answering, and text structuring. NLP techniques are gaining wider acceptance in real-life applications and industrial settings, but there are various complexities involved in processing natural language text in a way that satisfies the needs of decision makers. The paper begins by describing the qualities of NLP practices, then focuses on the challenges in natural language processing and discusses the major NLP techniques. The last section describes opportunities and challenges for future research.
Keywords: data retrieval, information retrieval, natural language processing, text structuring
Procedia PDF Downloads 341
6424 Urban Analysis of the Old City of Oran and Its Building after an Earthquake
Authors: A. Zatir, A. Mokhtari, A. Foufa, S. Zatir
Abstract:
The city of Oran, like any other region of northern Algeria, is subject to frequent seismic activity. The study presented in this work is based on an analysis of the urban and architectural context of the city of Oran before the earthquake of 1790, and then attempts to deduce the differences between the old city before and after the earthquake. A specific objective of the analysis is to tap into the seismic history of the city of Oran in parallel with its urban history. The example of the citadel of Oran indicates that the constructions on the site of the old citadel may present elements of resistance against seismic effects. On-site observations of these structures showed the ingenuity of the techniques used by the ancient builders, including the good performance of domes and arches in resisting seismic forces.
Keywords: earthquake, citadel, performance, traditional techniques, constructions
Procedia PDF Downloads 305
6423 Nanomaterial Based Electrochemical Sensors for Endocrine Disrupting Compounds
Authors: Gaurav Bhanjana, Ganga Ram Chaudhary, Sandeep Kumar, Neeraj Dilbaghi
Abstract:
The main sources of endocrine disrupting compounds in the ecosystem are hormones, pesticides, phthalates, flame retardants, dioxins, personal-care products, coplanar polychlorinated biphenyls (PCBs), bisphenol A, and parabens. These endocrine disrupting compounds are responsible for learning disabilities, brain development problems, deformations of the body, cancer, reproductive abnormalities in females, and decreased sperm count in human males. Although the discharge of these chemical compounds into the environment cannot be stopped, their amount can be reduced through proper evaluation and detection techniques. The available techniques for the determination of these endocrine disrupting compounds mainly include high performance liquid chromatography (HPLC), mass spectrometry (MS), and gas chromatography-mass spectrometry (GC-MS). These techniques are accurate and reliable but have certain limitations, such as the need for skilled personnel, long analysis times, interference, and the requirement of pretreatment steps. Moreover, these techniques are laboratory-bound, and large sample amounts are required for analysis. In view of these facts, new methods for the detection of endocrine disrupting compounds should be devised that promise high specificity, ultra-sensitivity, cost effectiveness, efficiency, and ease of operation. Nowadays, electrochemical sensors/biosensors modified with nanomaterials are gaining high attention among researchers. The bioelement present in such a system makes the developed sensor selective towards the analyte of interest. Nanomaterials provide a large surface area, high electron communication, enhanced catalytic activity, and possibilities for chemical modification. In most cases, nanomaterials also serve as an electron mediator or electrocatalyst for some analytes.
Keywords: electrochemical, endocrine disruptors, microscopy, nanoparticles, sensors
Procedia PDF Downloads 274
6422 Enhanced Calibration Map for a Four-Hole Probe for Measuring High Flow Angles
Authors: Jafar Mortadha, Imran Qureshi
Abstract:
This research explains and compares the modern techniques used for measuring the flow angles of a flowing fluid with the traditional technique of using multi-hole pressure probes. In particular, the focus of the study is on four-hole probes, which offer great reliability and benefits in several applications where the use of modern measurement techniques is either inconvenient or impractical. Due to modern advancements in manufacturing, small multi-hole pressure probes can be made with high precision, which eliminates the need to calibrate every manufactured probe. This study aims to extend the range of calibration maps for a four-hole probe so that high flow angles can be measured accurately. The research methodology comprises a literature review of the calibration definitions that have been successfully implemented on five-hole probes. These definitions are then adapted and applied to a four-hole probe using a set of raw pressure data. A comparison of the different definitions is carried out in MATLAB, and the results are analyzed to determine the best calibration definition. Taking into account simplicity of implementation as well as the reliability of the flow angle estimates, a technique adapted from a research paper published in 2002 offered the most promising outcome. Consequently, the method is seen as a good enhancement for four-hole probes, and it can substitute for existing calibration definitions that offer less accuracy.
Keywords: calibration definitions, calibration maps, flow measurement techniques, four-hole probes, multi-hole pressure probes
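For context, multi-hole probe calibration maps are typically built from non-dimensional pressure coefficients formed from the hole pressures. The sketch below uses one common family of definitions as a generic illustration; the hole numbering and the normalization are assumptions here, not necessarily the definition adapted from the 2002 paper.

```python
def calibration_coefficients(p1, p2, p3, p4):
    """Non-dimensional calibration coefficients for a four-hole probe.
    p1: centre hole; p2, p3: yaw-plane holes; p4: pitch-plane hole.
    One common family of definitions normalizes the hole pressure
    differences by the difference between the centre hole pressure
    and the mean of the peripheral hole pressures."""
    p_mean = (p2 + p3 + p4) / 3.0
    denom = p1 - p_mean                       # dynamic-pressure surrogate
    c_yaw = (p2 - p3) / denom                 # sensitive to yaw angle
    c_pitch = (p4 - (p2 + p3) / 2.0) / denom  # sensitive to pitch angle
    return c_yaw, c_pitch

# Symmetric flow (zero yaw and pitch): the peripheral pressures are equal,
# so both coefficients vanish.
cy, cp = calibration_coefficients(100.0, 90.0, 90.0, 90.0)
print(cy, cp)
```

During calibration, these coefficients are tabulated against known yaw and pitch angles; in use, the map is inverted to recover the angles from measured pressures.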
Procedia PDF Downloads 297
6421 Optimal Classifying and Extracting Fuzzy Relationship from Query Using Text Mining Techniques
Authors: Faisal Alshuwaier, Ali Areshey
Abstract:
Text mining techniques are generally applied for classifying text and finding fuzzy relations and structures in data sets. This research covers a range of text mining capabilities. One common application is text classification and event extraction, which encompasses deducing specific knowledge concerning incidents referred to in texts. The main contribution of this paper is the clarification of a concept graph generation mechanism, which is based on text classification and optimal fuzzy relationship extraction. Furthermore, the work presented in this paper explains the application of fuzzy relationship extraction and the branch and bound method to simplify texts.
Keywords: extraction, max-prod, fuzzy relations, text mining, memberships, classification
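The max-prod operation named in the keywords composes two fuzzy relations by taking, for each output entry, the maximum over the products along the inner dimension. A minimal sketch with made-up membership values:

```python
def max_prod(R, S):
    """Max-product composition of fuzzy relations:
    (R o S)[i][k] = max over j of R[i][j] * S[j][k]."""
    rows, inner, cols = len(R), len(S), len(S[0])
    return [[max(R[i][j] * S[j][k] for j in range(inner))
             for k in range(cols)] for i in range(rows)]

# Toy fuzzy relations: 'query term -> concept' composed with 'concept -> class'.
R = [[0.8, 0.3],
     [0.4, 0.9]]
S = [[0.5, 0.7],
     [1.0, 0.2]]
print(max_prod(R, S))
```

Replacing the product with `min` gives the max-min composition, the other standard choice for chaining fuzzy relations.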
Procedia PDF Downloads 583
6420 Analysis of Supply Chain Risk Management Strategies: Case Study of Supply Chain Disruptions
Authors: Marcelo Dias Carvalho, Leticia Ishikawa
Abstract:
Supply Chain Risk Management refers to a set of strategies used by companies to avoid supply chain disruption caused by damage at production facilities, natural disasters, capacity issues, inventory problems, incorrect forecasts, and delays. Many companies use the techniques of the Toyota Production System, which in a way works against better management of supply chain risks. This paper studies key events at some multinationals to analyze the trade-off between the best supply chain risk management techniques and management policies designed to create lean enterprises. The result of a good balance of these actions is a reduction of losses, increased customer trust in the company, and better preparedness to face the general risks of a supply chain.
Keywords: just in time, lean manufacturing, supply chain disruptions, supply chain management
Procedia PDF Downloads 338
6419 Development of Evolutionary Algorithm by Combining Optimization and Imitation Approach for Machine Learning in Gaming
Authors: Rohit Mittal, Bright Keswani, Amit Mithal
Abstract:
This paper gives a sense of how computational intelligence techniques are applied to develop computer games, especially car racing games. To provide a deeper understanding of the artificial intelligence involved, the paper is divided into sections on optimization, imitation, innovation, and the combined approach of optimization and imitation. The paper is mainly concerned with the combined approach, which covers different aspects of using fitness measures and supervised learning techniques to imitate aspects of behavior. The main achievement of this paper is based on modelling player behaviour and evolving new game content, such as racing tracks, for single-car racing on a single track.
Keywords: evolution algorithm, genetic, optimization, imitation, racing, innovation, gaming
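As an illustration of the optimization side, a minimal genetic algorithm (tournament selection, one-point crossover, bit-flip mutation) can be sketched as follows. The "OneMax" fitness and all parameters are toy choices for the example, not those used for evolving racing content.

```python
import random

def evolve(fitness, genome_len=8, pop_size=20, generations=40, seed=0):
    """Minimal genetic algorithm over bitstrings: tournament selection,
    one-point crossover, and bit-flip mutation; maximizes `fitness`."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # tournament of two
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            p, q = pick(), pick()
            cut = rng.randrange(1, genome_len)
            child = p[:cut] + q[cut:]          # one-point crossover
            if rng.random() < 0.1:             # occasional bit-flip mutation
                i = rng.randrange(genome_len)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy fitness: count of 1-bits; the optimum is the all-ones genome.
best = evolve(sum)
print(best)
```

In a game setting, the genome would instead encode track or controller parameters and the fitness would score playability, with imitation entering through a supervised model trained on recorded player behaviour.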
Procedia PDF Downloads 646
6418 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia
Authors: Carol Anne Hargreaves
Abstract:
A key issue in stock investment is how to select representative features for stock selection. The first objective of this paper is to determine whether an automated stock investment system, using machine learning techniques, may be used to identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock’s price is likely to go up, and to identify the most important factors and their contribution to predicting the likelihood of the stock price going up. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks that was likely to go up in price – portfolio 1. Next, the principal component analysis technique was used to select stocks that were rated high on component one and component two – portfolio 2. Thirdly, a supervised machine learning technique, the logistic regression method, was used to select stocks with a high probability of their price going up – portfolio 3. The predictive models were validated with metrics such as sensitivity (recall), specificity, and overall accuracy for all models. All accuracy measures were above 70%. All portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three stock portfolios and traded in the market for one month. After one month, the return for each stock portfolio was computed and compared with the stock market index returns. The returns were 23.87% for the principal component analysis stock portfolio, 11.65% for the logistic regression portfolio, and 8.88% for the K-means cluster portfolio, while the stock market returned 0.38%.
This study confirms that an automated stock investment system using machine learning techniques can identify top performing stock portfolios that outperform the stock market.
Keywords: machine learning, stock market trading, logistic regression, cluster analysis, factor analysis, decision trees, neural networks, automated stock investment system
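The cluster-analysis step can be illustrated with a plain k-means implementation on toy stock features. The feature values and the naive deterministic initialization below are illustrative assumptions, not the study's actual inputs or settings.

```python
def kmeans(points, k=2, iters=20):
    """Plain k-means: assign each point to the nearest centroid,
    then move each centroid to the mean of its cluster."""
    # Naive deterministic init: spread initial centroids across the data.
    centroids = points[::len(points) // k][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        centroids = [tuple(sum(x) / len(c) for x in zip(*c)) if c
                     else centroids[i] for i, c in enumerate(clusters)]
    return centroids, clusters

# Toy stock features: (momentum, earnings growth) with two visible groups.
stocks = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15),   # weak stocks
          (0.8, 0.9), (0.9, 0.85), (0.85, 0.95)]  # growth stocks
centroids, clusters = kmeans(stocks, k=2)
sizes = sorted(len(c) for c in clusters)
print(sizes)  # → [3, 3]
```

In the paper's pipeline, the cluster whose members showed upward price behaviour would then be taken as the candidate portfolio.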
Procedia PDF Downloads 158
6417 Synthesis and Characterization of Hydroxyapatite from Biowaste for Potential Medical Application
Authors: M. D. H. Beg, John O. Akindoyo, Suriati Ghazali, Nitthiyah Jeyaratnam
Abstract:
Over time, several approaches have been undertaken to mitigate the challenges associated with bone regeneration. These include, but are not limited to, xenografts, allografts, and autografts, as well as artificial substitutes like bioceramics, synthetic cements, and metals. The former three techniques often come with peculiar limitations and problems, such as morbidity, limited availability, disease transmission, collateral site damage, or outright rejection by the body, as the case may be. Synthetic routes remain the only feasible alternative for the treatment of bone defects. Hydroxyapatite (HA) is very compatible and suitable for this application. However, most of the common methods for HA synthesis are either expensive, complicated, or environmentally unfriendly. Interestingly, extraction of HA from bio-wastes is perceived not only to be cost effective but also environmentally friendly. In this research, HA was synthesized from a bio-waste, namely bovine bones, through three different methods: hydrothermal chemical processing, ultrasound-assisted synthesis, and ordinary calcination. Structure and property analysis of the HA was carried out through different characterization techniques such as TGA, FTIR, and XRD. All the methods applied produced HA with compositional properties similar to the biomaterials found in human calcified tissues. The calcination process was, however, observed to be more efficient, as it eliminated all organic components from the produced HA. The HA synthesized is notable for its minimal cost and environmental friendliness, and it is perceived to be suitable for tissue and bone engineering applications.
Keywords: hydroxyapatite, bone, calcination, biowaste
Procedia PDF Downloads 250
6416 Rail-To-Rail Output Op-Amp Design with Negative Miller Capacitance Compensation
Authors: Muhaned Zaidi, Ian Grout, Abu Khari bin A’ain
Abstract:
In this paper, a two-stage op-amp design is considered using both Miller and negative Miller compensation techniques. The first op-amp design uses Miller compensation around the second amplification stage, whilst the second op-amp design uses negative Miller compensation around the first stage and Miller compensation around the second amplification stage. The aims of this work were to compare the gain and phase margins obtained using the different compensation techniques and to identify the ability to choose either compensation technique based on a particular set of design requirements. The two op-amp designs created are based on the same two-stage rail-to-rail output CMOS op-amp architecture, where the first stage of the op-amp consists of differential input and cascode circuits, and the second stage is a class AB amplifier. The op-amps have been designed using a 0.35 µm CMOS fabrication process.
Keywords: op-amp, rail-to-rail output, Miller compensation, negative Miller capacitance
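Both compensation techniques rest on Miller's theorem: a capacitance C bridging the input and output of a stage with voltage gain A appears at the input as C(1 − A). For an inverting stage (A negative) the capacitance is magnified; driving the far plate with a positive gain greater than one makes the effective input capacitance negative, which is the negative Miller idea. A short numeric illustration with assumed component values:

```python
def miller_input_capacitance(c_bridge, gain):
    """Miller's theorem: a capacitance C bridging input and output of a
    stage with voltage gain A is seen at the input as C * (1 - A)."""
    return c_bridge * (1.0 - gain)

c = 1e-12  # assumed 1 pF bridging capacitor
print(miller_input_capacitance(c, -100.0))  # inverting stage: 101 pF at input
print(miller_input_capacitance(c, 2.0))     # gain of +2: -1 pF (negative)
```

A negative effective capacitance can cancel parasitic input capacitance, which is why negative Miller compensation around the first stage can improve bandwidth rather than trade it away.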
Procedia PDF Downloads 339
6415 Image Processing Techniques for Surveillance in Outdoor Environment
Authors: Jayanth C., Anirudh Sai Yetikuri, Kavitha S. N.
Abstract:
This paper explores the development and application of computer vision and machine learning techniques for real-time pose detection, facial recognition, and number plate extraction. Utilizing MediaPipe for pose estimation, the research presents methods for detecting hand raises and ducking postures through real-time video analysis. Complementarily, facial recognition is employed to compare and verify individual identities using the face recognition library. Additionally, the paper demonstrates a robust approach for extracting and storing vehicle number plates from images, integrating Optical Character Recognition (OCR) with a database management system. The study highlights the effectiveness and versatility of these technologies in practical scenarios, including security and surveillance applications. The findings underscore the potential of combining computer vision techniques to address diverse challenges and enhance automated systems for both individual and vehicular identification. This research contributes to the fields of computer vision and machine learning by providing scalable solutions and demonstrating their applicability in real-world contexts.
Keywords: computer vision, pose detection, facial recognition, number plate extraction, machine learning, real-time analysis, OCR, database management
Procedia PDF Downloads 27
6414 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods
Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin
Abstract:
Nowadays, the data center industry faces strong challenges in increasing speed and data processing capacity while at the same time keeping devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of this kind of facility use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques or perfecting existing ones would be a great advance for this type of industry. The installation of a temperature sensor matrix distributed throughout the structure of each server would provide the data required to obtain an instantaneous temperature profile inside it. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high, and the approach is expensive. Therefore, other less intrusive techniques are employed, in which each point that characterizes the server temperature profile is obtained by solving differential equations through simulation, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers' equations by backward, forward, and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first and second order derivatives of the resulting Burgers' equation are the key to obtaining results of greater or lesser accuracy, regardless of the characteristic truncation error.
Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile
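As an illustration of the discretization step, a single explicit time step of the inviscid Burgers' equation u_t + u·u_x = 0 with a backward (upwind) spatial difference can be sketched as below. The grid spacing, time step, and initial profile are toy values chosen to satisfy the CFL stability condition, not taken from the paper's simulations.

```python
def burgers_step(u, dx, dt):
    """One explicit time step of the inviscid Burgers' equation
    u_t + u * u_x = 0 using a backward (upwind) spatial difference,
    valid for u >= 0; the first node is held fixed as a boundary value."""
    return [u[0]] + [u[i] - dt * u[i] * (u[i] - u[i - 1]) / dx
                     for i in range(1, len(u))]

# A decreasing step-like profile steepens, since the advection speed is u.
dx, dt = 0.1, 0.01            # max(u) * dt / dx = 0.1, well under CFL = 1
u = [1.0] * 5 + [0.5] * 5
for _ in range(20):
    u = burgers_step(u, dx, dt)
print([round(v, 3) for v in u])
```

A forward difference would instead be the upwind choice for u < 0, and a central difference raises the order of accuracy at the cost of stability constraints; that trade-off is exactly what the paper's comparison of backward, forward, and central schemes probes.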
Procedia PDF Downloads 171
6413 Determination of Complexity Level in Merged Irregular Transposition Cipher
Authors: Okike Benjamin, Garba Ejd
Abstract:
Today, it is observed that the security of information along the information superhighway is often compromised by those who are not authorized to have access to such information. In order to ensure the security of information along the superhighway, the information should be encrypted by some means to conceal its real meaning. There are many encryption techniques available. However, some of these encryption techniques are easily decrypted by adversaries. The researcher has decided to develop an encryption technique that may be more difficult to decrypt. This may be achieved by splitting the message to be encrypted into parts, encrypting each part separately, and swapping the positions of the parts before transmitting the message along the superhighway. The method is termed the Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
Keywords: transposition cipher, merged irregular cipher, encryption, complexity level
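One plausible reading of the described scheme (an illustrative sketch, not the author's implementation) is to transpose the characters within each part and then swap the parts themselves, with a keyed PRNG making both permutations reproducible for decryption:

```python
import random

def encrypt(msg, n_splits, key=42):
    """Split the message into parts, transpose the characters inside each
    part, then swap the positions of the parts themselves. The keyed PRNG
    makes the permutations reproducible for decryption."""
    rng = random.Random(key)
    size = -(-len(msg) // n_splits)                      # ceiling division
    parts = [msg[i:i + size] for i in range(0, len(msg), size)]
    perms = [rng.sample(range(len(p)), len(p)) for p in parts]
    scrambled = ["".join(p[j] for j in perm) for p, perm in zip(parts, perms)]
    order = rng.sample(range(len(parts)), len(parts))    # swap part positions
    return "|".join(scrambled[i] for i in order)

def decrypt(cipher, n_splits, key=42):
    """Invert both permutations by replaying the same PRNG sequence."""
    parts_in = cipher.split("|")
    total = sum(len(p) for p in parts_in)
    size = -(-total // n_splits)
    lengths = [min(size, total - i) for i in range(0, total, size)]
    rng = random.Random(key)
    perms = [rng.sample(range(n), n) for n in lengths]
    order = rng.sample(range(len(lengths)), len(lengths))
    parts = [None] * len(lengths)
    for pos, i in enumerate(order):                      # undo the part swap
        parts[i] = parts_in[pos]
    out = []
    for p, perm in zip(parts, perms):                    # undo the transposition
        chars = [None] * len(p)
        for j, idx in enumerate(perm):
            chars[idx] = p[j]
        out.append("".join(chars))
    return "".join(out)

msg = "SECURE THE SUPERHIGHWAY"
c = encrypt(msg, 3)
assert decrypt(c, 3) == msg
print(c != msg)  # True
```

The complexity question the abstract raises then amounts to counting how the number of possible part-internal and part-order permutations grows with the number of splits.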
Procedia PDF Downloads 345
6412 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards
Authors: Golnush Masghati-Amoli, Paul Chin
Abstract:
Over the past few years, with rapid increases in data availability and computing power, Machine Learning (ML) techniques have been called on in a range of different industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited due to a special challenge imposed by numerous regulations that require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, they are adopted less frequently in commercial banking than in other industries, especially for scoring purposes. This is because Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. In order to bridge this gap between the explainability and performance of Machine Learning techniques, a Hybrid Model was developed at Dun & Bradstreet that is focused on blending Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to input domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns that scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables, and feature interactions that demonstrate high information value in the prediction of customer risk are identified. Then, these features are employed to introduce observed non-linear relationships between the explanatory and dependent variables into traditional scorecards.
Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model tries to match the score distribution generated by a Machine Learning algorithm, which ends up providing an estimate of the WoE for each bin. This capability helps to build powerful scorecards with sparse cases, which cannot be achieved with traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The analysis shows that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while being as transparent as traditional scorecards. Therefore, it is concluded that, with the use of the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concern about the difficulty of explaining the models for regulatory purposes.
Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering
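For reference, the conventional per-bin Weight of Evidence that the Hybrid Model replaces is computed directly from good/bad counts; a minimal sketch with toy counts:

```python
import math

def woe_by_bin(goods, bads):
    """Conventional Weight of Evidence per score bin:
    WoE_i = ln( (goods_i / total_goods) / (bads_i / total_bads) ).
    In the hybrid approach described above, these values would instead be
    tuned so the scorecard reproduces the ML model's score distribution."""
    g_tot, b_tot = sum(goods), sum(bads)
    return [math.log((g / g_tot) / (b / b_tot)) for g, b in zip(goods, bads)]

# Toy counts of good/bad customers in four score bins.
goods = [100, 300, 400, 200]
bads = [80, 60, 40, 20]
woe = woe_by_bin(goods, bads)
print([round(w, 3) for w in woe])  # → [-1.386, 0.0, 0.693, 0.693]
```

The direct formula breaks down when a bin has zero (or nearly zero) bads, which is exactly the sparse-case situation where matching an ML score distribution gives a usable WoE estimate instead.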
Procedia PDF Downloads 136
6411 The Study of Strength and Weakness Points of Various Techniques for Calculating the Volume of Done Work in Civil Projects
Authors: Ali Fazeli Moslehabadi
Abstract:
One of the topics discussed in civil projects, where continuous change of work volumes is usually characteristic, is how to calculate the volume of done work during the execution of the project. A difference between the volumes announced by the execution unit and the volumes estimated by the technical office unit has a direct effect on the announced progress of the project. This issue can make the progress of the project appear greater or smaller than its actual value and, as a result, mislead stakeholders and project managers. This article introduces some practical methods for calculating the volume of done work in civil projects. It then reviews the strengths and weaknesses of each of them, in order to resolve these contradictions and conflicts.
Keywords: technical skills, systemic skills, communication skills, done work volume calculation techniques
Procedia PDF Downloads 157
6410 The Influence of Temperature on the Corrosion and Corrosion Inhibition of Steel in Hydrochloric Acid Solution: Thermodynamic Study
Authors: Fatimah Al-Hayazi, Ehteram. A. Noor, Aisha H. Moubaraki
Abstract:
The inhibitive effect of Securigera securidaca seed extract (SSE) on mild steel corrosion in 1 M HCl solution has been studied by weight loss and electrochemical techniques at four different temperatures. All techniques showed that the studied extract performs well at all temperatures and that its inhibitory action increases with increasing concentration. SEM images indicate thin-film formation on mild steel corroded in solutions containing 1 g L-1 of inhibitor at both low and high temperatures. The polarization studies showed that SSE acts as an anodic inhibitor. Both polarization and impedance techniques show accelerating behaviour for SSE at concentrations ≤ 0.1 g L-1 at all temperatures. At concentrations ≥ 0.1 g L-1, the efficiency of SSE increases dramatically with increasing concentration, and its value does not change appreciably with increasing temperature. It was found that all adsorption data obeyed the Temkin adsorption isotherm. Kinetic activation and thermodynamic adsorption parameters are evaluated and discussed. The results revealed an endothermic corrosion process with an associative activation mechanism, while a comprehensive adsorption mechanism for SSE on mild steel surfaces is suggested, in which both physical and chemical adsorption are involved. A good correlation between the inhibitor constituents and their inhibitory action was obtained.
Keywords: corrosion, inhibition of steel, hydrochloric acid, thermodynamic study
Procedia PDF Downloads 100
6409 Aristotelian Techniques of Communication Used by Current Affairs Talk Shows in Pakistan for Creating Dramatic Effect to Trigger Emotional Relevance
Authors: Shazia Anwer
Abstract:
The current TV talk shows, especially those on domestic politics in Pakistan, follow Aristotelian techniques, including deductive reasoning, the three modes of persuasion, and guidelines for communication. The application of "approximate truth" is also seen when talk show presenters create doubts about political personalities or national issues. The mainstream media of Pakistan, a key carrier of narrative construction for the sake of national consensus on regional issues and extended public diplomacy, is failing this purpose. This paper highlights the Aristotelian communication methodology, its purposes, its limitations for serious discussion, and its connection to the mistrust among the Pakistani population regarding fake or embedded, funded information. Data has been collected from three Pakistani TV talk shows and analyzed by applying the Aristotelian communication method to highlight the core issues. The paper also argues that current media education is impaired in providing transparent techniques to train future journalists for meaningful, thought-provoking discussion. For this reason, the paper gives an overview of the HEC's (Higher Education Commission) graduate-level mass communication syllabus for Pakistani universities. Ethos, logos, and pathos are the main components of TV talk shows, and as a result the educated audience is losing trust in the mainstream media, which eventually generates feelings of distrust and betrayal in society, because productions look like the drama genre instead of facts and analysis; thus the line between current affairs shows and infotainment has become blurred.
In the last section, practical implications for improving the meaningfulness and transparency of TV talk shows are suggested by replacing the Aristotelian communication method with a cognitive-semiotic communication approach.
Keywords: Aristotelian techniques of communication, current affairs talk shows, drama, Pakistan
Procedia PDF Downloads 206
6408 CAD Tool for Parametric Design Modification of Yacht Hull Surface Models
Authors: Shahroz Khan, Erkan Gunpinar, Kemal Mart
Abstract:
Parametric design techniques have recently become a vital concept in Computer-Aided Design (CAD), providing the designer with a sophisticated platform for automating the design process efficiently. In these techniques, the design process starts by parameterizing the important features of the design model (typically the key dimensions) and implementing design constraints. The design constraints help retain the overall shape of the model while its parameters are modified. However, choosing an appropriate number of design parameters and constraints is the crucial part of parametric design, especially for complex surface models such as yacht hulls. This paper introduces a method to create complex surface models suited to parametric design, a method to define the right number of parameters and the respective design constraints, and a system to manipulate the design parameters subject to the design constraint schema. In the proposed approach, the design process starts by dividing the yacht hull into three sections. Each section consists of different shape lines, which together form the overall shape of the hull. The shape lines are created using cubic Bezier curves, which allow greater design flexibility. Design parameters and constraints are defined on the shape lines in 3D design space to facilitate better and individual handling of the parameters by the designer. Afterwards, shape modifiers are developed, which allow the modification of each parameter while satisfying the respective criteria and design constraints: geometric continuities must be maintained between the shape lines of the three sections, fairness of the hull surfaces must be preserved after modification, and the effect of a single parameter on the other parameters must be negligible.
The constraints are defined individually on the shape lines of each section and mutually between the shape lines of two connecting sections. In order to validate and visualize the design results of the shape modifiers, a real-time graphical interface was created.
Keywords: design parameters, design constraints, shape modifiers, yacht hull
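The cubic Bezier shape lines described above can be sketched briefly. The control points below are hypothetical, not taken from the paper; the example only illustrates why Bezier control points make convenient design parameters (endpoints and end tangents are controlled directly, which makes continuity between adjacent shape lines easy to enforce):

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameters t in [0, 1]."""
    t = np.asarray(t)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Hypothetical 2D shape line of a hull section (control points are made up).
p0, p1, p2, p3 = map(np.array, ([0.0, 0.0], [1.0, 2.0], [3.0, 2.5], [4.0, 0.0]))
t = np.linspace(0.0, 1.0, 50)
curve = cubic_bezier(p0, p1, p2, p3, t)

# The curve interpolates p0 and p3, and its end tangents follow p1 - p0 and
# p3 - p2: aligning those tangent control points across two connecting shape
# lines is one way to maintain geometric (G1) continuity under modification.
print(curve[0], curve[-1])
```

A shape modifier in this spirit would move interior control points (the design parameters) while keeping the endpoint and tangent relations fixed.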
Procedia PDF Downloads 301
6407 Acoustic Echo Cancellation Using Different Adaptive Algorithms
Authors: Hamid Sharif, Nazish Saleem Abbas, Muhammad Haris Jamil
Abstract:
An adaptive filter is a filter that self-adjusts its transfer function according to an optimization algorithm driven by an error signal. Because of the complexity of the optimization algorithms, most adaptive filters are digital filters. Adaptive filtering constitutes one of the core technologies in digital signal processing and finds numerous applications in science as well as in industry, including adaptive noise cancellation and echo cancellation. Acoustic echo is a common occurrence in today’s telecommunication systems; the signal interference it causes is distracting to both users and reduces the quality of the communication. In this paper, we review different adaptive filtering techniques for reducing this unwanted echo and examine the behavior of the Least Mean Square (LMS), Normalized Least Mean Square (NLMS), Variable Step-Size Least Mean Square (VSLMS), Variable Step-Size Normalized Least Mean Square (VSNLMS), New Varying Step Size LMS (NVSSLMS), and Recursive Least Square (RLS) algorithms in order to increase communication quality.
Keywords: adaptive acoustic echo cancellation, LMS algorithm, adaptive filter, normalized least mean square (NLMS), variable step-size least mean square (VSLMS)
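The NLMS variant reviewed above can be sketched in a few lines. This is a generic textbook implementation, not the paper's code; the echo path and signals below are simulated stand-ins:

```python
import numpy as np

def nlms(x, d, num_taps=64, mu=0.5, eps=1e-8):
    """Normalized LMS adaptive filter.

    x: far-end (reference) signal; d: microphone signal containing the echo.
    Returns the error signal e = d - y, i.e. the echo-cancelled output.
    """
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # most recent samples first
        y = w @ u                             # echo estimate
        e[n] = d[n] - y                       # residual after cancellation
        w += (mu / (u @ u + eps)) * e[n] * u  # normalized gradient step
    return e

# Toy demo (hypothetical echo path, not a measured room response):
rng = np.random.default_rng(0)
x = rng.standard_normal(20000)                # far-end speech stand-in
h = rng.standard_normal(32) * 0.1             # unknown echo path
d = np.convolve(x, h)[:len(x)]                # echoed signal at the mic
e = nlms(x, d)
print("residual echo power:", np.mean(e[-2000:] ** 2))
```

Normalizing the step by the input power `u @ u` is what distinguishes NLMS from plain LMS and keeps the update stable when the far-end signal level varies.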
Procedia PDF Downloads 80
6406 Flow Visualization in Biological Complex Geometries for Personalized Medicine
Authors: Carlos Escobar-del Pozo, César Ahumada-Monroy, Azael García-Rebolledo, Alberto Brambila-Solórzano, Gregorio Martínez-Sánchez, Luis Ortiz-Rincón
Abstract:
Numerical simulations of flow in complex biological structures have gained considerable attention in recent years. However, the major issue is the validation of the results. The present work demonstrates a Particle Image Velocimetry (PIV) flow visualization technique in complex biological structures, particularly in intracranial aneurysms. A methodology to reconstruct and fabricate a transparent model has been developed, along with visualization and particle-tracking techniques. The resulting transparent models allow the flow patterns to be visualized with a regular camera. The final goal is to use visualization as a tool that provides more information for treatment and surgery decisions in aneurysms.
Keywords: aneurysms, PIV, flow visualization, particle tracking
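At its core, PIV estimates displacement between two image frames from the peak of their cross-correlation. A minimal FFT-based sketch on a single interrogation window, using synthetic images rather than the paper's data:

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Estimate the mean particle displacement between two interrogation
    windows by locating the peak of their circular cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.real(np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices so displacements can be negative.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Synthetic check: shift a random "particle image" by a known displacement.
rng = np.random.default_rng(1)
frame1 = rng.random((64, 64))
frame2 = np.roll(frame1, shift=(3, -2), axis=(0, 1))  # particles moved (3, -2)
print(piv_displacement(frame1, frame2))
```

A full PIV analysis repeats this over a grid of interrogation windows to build the velocity field; sub-pixel peak fitting (not shown) refines each estimate.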
Procedia PDF Downloads 92
6405 New Chinese Landscapes in the Works of the Chinese Photographer Yao Lu
Authors: Xiaoling Dai
Abstract:
Many Chinese artists have used digital photography to create works with features of Chinese landscape paintings since the 20th century. The ‘New Mountains and Water’ works created with digital techniques reflect the fusion of photographic techniques and traditional Chinese aesthetic thought. Borrowing from the Chinese landscape paintings of the Song Dynasty, the Chinese photographer Yao Lu uses digital photography to reflect contemporary environmental construction in his series New Landscapes. By portraying the variety of natural environments produced by urbanization in the contemporary period, Lu deconstructs traditional Chinese paintings and reconstructs contemporary photographic practices. The primary objective of this study is to investigate how Yao Lu redefines and re-interprets the relationship between tradition and contemporaneity. In this study, Yao Lu’s series New Landscapes is used for photo elicitation, which seeks to broaden understanding of the development of Chinese landscape photography. Furthermore, discourse analysis is used to evaluate how Chinese social developments influence the creation of photographic practices. Through visual and discourse analysis, this study aims to excavate the relationship between tradition and contemporaneity in Lu’s works. Based on New Landscapes, the study argues that in Lu’s interpretations of landscapes, tradition and contemporaneity establish a new relationship: traditional approaches to creation do not become obsolete over time; on the contrary, traditional notions and styles of creation can shed new light on contemporary issues and techniques.
Keywords: Chinese aesthetics, Yao Lu, new landscapes, tradition, contemporaneity
Procedia PDF Downloads 79
6404 Electrochemical Study of Copper–Tin Alloy Nucleation Mechanisms onto Different Substrates
Authors: Meriem Hamla, Mohamed Benaicha, Sabrine Derbal
Abstract:
In the present work, several materials of the form M/glass (M = Pt, Mo) were investigated to test their suitability for studying the early nucleation stages and growth of copper-tin clusters. It was found that most of these materials are good substrates for studying the nucleation and growth of electrodeposited Cu-Sn alloys from an aqueous solution containing CuCl2 and SnCl2 as electroactive species and Na3C6H5O7 as complexing agent. Among these substrates, Pt shows instantaneous nucleation followed by 3D diffusion-limited growth, whereas copper-tin thin films electrodeposited onto Mo substrates follow progressive nucleation. The deposition mechanism of the Cu-Sn films has been studied using stationary electrochemical techniques (cyclic voltammetry (CV) and chronoamperometry (CA)). The structural, morphological, and compositional characterization was carried out using X-ray diffraction (XRD), scanning electron microscopy (SEM), and EDAX techniques, respectively.
Keywords: electrodeposition, CuSn, nucleation, mechanism
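The instantaneous-versus-progressive classification made above is conventionally done by comparing chronoamperometric data against the dimensionless Scharifker-Hill transients. A sketch using the standard literature constants (this is the generic diagnostic, not the authors' fitting code):

```python
import numpy as np

# Dimensionless (i/i_m)^2 vs t/t_m transients, where (t_m, i_m) are the
# coordinates of the current maximum in the chronoamperogram. The measured
# curve is compared against these two theoretical limits.

def instantaneous(x):
    """(i/i_m)^2 for instantaneous nucleation, x = t/t_m."""
    return (1.9542 / x) * (1.0 - np.exp(-1.2564 * x)) ** 2

def progressive(x):
    """(i/i_m)^2 for progressive nucleation, x = t/t_m."""
    return (1.2254 / x) * (1.0 - np.exp(-2.3367 * x ** 2)) ** 2

x = np.linspace(0.2, 3.0, 15)
for xi, inst, prog in zip(x, instantaneous(x), progressive(x)):
    print(f"t/tm={xi:4.2f}  inst={inst:5.3f}  prog={prog:5.3f}")
```

Both curves equal 1 at t/t_m = 1 by construction; data hugging the first curve indicate instantaneous nucleation (as on Pt here), data hugging the second indicate progressive nucleation (as on Mo).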
Procedia PDF Downloads 398
6403 Empirical Acceleration Functions and Fuzzy Information
Authors: Muhammad Shafiq
Abstract:
In accelerated life testing, lifetime data are obtained under conditions considered more severe than the usual operating condition. Classical techniques are based on precise measurements and are used to model variation among the observations. In fact, there are two types of uncertainty in data: variation among the observations and fuzziness. Analysis techniques that do not consider fuzziness and are based only on precise lifetime observations lead to pseudo results. This study examines the behavior of empirical acceleration functions using fuzzy lifetime data. The results showed increased fuzziness in the transformed lifetimes compared to the input data.
Keywords: acceleration function, accelerated life testing, fuzzy number, non-precise data
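One common way to encode a non-precise lifetime is as a triangular fuzzy number and to propagate it through transformations via its alpha-cuts. A minimal sketch (the numbers are hypothetical, and the triangular form is one standard choice, not necessarily the one used in the study):

```python
import numpy as np

# A triangular fuzzy number (l, m, u): support [l, u], core m. For a
# monotone increasing transform g, each alpha-cut maps directly to
# [g(lo), g(hi)], which is how fuzziness propagates through, e.g., the
# log transform used in lifetime models.

def alpha_cut(l, m, u, alpha):
    """Alpha-cut [lo, hi] of the triangular fuzzy number (l, m, u)."""
    return (l + alpha * (m - l), u - alpha * (u - m))

l, m, u = 90.0, 100.0, 120.0   # hypothetical fuzzy lifetime (hours)
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(l, m, u, alpha)
    print(f"alpha={alpha}: lifetime [{lo:.1f}, {hi:.1f}], "
          f"log-lifetime [{np.log(lo):.3f}, {np.log(hi):.3f}]")
```

The width of the transformed alpha-cuts gives a direct, quantitative handle on the "increased fuzziness after transformation" that the study reports.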
Procedia PDF Downloads 301
6402 Peat Soil Stabilization Methods: A Review
Authors: Mohammad Saberian, Mohammad Ali Rahgozar, Reza Porhoseini
Abstract:
Peat soil is formed naturally through the accumulation of organic matter under water and consists of more than 75% organic substances. Peat falls into the category of problematic soils that are not suitable for construction, owing to its high compressibility, high moisture content, low shear strength, and low bearing capacity. Since this kind of soil is found in many countries and regions, finding suitable techniques for stabilizing peat is essential. The purpose of this paper is to review the various techniques applied for stabilizing peat soil and to discuss the resulting improvements in its mechanical parameters and strength properties. Characterizing stabilized peat is one of the most significant factors for engineered structures; consequently, various strategies for stabilizing this susceptible soil are examined with respect to the depth of the peat deposit.
Keywords: peat soil, stabilization, depth, strength, unconfined compressive strength (UCS)
Procedia PDF Downloads 575
6401 Using Machine Learning as an Alternative for Predicting Exchange Rates
Authors: Pedro Paulo Galindo Francisco, Eli Dhadad Junior
Abstract:
This study addresses the Meese-Rogoff puzzle by introducing recent machine learning techniques as alternatives for predicting exchange rates. Using RMSE as a comparison metric, Meese and Rogoff discovered that economic models are unable to outperform the random walk model as short-term exchange rate predictors. Decades after that study, no statistical prediction technique has proven effective in overcoming this obstacle; where there were positive results, they did not apply to all currencies and sample periods. Recent advancements in artificial intelligence have paved the way for a new approach to exchange rate prediction. Leveraging this technology, we applied five machine learning techniques in an attempt to overcome the Meese-Rogoff puzzle. We considered daily data for the real, yen, British pound, euro, and Chinese yuan against the US dollar over the period 2010 to 2023. Our results showed that none of the presented techniques produced an RMSE lower than that of the random walk model. However, some models, particularly LSTM and N-BEATS, outperformed the ARIMA model. The results also suggest that machine learning models have untapped potential and could represent an effective long-term possibility for overcoming the Meese-Rogoff puzzle.
Keywords: exchange rate, prediction, machine learning, deep learning
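The benchmark at the heart of the Meese-Rogoff comparison is simple to state: the random-walk forecast predicts tomorrow's rate to be today's rate, and any candidate model must beat its RMSE. A sketch on simulated data (not real exchange rates):

```python
import numpy as np

def rmse(pred, actual):
    """Root mean square error between forecasts and realizations."""
    return float(np.sqrt(np.mean((pred - actual) ** 2)))

# Simulate an exchange-rate-like series as a driftless random walk.
rng = np.random.default_rng(42)
rate = np.cumsum(rng.normal(0.0, 0.005, 1000)) + 5.0

actual = rate[1:]
rw_pred = rate[:-1]                            # random walk: y_hat[t+1] = y[t]
mean_pred = np.full_like(actual, rate.mean())  # naive unconditional benchmark

print("random walk RMSE:", rmse(rw_pred, actual))
print("naive mean  RMSE:", rmse(mean_pred, actual))
```

In the study's setting, the machine learning forecasts would stand in for `mean_pred`, and the puzzle is that on real rates they fail to push RMSE below the random-walk line.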
Procedia PDF Downloads 33
6400 A Hybrid Method for Determination of Effective Poles Using Clustering Dominant Pole Algorithm
Authors: Anuj Abraham, N. Pappa, Daniel Honc, Rahul Sharma
Abstract:
In this paper, an analysis of some model order reduction techniques is presented. A new hybrid algorithm for model order reduction of linear time-invariant systems is compared with the conventional techniques, namely Balanced Truncation, Hankel norm reduction, and the Dominant Pole Algorithm (DPA). The proposed hybrid algorithm, the Clustering Dominant Pole Algorithm (CDPA), is able to compute the full set of dominant poles and their cluster centers efficiently. The dominant poles of a transfer function are specific eigenvalues of the state space matrix of the corresponding dynamical system. The effectiveness of this novel technique is shown through simulation results.
Keywords: balanced truncation, clustering, dominant pole, Hankel norm, model reduction
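The idea of keeping only dominant poles can be illustrated with a plain modal truncation, which is a simplified stand-in for DPA/CDPA (the clustering step is not reproduced here): diagonalize the state matrix, rank poles by a dominance measure |R_i|/|Re(lambda_i)|, and keep the top k modes. The system below is hypothetical:

```python
import numpy as np

def modal_truncation(A, b, c, k):
    """Keep the k most dominant poles of (A, b, c), ranked by |R_i|/|Re(l_i)|."""
    lam, V = np.linalg.eig(A)
    W = np.linalg.inv(V)
    residues = (c @ V) * (W @ b)   # residues of H(s) = c (sI - A)^{-1} b
    keep = np.argsort(np.abs(residues) / np.abs(lam.real))[::-1][:k]
    return lam[keep], residues[keep], lam, residues

def dc_gain(lam, res):
    """H(0) = sum of residues / (0 - poles)."""
    return float(np.real(np.sum(res / -lam)))

# Hypothetical stable 3rd-order system with one clearly dominant slow pole.
A = np.diag([-1.0, -20.0, -50.0])
b = np.array([1.0, 1.0, 1.0])
c = np.array([1.0, 0.1, 0.1])

lam_r, res_r, lam, res = modal_truncation(A, b, c, k=1)
print("full DC gain   :", dc_gain(lam, res))
print("reduced DC gain:", dc_gain(lam_r, res_r))
```

The reduced first-order model, built from the single dominant pole at -1, reproduces the full DC gain of 1.007 to within 0.007, which is the sense in which dominant-pole methods preserve the input-output behaviour with far fewer states.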
Procedia PDF Downloads 599