Search results for: unsupervised techniques;
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6884

6434 Learning Compression Techniques on Smart Phone

Authors: Farouk Lawan Gambo, Hamada Mohammad

Abstract:

Data compression shrinks files into fewer bits than their original representation. Smaller files transfer faster over the internet, which is the main advantage of compression, but many of the concepts in data compression are abstract in nature and therefore difficult for some students (engineering students in particular) to digest. This paper studies the learning preferences of engineering students, who tend to have strong active, sensing, visual, and sequential learning preferences. It also reviews the three shifts that technology-aided learning has experienced, with mobile learning widely considered the form of learning that will integrate the other parts of the education process. Lastly, we propose the design and implementation of a mobile learning application, developed using a software engineering methodology, to enhance the traditional teaching and learning of data compression techniques.

Keywords: data compression, learning preference, mobile learning, multimedia

Procedia PDF Downloads 448
6433 Suggestion for Malware Detection Agent Considering Network Environment

Authors: Ji-Hoon Hong, Dong-Hee Kim, Nam-Uk Kim, Tai-Myoung Chung

Abstract:

Smartphone use is growing rapidly, and many companies run BYOD (Bring Your Own Device) policies, which allow employees to bring private smartphones to work in order to increase efficiency. However, smartphones are constantly under threat from malware, so a company network to which smartphones connect is exposed to serious risks. Most smartphone malware detection techniques perform independent detection, i.e., they inspect a single target application. In this paper, we analyze a variety of intrusion detection techniques and, based on the results of that analysis, propose a detection agent that uses a network IDS.

Keywords: android malware detection, software-defined network, interaction environment

Procedia PDF Downloads 434
6432 Digital Art Fabric Prints: Procedure, Process and Progress

Authors: Tripti Singh

Abstract:

Digital tools are merging the boundaries between different mediums as artists endeavour to explore new areas. Digital fabric printing has motivated artists to create prints that combine acquired imagery, such as photographs, scanned images, computer graphics, and microscopic imagery, with traditional media such as hand drawing, weaving, hand-printed patterns, and printmaking techniques. It has opened a whole new world of possibilities for artists to search, research, and combine old and contemporary mediums in unique art prints. As an artistic medium, digital art fabrics have aesthetic value that influences not only a personality but also the interior of a living or work space: they can be worn as a fashion statement and also used as interior decoration. Digital art fabric printing makes it possible to print almost any image on any fabric with long-lasting print quality. Single editions and limited editions are possible, preserving the scarcity and uniqueness of an art form. These prints meet today's needs because they are eco-friendly and produce less waste than traditional fabric printing techniques. They can be used to make unique, customized curtains, quilts, clothes, bags, furniture, dolls, pillows, framed artwork, costumes, banners, and much more. This paper explores the procedure, process, and progress of digital art fabric printing in depth, with suitable pictorial examples.

Keywords: digital art, fabric prints, digital fabric prints, new media

Procedia PDF Downloads 515
6431 KAP Study on Breast Cancer Among Women in Nirmala Educational Institutions: A Prospective Observational Study

Authors: Shaik Asha Begum, S. Joshna Rani, Shaik Abdul Rahaman

Abstract:

INTRODUCTION: Breast cancer is a disease that develops in breast cells. A "KAP" study estimates the knowledge, attitude, and practices of a community. More than 1.5 million women (25% of all women with cancer) are diagnosed with breast cancer every year throughout the world. Understanding levels of knowledge, attitude, and practice enables a more effective awareness-creation process, because a program can then be tailored more appropriately to the needs of the community. OBJECTIVES: The objectives of this study are to assess knowledge of the signs, symptoms, and risk factors of breast cancer; to raise awareness of the practice of early detection techniques; and to provide knowledge of breast cancer overall, including preventive techniques. METHODOLOGY: This is a descriptive cross-sectional KAP study conducted in the Nirmala Educational Institutions from January to April 2021. A total of 300 women participated, drawn from pharmacy students and lecturers as well as from non-pharmacy graduates. The questionnaire items were taken from the Breast Cancer Awareness Measure (BCAM) toolkit (Version 2). RESULTS: According to the findings of the study, the majority of participants were not well informed about breast cancer. A lump in the breast was the most commonly mentioned sign of breast cancer, followed by pain in the breast or nipple. Knowledge of breast cancer risk factors was also very low: correct answers were given for radiation exposure (58.2 percent), a positive family history (47.6 percent), obesity (46.9 percent), lack of physical activity (43.6 percent), and smoking (43.2 percent). Breast cancer screening, on the other hand, was uncommon (only 30 percent and 11.3 percent practiced clinical breast examination and mammography, respectively).
CONCLUSION: In this study, pharmacy graduates had more knowledge of the signs, symptoms, and risk factors of breast cancer than non-pharmacy graduates, but both groups had poor knowledge of preventive techniques and early detection tools. After the awareness program, both pharmacy and non-pharmacy graduates gained supportive knowledge of preventive techniques and practiced the early detection techniques for breast cancer.

Keywords: breast cancer, mammography, KAP study, early detection

Procedia PDF Downloads 138
6430 A Posteriori Trading-Inspired Model-Free Time Series Segmentation

Authors: Plessen Mogens Graf

Abstract:

Within the context of multivariate time series segmentation, this paper proposes a method inspired by a posteriori optimal trading. After a normalization step, time series are treated channelwise as surrogate stock prices that can be traded optimally a posteriori in a virtual portfolio holding either stock or cash. Linear transaction costs are interpreted as hyperparameters for noise filtering. Trading signals, as well as trading signals obtained on the reversed time series, are used for unsupervised channelwise labeling before a consensus over all channels is reached that determines the final segmentation time instants. The method is model-free in that no model prescriptions for segments are made. Benefits of the proposed approach include simplicity, computational efficiency, and adaptability to a wide range of different shapes of time series. Performance is demonstrated on synthetic and real-world data, including a large-scale dataset comprising a multivariate time series of dimension 1000 and length 2709. The proposed method is compared to a popular model-based bottom-up approach fitting piecewise affine models and to a recent model-based top-down approach fitting Gaussian models, and it is found to be consistently faster while producing more intuitive results in the sense of segmenting time series at peaks and valleys.
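
The a posteriori optimal trading core can be sketched as a small dynamic program: for each time step, keep the best achievable value when holding cash and when holding stock, with a linear transaction cost filtering out small moves. The sketch below is an illustrative reconstruction from the abstract, not the authors' code; the labeling and cross-channel consensus steps are omitted.

```python
import numpy as np

def optimal_positions(prices, cost=0.01):
    """A posteriori optimal trading on one channel: decide via dynamic
    programming whether the virtual portfolio holds cash (0) or stock (1)
    at each step. The linear transaction cost acts as a noise filter:
    wiggles smaller than the cost are not worth trading on."""
    n = len(prices)
    cash = np.zeros(n)   # best cash value when flat at time t
    coin = np.zeros(n)   # best number of shares when invested at time t
    cash[0] = 1.0
    coin[0] = (1.0 - cost) / prices[0]
    switch_to_cash = np.zeros(n, dtype=bool)  # best flat state came from a sell
    switch_to_coin = np.zeros(n, dtype=bool)  # best invested state came from a buy
    for t in range(1, n):
        sell = coin[t - 1] * prices[t] * (1.0 - cost)
        cash[t] = max(cash[t - 1], sell)
        switch_to_cash[t] = sell > cash[t - 1]
        buy = cash[t - 1] * (1.0 - cost) / prices[t]
        coin[t] = max(coin[t - 1], buy)
        switch_to_coin[t] = buy > coin[t - 1]
    # backtrack the optimal state sequence; state changes mark segment borders
    pos = np.zeros(n, dtype=int)
    state = 1 if coin[-1] * prices[-1] > cash[-1] else 0
    for t in range(n - 1, 0, -1):
        pos[t] = state
        if state == 0 and switch_to_cash[t]:
            state = 1
        elif state == 1 and switch_to_coin[t]:
            state = 0
    pos[0] = state
    return pos

prices = np.array([1.0, 1.1, 1.4, 1.3, 0.9, 0.8, 1.0, 1.3])
print(optimal_positions(prices, cost=0.01))  # sells at the peak, buys at the valley
```

The switch instants of the returned position sequence are exactly the kind of peak-and-valley segmentation points the abstract describes.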

Keywords: time series segmentation, model-free, trading-inspired, multivariate data

Procedia PDF Downloads 136
6429 Pyramidal Lucas-Kanade Optical Flow Based Moving Object Detection in Dynamic Scenes

Authors: Hyojin Lim, Cuong Nguyen Khac, Yeongyu Choi, Ho-Youl Jung

Abstract:

In this paper, we propose a simple moving object detection method based on motion vectors obtained from pyramidal Lucas-Kanade optical flow. The proposed method detects moving objects such as pedestrians, other vehicles, and obstacles in front of the host vehicle, and it can provide a warning to the driver. Motion vectors are obtained using pyramidal Lucas-Kanade optical flow, and outliers are eliminated by comparing the amplitude of each vector with a pre-defined threshold value. The background model is obtained by calculating the mean and variance of the amplitudes of recent motion vectors in rectangular local regions called cells, and it serves as the reference for classifying motion vectors as belonging to moving objects or to the background. Motion vectors are then clustered into rectangular regions using the unsupervised K-means algorithm, and groups whose rectangle centers lie close together are merged in a labeling step. Through simulations of four scenarios, in which a motorbike, a vehicle, and pedestrians approach the host vehicle, we show that the proposed method is simple but efficient for moving object detection in parking lots.
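
The thresholding and clustering stages can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' implementation: in practice the flow vectors would come from a pyramidal Lucas-Kanade routine (e.g., OpenCV's `calcOpticalFlowPyrLK`), and here they are supplied as plain arrays.

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's K-means (unsupervised) on point coordinates."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(iters):
        d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

def moving_object_boxes(positions, flows, min_amp=1.0, k=2):
    """Keep vectors whose amplitude exceeds the threshold (near-static
    background vectors are discarded), cluster the surviving feature
    positions with K-means, and return one bounding box
    [xmin, ymin, xmax, ymax] per cluster."""
    amp = np.linalg.norm(flows, axis=1)
    pts = positions[amp > min_amp].astype(float)
    labels, _ = kmeans(pts, k)
    return sorted(np.concatenate([pts[labels == j].min(0),
                                  pts[labels == j].max(0)]).tolist()
                  for j in range(k))
```

Each returned box approximates one moving object; the paper's extra cell-based background model would further suppress vectors consistent with ego-motion.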

Keywords: moving object detection, dynamic scene, optical flow, pyramidal optical flow

Procedia PDF Downloads 349
6428 Kinetic and Mechanistic Study on the Degradation of Typical Pharmaceutical and Personal Care Products in Water by Using Carbon Nanodots/C₃N₄ Composite and Ultrasonic Irradiation

Authors: Miao Yang

Abstract:

PPCPs (pharmaceutical and personal care products) in water have become an environmental pollutant of increasing concern, and techniques for degrading PPCPs are therefore a hotspot in the water pollution control field. Since common degradation techniques have several disadvantages, such as low degradation efficiency for certain PPCPs (e.g., ibuprofen and carbamazepine), this work adopts a combined technique using a CDs (carbon nanodots)/C₃N₄ composite and ultrasonic irradiation to mitigate or overcome these shortcomings. A significant open scientific problem in the study of PPCPs degradation is that the mechanism involving the PPCPs, the major reactants, and the interfacial active sites is not yet clear. This work aims to solve this problem using both theoretical and experimental methodologies. First, optimized parameters will be obtained by evaluating the kinetics and oxidation efficiency under different conditions. The competition between H₂O₂ and PPCPs for HO• will be elucidated, after which the degradation mechanism of PPCPs under the synergy of the CDs/C₃N₄ composite and ultrasonic irradiation will be proposed. Finally, a sonolysis-adsorption-catalysis coupling mechanism will be established, providing the theoretical basis and technical support for developing new, efficient degradation techniques for PPCPs.

Keywords: carbon nanodots/C₃N₄, pharmaceutical and personal care products, ultrasonic irradiation, hydroxyl radical, heterogeneous catalysis

Procedia PDF Downloads 180
6427 An Application of Modified M-out-of-N Bootstrap Method to Heavy-Tailed Distributions

Authors: Hannah F. Opayinka, Adedayo A. Adepoju

Abstract:

This study is an extension of a prior study on the modification of the existing m-out-of-n (moon) bootstrap method for heavy-tailed distributions, in which the modified m-out-of-n (mmoon) bootstrap was proposed as an alternative to the existing moon technique. In this study, both the moon and mmoon techniques were applied to two real income datasets, which followed lognormal and Pareto distributions, respectively, with finite variances. The performances of the two techniques were compared using Standard Error (SE) and Root Mean Square Error (RMSE). The findings showed that mmoon outperformed the moon bootstrap in terms of smaller SEs and RMSEs for all the sample sizes considered in the two datasets.
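
For reference, the standard moon bootstrap that the paper modifies can be sketched as follows: resample m < n observations with replacement and rescale the spread of the statistic back to the full sample size. This is a minimal sketch of the baseline technique only; the mmoon modification is not specified in the abstract and is not reproduced here.

```python
import numpy as np

def moon_bootstrap_se(data, stat=np.mean, m=None, n_boot=2000, seed=0):
    """m-out-of-n ("moon") bootstrap estimate of the standard error of
    `stat`: resample m < n values with replacement, then rescale the
    spread of the replicates by sqrt(m/n) back to sample size n."""
    rng = np.random.default_rng(seed)
    n = len(data)
    m = m or n // 4
    reps = np.array([stat(rng.choice(data, size=m, replace=True))
                     for _ in range(n_boot)])
    return reps.std(ddof=1) * np.sqrt(m / n)

# illustrative heavy-tailed sample, loosely mimicking the lognormal income data
rng = np.random.default_rng(42)
income = rng.lognormal(mean=10.0, sigma=1.0, size=500)
se = moon_bootstrap_se(income, np.mean, m=125)
```

For a finite-variance statistic like the mean, the rescaled moon estimate should land close to the classical standard error of the mean, which makes the rescaling factor easy to sanity-check.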

Keywords: bootstrap, income data, lognormal distribution, Pareto distribution

Procedia PDF Downloads 186
6426 Innovative Acoustic Emission Techniques for Concrete Health Monitoring

Authors: Rahmat Ali, Beenish Khan, Aftabullah, Abid A. Shah

Abstract:

This research investigates a wide range of events using acoustic emission (AE) sensors on concrete cubes subjected to different stress conditions during loading and unloading. A total of 27 specimens were prepared and tested, including 18 cubic (6"x6"x6") and nine cylindrical (4"x8") specimens molded from three batches of concrete with w/c ratios of 0.40, 0.50, and 0.60. The compressive strength of the concrete was determined from the cylindrical specimens. The deterioration of the concrete was evaluated using the occurrence of the Felicity and Kaiser effects at each stress condition. It was found that the number of acoustic emission hits usually increased as damage increased. Additionally, the correlation between the AE measurements and the applied load was determined by plotting normalized values. The influence of the w/c ratio on the sensitivity of the AE technique in detecting concrete damage was also investigated.

Keywords: acoustic emission, concrete, felicity ratio, sensors

Procedia PDF Downloads 362
6425 Recommender Systems Using Ensemble Techniques

Authors: Yeonjeong Lee, Kyoung-jae Kim, Youngtae Kim

Abstract:

This study proposes a novel recommender system that uses data mining and multi-model ensemble techniques to enhance recommendation performance by precisely reflecting the user's preferences. The proposed model consists of two steps. In the first step, this study uses logistic regression, decision trees, and artificial neural networks to predict customers who have a high likelihood of purchasing products in each product group, and then combines the results of each predictor using multi-model ensemble techniques such as bagging and bumping. In the second step, this study uses market basket analysis to extract association rules for co-purchased products. Finally, the system selects customers who have a high likelihood of purchasing products in each product group and recommends appropriate products from the same or different product groups through the above two steps. We test the usability of the proposed system using a prototype and real-world transaction and profile data. In addition, we survey user satisfaction with the recommended product lists from the proposed system and with randomly selected product lists. The results show that the proposed system may be useful in real-world online shopping stores.
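
The market basket step can be illustrated with a minimal pair-rule miner: count item and item-pair frequencies over transactions, then keep rules A → B whose support and confidence clear the chosen thresholds. This sketch is illustrative only (the paper does not publish its implementation), and the thresholds and item names below are made up for the example.

```python
from itertools import combinations
from collections import Counter

def pair_rules(transactions, min_support=0.3, min_confidence=0.6):
    """Mine association rules A -> B over item pairs.
    support(A,B) = freq of transactions containing both items;
    confidence(A -> B) = support(A,B) / freq of transactions with A."""
    n = len(transactions)
    item_count = Counter()
    pair_count = Counter()
    for t in transactions:
        items = set(t)
        item_count.update(items)
        pair_count.update(frozenset(p) for p in combinations(sorted(items), 2))
    rules = []
    for pair, c in pair_count.items():
        support = c / n
        if support < min_support:
            continue
        for a in pair:
            b = next(iter(pair - {a}))
            confidence = c / item_count[a]
            if confidence >= min_confidence:
                rules.append((a, b, round(support, 2), round(confidence, 2)))
    return sorted(rules)

transactions = [["milk", "bread"], ["milk", "bread", "butter"],
                ["milk"], ["bread", "butter"]]
rules = pair_rules(transactions)
```

In the full system, rules like these would be applied to the customers short-listed by the ensemble predictors in the first step.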

Keywords: product recommender system, ensemble technique, association rules, decision tree, artificial neural networks

Procedia PDF Downloads 294
6424 An Energy Transfer Fluorescent Probe System for Glucose Sensor at Biomimetic Membrane Surface

Authors: Hoa Thi Hoang, Stephan Sass, Michael U. Kumke

Abstract:

Concanavalin A (conA) is a protein that has been widely used in sensor systems based on its specific binding to α-D-glucose or α-D-mannose. Glucose sensors using conA typically rely on fluorescence techniques, either intensity based or lifetime based. In this research, liposomes made from phospholipids were used as a biomimetic membrane system. In a first step, novel building blocks containing perylene-labeled glucose units were added to the system and used to decorate the surface of the liposomes. Upon binding of rhodamine-labeled conA to the glucose units at the biomimetic membrane surface, a Förster resonance energy transfer (FRET) system is formed that combines the unique fluorescence properties of perylene (e.g., high fluorescence quantum yield, no triplet formation) with its high hydrophobicity, which anchors it efficiently in membranes, yielding a novel probe for the investigation of sugar-driven binding reactions at biomimetic surfaces. Two glucose-labeled perylene derivatives were synthesized with different spacer lengths between the perylene and glucose units in order to probe the binding of conA. The binding interaction was fully characterized using high-end fluorescence techniques: steady-state and time-resolved methods (e.g., fluorescence depolarization) in combination with single-molecule fluorescence spectroscopy (fluorescence correlation spectroscopy, FCS) were used to monitor the interaction with conA. Based on the fluorescence depolarization, the rotational correlation times, and the alteration in the diffusion coefficient (determined by FCS), the binding of conA to the liposomes carrying the probe was studied. Moreover, single-pair FRET experiments using pulsed interleaved excitation were used to characterize the binding of conA to the liposomes at the single-molecule level, avoiding averaging-out effects.

Keywords: concanavalin A, FRET, sensor, biomimetic membrane

Procedia PDF Downloads 307
6423 A Clustering Algorithm for Massive Texts

Authors: Ming Liu, Chong Wu, Bingquan Liu, Lei Chen

Abstract:

Internet users face massive amounts of textual data every day. Organizing texts into categories can help users dig useful information out of large-scale text collections, and clustering is one of the most promising tools for categorizing texts because it is unsupervised. Unfortunately, most traditional clustering algorithms lose quality on large-scale text collections, mainly because of the high-dimensional vectors generated from texts. To cluster large-scale text collections effectively and efficiently, this paper proposes a vector-reconstruction-based clustering algorithm in which only the features that represent a cluster are preserved in that cluster's representative vector. The algorithm alternates between two sub-processes until it converges. One is the partial tuning sub-process, in which feature weights are fine-tuned iteratively; to accelerate clustering, an intersection-based similarity measurement and a corresponding neuron adjustment function are proposed and implemented in this sub-process. The other is the overall tuning sub-process, in which features are reallocated among the clusters and features useless for representing a cluster are removed from its representative vector. Experimental results on three text collections (two small-scale and one large-scale) demonstrate that the algorithm achieves high quality on both small-scale and large-scale text collections.
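
The appeal of an intersection-based similarity is that it only touches features shared by a document and a cluster's (pruned) representative vector, so comparisons stay cheap even with a huge vocabulary. The abstract does not give the exact formula, so the measure below is one plausible form over sparse term-frequency dictionaries, stated purely as an illustration.

```python
def intersection_similarity(doc, rep):
    """Jaccard-style similarity over sparse term-frequency dicts that
    iterates only over the features the two vectors share; disjoint
    vectors score 0 without any full-vocabulary pass."""
    shared = doc.keys() & rep.keys()
    if not shared:
        return 0.0
    overlap = sum(min(doc[f], rep[f]) for f in shared)
    total = sum(doc.values()) + sum(rep.values()) - overlap
    return overlap / total
```

Because cluster representatives keep only their most representative features, the shared-key set stays small and the measure scales to large collections.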

Keywords: vector reconstruction, large-scale text clustering, partial tuning sub-process, overall tuning sub-process

Procedia PDF Downloads 435
6422 Role of Natural Language Processing in Information Retrieval; Challenges and Opportunities

Authors: Khaled M. Alhawiti

Abstract:

This paper analyzes the role of natural language processing (NLP) in the context of automated data retrieval, automated question answering, and text structuring. NLP techniques are gaining wider acceptance in real-life applications and industrial settings, but processing natural language text involves many complexities that must be addressed before it can satisfy the needs of decision makers. The paper begins by describing the qualities of NLP practice, then focuses on the challenges in natural language processing and discusses its major techniques. The last section describes opportunities and challenges for future research.

Keywords: data retrieval, information retrieval, natural language processing, text structuring

Procedia PDF Downloads 341
6421 Urban Analysis of the Old City of Oran and Its Building after an Earthquake

Authors: A. Zatir, A. Mokhtari, A. Foufa, S. Zatir

Abstract:

The city of Oran, like any other region of northern Algeria, is subject to frequent seismic activity. The study presented in this work is based on an analysis of the urban and architectural context of the city of Oran before the 1790 earthquake, from which it deduces the differences between the old city before and after the earthquake. A specific objective of the analysis is to tap into the seismic history of the city of Oran in parallel with its urban history. The example of the citadel of Oran indicates that constructions on the site of the old citadel may contain elements of resistance against seismic effects. In-situ observations of these structures showed the ingenuity of the techniques used by the ancient builders, including the good performance of domes and arches in resisting seismic forces.

Keywords: earthquake, citadel, performance, traditional techniques, constructions

Procedia PDF Downloads 305
6420 Nanomaterial Based Electrochemical Sensors for Endocrine Disrupting Compounds

Authors: Gaurav Bhanjana, Ganga Ram Chaudhary, Sandeep Kumar, Neeraj Dilbaghi

Abstract:

The main sources of endocrine disrupting compounds in the ecosystem are hormones, pesticides, phthalates, flame retardants, dioxins, personal-care products, coplanar polychlorinated biphenyls (PCBs), bisphenol A, and parabens. These compounds are responsible for learning disabilities, brain development problems, deformations of the body, cancer, reproductive abnormalities in females, and decreased sperm counts in males. Although the discharge of these chemical compounds into the environment cannot be stopped, their amounts can be reduced through proper evaluation and detection techniques. The available techniques for determining these endocrine disrupting compounds mainly include high performance liquid chromatography (HPLC), mass spectrometry (MS), and gas chromatography-mass spectrometry (GC-MS). These techniques are accurate and reliable but have certain limitations: they require skilled personnel and pretreatment steps, are time consuming, suffer from interference, are laboratory bound, and need large sample amounts for analysis. In view of these facts, new methods for detecting endocrine disrupting compounds should be devised that promise high specificity, ultra-sensitivity, cost effectiveness, efficiency, and easy-to-operate procedures. Nowadays, electrochemical sensors and biosensors modified with nanomaterials are gaining considerable attention among researchers. The bio-element in such a system makes the developed sensor selective toward the analyte of interest, while nanomaterials provide a large surface area, good electron communication, enhanced catalytic activity, and possibilities for chemical modification. In many cases, nanomaterials also serve as an electron mediator or electrocatalyst for the analyte.

Keywords: electrochemical, endocrine disruptors, microscopy, nanoparticles, sensors

Procedia PDF Downloads 273
6419 Enhanced Calibration Map for a Four-Hole Probe for Measuring High Flow Angles

Authors: Jafar Mortadha, Imran Qureshi

Abstract:

This research explains and compares modern techniques for measuring the flow angles of a flowing fluid with the traditional technique of using multi-hole pressure probes. In particular, the focus of the study is on four-hole probes, which offer great reliability and benefits in several applications where the use of modern measurement techniques is either inconvenient or impractical. Thanks to modern advances in manufacturing, small multi-hole pressure probes can be made with high precision, which eliminates the need to calibrate every manufactured probe. This study aims to improve the range of calibration maps for a four-hole probe so that high flow angles can be measured accurately. The research methodology comprises a literature review of calibration definitions that have been successfully implemented on five-hole probes; these definitions are then adapted and applied to a four-hole probe using a set of raw pressure data. The different definitions are compared in MATLAB, and the results are analyzed to determine the best calibration definition. Taking into account both simplicity of implementation and reliability of flow angle estimation, a technique adapted from a research paper written in 2002 offered the most promising outcome. Consequently, the method is seen as a good enhancement for four-hole probes, and it can substitute for existing calibration definitions that offer less accuracy.
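
To give a flavor of what a "calibration definition" is: a hedged sketch of one possible set of non-dimensional coefficients for a four-hole probe (tip hole 1, peripheral holes 2-4), adapted from common five-hole-probe practice. These particular formulas are an assumption for illustration, not the definitions evaluated in the paper.

```python
def calibration_coefficients(p1, p2, p3, p4):
    """Non-dimensional pitch and yaw coefficients for a four-hole probe.
    The peripheral-hole average acts as a reference, and the tip-minus-
    average difference as a pseudo-dynamic pressure. Illustrative
    definitions only (an assumption, adapted from five-hole practice)."""
    p_bar = (p2 + p3 + p4) / 3.0   # mean peripheral pressure
    denom = p1 - p_bar             # pseudo-dynamic pressure
    c_pitch = (p2 - (p3 + p4) / 2.0) / denom
    c_yaw = (p3 - p4) / denom
    return c_pitch, c_yaw

cp, cy = calibration_coefficients(100.0, 60.0, 55.0, 50.0)
```

During calibration, such coefficients are tabulated against known flow angles; in use, the measured coefficients are inverted through the calibration map to recover the unknown angles. Definitions differ mainly in the choice of denominator, which is what limits the usable angle range.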

Keywords: calibration definitions, calibration maps, flow measurement techniques, four-hole probes, multi-hole pressure probes

Procedia PDF Downloads 295
6418 Optimal Classifying and Extracting Fuzzy Relationship from Query Using Text Mining Techniques

Authors: Faisal Alshuwaier, Ali Areshey

Abstract:

Text mining techniques are generally applied to classify text and to find fuzzy relations and structures in data sets, and they provide a wealth of capabilities. One common application is text classification and event extraction, which involves deducing specific knowledge about incidents referred to in texts. The main contribution of this paper is the clarification of a concept graph generation mechanism based on text classification and optimal fuzzy relationship extraction. Furthermore, the paper explains the application of fuzzy relationship extraction and the branch and bound method to simplify texts.
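
The "max-prod" keyword refers to the max-product composition of fuzzy relations, which chains membership matrices (e.g., term-to-concept and concept-to-class) into a single relation. A minimal NumPy sketch of that standard operation, with made-up membership values:

```python
import numpy as np

def max_prod(R, S):
    """Max-product composition of fuzzy relations R (n x m) and S (m x p):
    T[i, k] = max over j of R[i, j] * S[j, k]."""
    return (R[:, :, None] * S[None, :, :]).max(axis=1)

R = np.array([[0.8, 0.3],   # e.g., memberships of 2 terms in 2 concepts
              [0.2, 0.9]])
S = np.array([[0.5, 0.7],   # e.g., memberships of 2 concepts in 2 classes
              [0.6, 0.1]])
T = max_prod(R, S)          # term-to-class fuzzy relation
```

Each entry of the composed relation keeps the strongest product chain through the intermediate set, which is what makes the composition useful for building concept graphs.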

Keywords: extraction, max-prod, fuzzy relations, text mining, memberships, classification

Procedia PDF Downloads 582
6417 Analysis of Supply Chain Risk Management Strategies: Case Study of Supply Chain Disruptions

Authors: Marcelo Dias Carvalho, Leticia Ishikawa

Abstract:

Supply chain risk management refers to a set of strategies that companies use to avoid supply chain disruption caused by damage to production facilities, natural disasters, capacity issues, inventory problems, incorrect forecasts, and delays. Many companies use the techniques of the Toyota Production System, which in some ways works against better management of supply chain risks. This paper studies key events at several multinationals to analyze the trade-off between the best supply chain risk management techniques and management policies designed to create lean enterprises. The result of a good balance between these actions is a reduction of losses, increased customer trust in the company, and better preparedness to face the general risks of a supply chain.

Keywords: just in time, lean manufacturing, supply chain disruptions, supply chain management

Procedia PDF Downloads 338
6416 Gan Nanowire-Based Sensor Array for the Detection of Cross-Sensitive Gases Using Principal Component Analysis

Authors: Ashfaque Hossain Khan, Brian Thomson, Ratan Debnath, Abhishek Motayed, Mulpuri V. Rao

Abstract:

Despite previous efforts, the problem of cross-sensitivity for a single metal-oxide-based sensor cannot be fully eliminated. In this work, a sensor array comprising platinum (Pt)-, copper (Cu)-, and silver (Ag)-decorated TiO2- and ZnO-functionalized GaN nanowires has been designed and fabricated using an industry-standard top-down fabrication approach. The metal/metal-oxide combinations within the array were determined in a prior molecular simulation study using first-principles calculations based on density functional theory (DFT). Gas responses were obtained for both single gases and mixtures of NO2, SO2, ethanol, and H2 in the presence of H2O and O2 under UV light at room temperature. Each gas leaves a unique response footprint across the array's sensors, by which precise discrimination of cross-sensitive gases has been achieved. An unsupervised principal component analysis (PCA) technique was applied to the array response. The results indicate that each target gas, and each of their mixtures, forms a distinct cluster in the score plot, indicating clear separation among them. In addition, the developed array device consumes very little power because of its ultraviolet (UV)-assisted sensing, compared to commercially available metal-oxide sensors. The nanowire sensor array, in combination with PCA, is a promising approach for precise real-time gas monitoring applications.
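
The PCA step is standard and easy to sketch: mean-center the array's response matrix (rows = gas exposures, columns = sensors) and project onto the leading principal components obtained from an SVD. The data below are invented for illustration; the real study uses measured responses.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project mean-centered responses onto the leading principal
    components (via SVD). Unsupervised: no gas labels are used, yet
    exposures to the same gas should cluster together in the score plot."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# toy response matrix: two repeat exposures each for two "gases"
X = np.array([[1.0, 0.1, 0.2],
              [1.1, 0.1, 0.2],
              [0.1, 1.0, 0.9],
              [0.2, 1.0, 0.9]])
scores = pca_scores(X)
```

Points belonging to the same gas land close together in the 2-D score plot while different gases separate, which is exactly the clustering behavior the abstract reports.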

Keywords: cross-sensitivity, gas sensor, principal component analysis (PCA), sensor array

Procedia PDF Downloads 107
6415 Development of Evolutionary Algorithm by Combining Optimization and Imitation Approach for Machine Learning in Gaming

Authors: Rohit Mittal, Bright Keswani, Amit Mithal

Abstract:

This paper gives a sense of how computational intelligence techniques are used to develop computer games, especially car racing games. For a deeper understanding of the artificial intelligence involved, the paper is divided into sections on optimization, imitation, innovation, and a combined approach of optimization and imitation. The paper is mainly concerned with the combined approach, which covers different aspects of using fitness measures and supervised learning techniques to imitate aspects of behavior. The main achievement of this paper is the modeling of player behavior and the evolution of new game content, such as racing tracks, for single-car racing on a single track.
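
The optimization side of such work typically rests on an evolutionary loop: score a population of candidate parameter vectors (e.g., controller or track parameters) with a fitness function, keep the best, and refill with mutated copies. The sketch below is a generic (mu + lambda) evolution strategy with an invented toy fitness, not the algorithm from the paper.

```python
import random

def evolve(fitness, dim, pop_size=20, generations=60, sigma=0.3, seed=1):
    """Minimal (mu + lambda) evolution strategy: keep the best half of the
    population each generation and refill it with Gaussian mutations of
    the survivors; the best individual is never lost (elitism)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [[g + rng.gauss(0, sigma) for g in p]
                         for p in parents]
    return max(pop, key=fitness)

# toy stand-in for "track performance": the ideal parameters are all 0.5
best = evolve(lambda p: -sum((g - 0.5) ** 2 for g in p), dim=3)
```

In the combined approach, the fitness measure would itself be informed by supervised models of recorded player behavior, steering evolution toward human-like content.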

Keywords: evolution algorithm, genetic, optimization, imitation, racing, innovation, gaming

Procedia PDF Downloads 646
6414 Synthesis and Characterization of Hydroxyapatite from Biowaste for Potential Medical Application

Authors: M. D. H. Beg, John O. Akindoyo, Suriati Ghazali, Nitthiyah Jeyaratnam

Abstract:

Over time, several approaches have been undertaken to mitigate the challenges associated with bone regeneration. These include, but are not limited to, xenografts, allografts, and autografts, as well as artificial substitutes like bioceramics, synthetic cements, and metals. The first three techniques often come with peculiar limitations and problems, such as morbidity, limited availability, disease transmission, collateral site damage, or outright rejection by the body. Synthetic routes thus remain the only feasible alternative for the treatment of bone defects, and hydroxyapatite (HA) is very compatible and suitable for this application. However, most common methods for HA synthesis are expensive, complicated, or environmentally unfriendly. Interestingly, extraction of HA from bio-waste is perceived not only as cost effective but also as environmentally friendly. In this research, HA was synthesized from a bio-waste, namely bovine bones, by three different methods: a hydrothermal chemical process, ultrasound-assisted synthesis, and ordinary calcination. The structure and properties of the HA were analyzed by characterization techniques such as TGA, FTIR, and XRD. All the methods produced HA with compositional properties similar to the biomaterials found in human calcified tissues. The calcination process, however, was observed to be more efficient, as it eliminated all organic components from the produced HA. The synthesized HA is notable for its minimal cost and environmental friendliness, and it is considered suitable for tissue and bone engineering applications.

Keywords: hydroxyapatite, bone, calcination, biowaste

Procedia PDF Downloads 249
6413 Rail-To-Rail Output Op-Amp Design with Negative Miller Capacitance Compensation

Authors: Muhaned Zaidi, Ian Grout, Abu Khari bin A’ain

Abstract:

In this paper, a two-stage op-amp design is considered using both Miller and negative Miller compensation techniques. The first op-amp design uses Miller compensation around the second amplification stage, whilst the second design uses negative Miller compensation around the first stage and Miller compensation around the second amplification stage. The aims of this work were to compare the gain and phase margins obtained using the different compensation techniques and to identify the ability to choose either compensation technique based on a particular set of design requirements. The two op-amp designs created are based on the same two-stage rail-to-rail-output CMOS op-amp architecture, in which the first stage consists of differential input and cascode circuits and the second stage is a class AB amplifier. The op-amps have been designed using a 0.35 µm CMOS fabrication process.
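
The difference between the two techniques comes down to the Miller effect: a capacitor C bridging a stage with voltage gain A appears at the input as C_in = C(1 - A). A tiny numeric sketch (illustrative values, not from the paper's design):

```python
def miller_input_capacitance(c_bridge, gain):
    """Effective input capacitance of a capacitor bridging the input and
    output of a voltage-gain stage: C_in = C * (1 - A). An inverting
    stage (A < 0) magnifies the capacitance (classic Miller
    compensation); a node with A > 1 yields a negative value, the basis
    of negative Miller compensation."""
    return c_bridge * (1.0 - gain)

# Inverting second stage, A = -100: a 1 pF bridge looks like 101 pF,
# which is what splits the poles in conventional Miller compensation.
c_conv = miller_input_capacitance(1e-12, -100.0)
# Bridging from a node with A = +2 gives -1 pF: a negative capacitance
# that can cancel parasitic input capacitance instead of adding to it.
c_neg = miller_input_capacitance(1e-12, 2.0)
```

This is why the two placements trade off differently: the conventional scheme buys stability through pole splitting, while the negative-Miller scheme around the first stage reduces effective node capacitance and can improve bandwidth.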

Keywords: op-amp, rail-to-rail output, Miller compensation, negative Miller capacitance

Procedia PDF Downloads 338
6412 Image Processing Techniques for Surveillance in Outdoor Environment

Authors: Jayanth C., Anirudh Sai Yetikuri, Kavitha S. N.

Abstract:

This paper explores the development and application of computer vision and machine learning techniques for real-time pose detection, facial recognition, and number plate extraction. Utilizing MediaPipe for pose estimation, the research presents methods for detecting hand raises and ducking postures through real-time video analysis. Complementarily, facial recognition is employed to compare and verify individual identities using the face recognition library. Additionally, the paper demonstrates a robust approach for extracting and storing vehicle number plates from images, integrating Optical Character Recognition (OCR) with a database management system. The study highlights the effectiveness and versatility of these technologies in practical scenarios, including security and surveillance applications. The findings underscore the potential of combining computer vision techniques to address diverse challenges and enhance automated systems for both individual and vehicular identification. This research contributes to the fields of computer vision and machine learning by providing scalable solutions and demonstrating their applicability in real-world contexts.
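
The hand-raise and ducking detections described above reduce to geometric tests on pose-landmark coordinates. A minimal sketch, assuming normalized image coordinates in which y increases downward (as MediaPipe uses); the threshold value is an illustrative assumption, not one reported in the paper:

```python
# Illustrative posture rules on normalized landmark coordinates (y grows
# downward). The 0.15 ducking threshold is an assumption for illustration.

def is_hand_raised(wrist_y: float, shoulder_y: float) -> bool:
    """A hand counts as raised when the wrist sits above the shoulder."""
    return wrist_y < shoulder_y

def is_ducking(nose_y: float, baseline_nose_y: float, drop: float = 0.15) -> bool:
    """Ducking: the head has dropped by more than `drop` below its baseline."""
    return nose_y - baseline_nose_y > drop

print(is_hand_raised(wrist_y=0.30, shoulder_y=0.45))  # wrist above shoulder
print(is_ducking(nose_y=0.60, baseline_nose_y=0.40))  # head dropped by 0.20
```

In a live pipeline, the wrist, shoulder, and nose coordinates would come from the per-frame pose landmarks rather than constants.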

Keywords: computer vision, pose detection, facial recognition, number plate extraction, machine learning, real-time analysis, OCR, database management

Procedia PDF Downloads 26
6411 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods

Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin

Abstract:

Nowadays, the data center industry faces strong challenges in increasing speed and data-processing capacity while keeping its devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of such facilities use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques, or improving existing ones, would be a great advance for this industry. Installing a matrix of temperature sensors distributed throughout the structure of each server would provide the data required to obtain an instantaneous temperature profile inside it. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high, and the approach is expensive. Therefore, less intrusive techniques are employed, in which each point characterizing the server temperature profile is obtained by solving differential equations through simulation, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers' equations by backward, forward, and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first- and second-order derivatives of the resulting Burgers' equation are the key to obtaining results with greater or lesser accuracy, subject to the characteristic truncation error of each scheme.
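
The backward (upwind) and central discretizations named above can be sketched for the 1-D viscous Burgers' equation u_t + u u_x = ν u_xx with explicit time stepping. This is a minimal illustration; the grid size, viscosity, time step, and initial profile are assumptions, not parameters from the paper:

```python
# Explicit finite-difference step for the 1-D viscous Burgers' equation:
# backward (upwind) differencing for the convective term u*u_x (valid for
# u >= 0) and central differencing for the diffusive term nu*u_xx.
import math

def burgers_step(u, dx, dt, nu):
    new = u[:]                                   # boundary values held fixed
    for i in range(1, len(u) - 1):
        conv = u[i] * (u[i] - u[i - 1]) / dx                      # backward
        diff = nu * (u[i + 1] - 2 * u[i] + u[i - 1]) / dx ** 2    # central
        new[i] = u[i] + dt * (diff - conv)       # forward Euler in time
    return new

# Illustrative run: a smooth initial profile on [0, 1]
nx, dx, dt, nu = 41, 1.0 / 40, 1e-4, 0.05
u = [math.sin(math.pi * i * dx) for i in range(nx)]
for _ in range(100):
    u = burgers_step(u, dx, dt, nu)
print(max(u))  # peak decays below its initial value of 1.0 under viscosity
```

The explicit scheme requires dt small enough for the usual diffusion and CFL limits; the choice of backward versus forward differencing for the convective term is what introduces the direction-dependent truncation error the abstract refers to.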

Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile

Procedia PDF Downloads 169
6410 Determination of Complexity Level in Merged Irregular Transposition Cipher

Authors: Okike Benjamin, Garba Ejd

Abstract:

Today, it is observed that the security of information along the information superhighway is often compromised by those who are not authorized to access it. To ensure its security, such information should be encrypted to conceal its real meaning. Many encryption techniques are available on the market; however, some of them are easily decrypted by adversaries. The researcher has therefore developed an encryption technique intended to be more difficult to decrypt. This is achieved by splitting the message to be encrypted into parts, encrypting each part separately, and swapping the positions of the parts before transmitting the message along the superhighway. The method is termed the Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
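
The split-transpose-swap structure described above can be sketched as follows. The abstract does not fix the per-part transposition or the swapping key, so both (a simple reversal, and the permutation [2, 0, 1]) are illustrative assumptions; the sketch also assumes the message length is a multiple of the number of parts:

```python
# Sketch of a "merged irregular transposition": split the plaintext into
# parts, transpose each part (here: reverse it), then swap the parts
# according to a key permutation. Transposition choice and key are
# illustrative assumptions, not the paper's exact scheme.

def encrypt(message: str, n_parts: int, key: list) -> str:
    assert len(message) % n_parts == 0, "pad the message to a multiple first"
    size = len(message) // n_parts
    parts = [message[i * size:(i + 1) * size] for i in range(n_parts)]
    transposed = [p[::-1] for p in parts]        # transpose each part
    return "".join(transposed[k] for k in key)   # swap part positions

def decrypt(cipher: str, n_parts: int, key: list) -> str:
    size = len(cipher) // n_parts
    chunks = [cipher[i * size:(i + 1) * size] for i in range(n_parts)]
    parts = [""] * n_parts
    for pos, k in enumerate(key):                # undo the part swap
        parts[k] = chunks[pos]
    return "".join(p[::-1] for p in parts)       # undo each transposition

c = encrypt("ATTACKATDAWN", 3, key=[2, 0, 1])
print(c)                           # 'NWADATTATAKC'
print(decrypt(c, 3, key=[2, 0, 1]))  # restores 'ATTACKATDAWN'
```

The complexity question the abstract raises corresponds to how the keyspace grows with the number of splits: with k parts there are k! part orderings multiplied by the transposition choices within each part.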

Keywords: transposition cipher, merged irregular cipher, encryption, complexity level

Procedia PDF Downloads 344
6409 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards

Authors: Golnush Masghati-Amoli, Paul Chin

Abstract:

In recent years, with the rapid increases in data availability and computing power, Machine Learning (ML) techniques have been adopted across a range of industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited by a particular challenge: numerous regulations require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable more accurate prediction, they are adopted less frequently in commercial banking than in other industries, especially for scoring purposes, because they are often considered black boxes that fail to explain why a certain risk score is given to a customer. In order to bridge this gap between the explainability and performance of Machine Learning techniques, a Hybrid Model has been developed at Dun and Bradstreet that blends Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to incorporate domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns that scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables, and feature interactions that demonstrate high information value in the prediction of customer risk are identified. These features are then employed to introduce the observed non-linear relationships between the explanatory and dependent variables into traditional scorecards. Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model matches the score distribution generated by a Machine Learning algorithm, which yields an estimate of the WoE for each bin. This capability helps to build powerful scorecards in sparse cases that cannot be handled with traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The analysis shows that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables drawn from Machine Learning models. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while remaining as transparent as traditional scorecards. It is therefore concluded that, with the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concerns about explaining the models for regulatory purposes.
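
The traditional WoE computation the Hybrid Model departs from can be stated compactly. A minimal sketch with made-up bin counts (the Hybrid Model instead back-solves WoE estimates from an ML score distribution, which is not shown here):

```python
# Traditional per-bin Weight of Evidence:
#   WoE_i = ln( (good_i / total_goods) / (bad_i / total_bads) )
# Positive WoE marks bins with a higher-than-average share of goods;
# negative WoE marks bins concentrated in bads. Bin counts are made up.
import math

def weight_of_evidence(goods, bads):
    """WoE per bin from counts of good and bad outcomes in each bin."""
    g_tot, b_tot = sum(goods), sum(bads)
    return [math.log((g / g_tot) / (b / b_tot)) for g, b in zip(goods, bads)]

woe = weight_of_evidence(goods=[100, 200, 700], bads=[50, 30, 20])
print([round(w, 4) for w in woe])  # [-1.6094, -0.4055, 1.2528]
```

Note the direct formula breaks down when a bin contains zero goods or bads, which is precisely the sparse-bin situation where matching an ML score distribution, as the abstract describes, remains usable.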

Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering

Procedia PDF Downloads 134
6408 The Study of the Strengths and Weaknesses of Various Techniques for Calculating the Volume of Work Done in Civil Projects

Authors: Ali Fazeli Moslehabadi

Abstract:

One of the topics discussed in civil projects, where continuous change in work volumes during execution is a typical characteristic, is how to calculate the volume of work done. The difference between the volumes announced by the execution unit and the volumes estimated by the technical office unit has a direct effect on the announced progress of the project. This issue can make the progress of the project appear greater or smaller than its actual value, thereby misleading stakeholders and project managers. This article introduces some practical methods for calculating the volume of work done in civil projects and then reviews the strengths and weaknesses of each, in order to resolve these contradictions and conflicts.

Keywords: technical skills, systemic skills, communication skills, done work volume calculation techniques

Procedia PDF Downloads 157
6407 The Influence of Temperature on the Corrosion and Corrosion Inhibition of Steel in Hydrochloric Acid Solution: Thermodynamic Study

Authors: Fatimah Al-Hayazi, Ehteram A. Noor, Aisha H. Moubaraki

Abstract:

The inhibitive effect of Securigera securidaca seed extract (SSE) on mild steel corrosion in 1 M HCl solution has been studied by weight loss and electrochemical techniques at four different temperatures. All techniques showed that the studied extract performs well at all temperatures and that its inhibitory action increases with increasing concentration. SEM images indicate thin-film formation on mild steel corroded in solutions containing 1 g L⁻¹ of inhibitor at both low and high temperatures. The polarization studies showed that SSE acts as an anodic inhibitor. Both polarization and impedance techniques show accelerating behaviour for SSE at concentrations ≤ 0.1 g L⁻¹ at all temperatures. At concentrations ≥ 0.1 g L⁻¹, the efficiency of SSE increases dramatically with concentration, and its value does not change appreciably with temperature. All adsorption data were found to obey the Temkin adsorption isotherm. Kinetic activation and thermodynamic adsorption parameters are evaluated and discussed. The results reveal an endothermic corrosion process with an associative activation mechanism, and a comprehensive adsorption mechanism for SSE on the mild steel surface is suggested, in which both physical and chemical adsorption are involved. A good correlation between the inhibitor constituents and their inhibitory action was obtained.
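
The weight-loss technique mentioned above conventionally quantifies inhibition as IE% = (W₀ − W)/W₀ × 100, with the surface coverage θ = IE/100 then fed into isotherm fitting. A sketch with hypothetical weight-loss values (not data from this study):

```python
# Standard weight-loss quantities used in corrosion-inhibition studies.
# The numeric weight losses below are hypothetical, for illustration only.

def inhibition_efficiency(w_blank: float, w_inhibited: float) -> float:
    """IE% from weight loss without (w_blank) and with (w_inhibited) inhibitor."""
    return (w_blank - w_inhibited) / w_blank * 100.0

def surface_coverage(ie_percent: float) -> float:
    """Degree of surface coverage theta, used when fitting adsorption isotherms."""
    return ie_percent / 100.0

ie = inhibition_efficiency(w_blank=25.0, w_inhibited=5.0)  # hypothetical mg losses
print(ie)                    # 80.0 -> 80 % inhibition
print(surface_coverage(ie))  # theta = 0.8
```

Plotting θ against ln(concentration) at each temperature is the usual route to the Temkin isotherm fit and the adsorption parameters the abstract evaluates.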

Keywords: corrosion, inhibition of steel, hydrochloric acid, thermodynamic study

Procedia PDF Downloads 100
6406 Aristotelian Techniques of Communication Used by Current Affairs Talk Shows in Pakistan for Creating Dramatic Effect to Trigger Emotional Relevance

Authors: Shazia Anwer

Abstract:

Current TV talk shows in Pakistan, especially those on domestic politics, follow Aristotelian techniques, including deductive reasoning, the three modes of persuasion, and guidelines for communication. The application of “approximate truth” is also seen when talk show presenters create doubts about political personalities or national issues. The mainstream media of Pakistan, a key carrier of narrative construction for the sake of national consensus on regional issues and extended public diplomacy, is failing this purpose. This paper highlights the Aristotelian communication methodology, its purposes and its limitations for serious discussion, and its connection to mistrust among the Pakistani population regarding fake or embedded, funded information. Data have been collected from three Pakistani TV talk shows and analysed by applying the Aristotelian communication method to highlight the core issues. The paper also elaborates that current media education fails to provide transparent techniques for training future journalists for meaningful, thought-provoking discussion; for this reason, it gives an overview of the HEC’s (Higher Education Commission) graduate-level mass communication syllabus for Pakistani universities. Ethos, logos, and pathos are the main components of TV talk shows, and as a result the educated audience is losing trust in the mainstream media, which eventually generates feelings of distrust and betrayal in society, because productions resemble the genre of drama rather than facts and analysis; the line between current affairs shows and infotainment has thus become blurred. In the last section, practical implications for improving the meaningfulness and transparency of TV talk shows are suggested by replacing the Aristotelian communication method with a cognitive semiotic communication approach.

Keywords: Aristotelian techniques of communication, current affairs talk shows, drama, Pakistan

Procedia PDF Downloads 204
6405 CAD Tool for Parametric Design Modification of Yacht Hull Surface Models

Authors: Shahroz Khan, Erkan Gunpinar, Kemal Mart

Abstract:

Recently, parametric design techniques have become a vital concept in the field of Computer Aided Design (CAD), providing the designer with a sophisticated platform for automating the design process efficiently. In these techniques, the design process starts by parameterizing the important features of the design model (typically the key dimensions) and implementing design constraints. The design constraints help to retain the overall shape of the model while its parameters are modified. However, initializing an appropriate number of design parameters and constraints is the crucial part of parametric design techniques, especially for complex surface models such as yacht hulls. This paper introduces a method for creating complex surface models suited to parametric design techniques, a method for defining the right number of parameters and their respective design constraints, and a system for implementing the design parameters in accordance with the design constraint schema. In the proposed approach, the design process starts by dividing the yacht hull into three sections. Each section consists of different shape lines, which together form the overall shape of the yacht hull. The shape lines are created using cubic Bezier curves, which allow greater design flexibility. Design parameters and constraints are defined on the shape lines in the 3D design space to facilitate better, individual handling of parameters by the designer. Afterwards, shape modifiers are developed, which allow the modification of each parameter while satisfying the respective set of criteria and design constraints: geometric continuity should be maintained between the shape lines of the three sections, fairness of the hull surfaces should be preserved after modification, and during design modification the effect of a single parameter on the other parameters should be negligible. The constraints are defined individually on the shape lines of each section and mutually between the shape lines of two connecting sections. In order to validate and visualize the results of the shape modifiers, a real-time graphic interface is created.
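
A cubic Bezier shape line of the kind described is defined by four control points; a shape modifier can perturb the interior control points while the endpoints, and hence the junctions between sections, stay fixed. A minimal evaluation sketch (the control-point coordinates are illustrative, not hull data from the paper):

```python
# Bernstein-form evaluation of a cubic Bezier curve from its four control
# points. Endpoint interpolation (B(0) = P0, B(1) = P3) is what lets a shape
# modifier move P1 and P2 without breaking positional continuity at the
# joints between hull sections.

def cubic_bezier(p0, p1, p2, p3, t):
    """Point on the cubic Bezier at parameter t in [0, 1] (any dimension)."""
    s = 1.0 - t
    return tuple(
        s**3 * a + 3 * s**2 * t * b + 3 * s * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

# Illustrative 2-D shape line
P0, P1, P2, P3 = (0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)
print(cubic_bezier(P0, P1, P2, P3, 0.0))  # (0.0, 0.0) -> endpoint P0
print(cubic_bezier(P0, P1, P2, P3, 1.0))  # (4.0, 0.0) -> endpoint P3
print(cubic_bezier(P0, P1, P2, P3, 0.5))  # (2.0, 1.5) -> midpoint of this line
```

Tangential (G1) continuity between two such shape lines additionally requires the last control leg of one curve and the first control leg of the next to be collinear, which is one natural form for the mutual constraints between connecting sections.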

Keywords: design parameters, design constraints, shape modifiers, yacht hull

Procedia PDF Downloads 301