Search results for: gravitational search algorithm
2013 Development of Generalized Correlation for Liquid Thermal Conductivity of N-Alkane and Olefin
Authors: A. Ishag Mohamed, A. A. Rabah
Abstract:
The objective of this research is to develop a generalized correlation for the prediction of the thermal conductivity of n-alkanes and alkenes. Research on the thermal conductivity of liquids is scarce, and few correlations are available in the open literature. The available experimental data covering the groups of n-alkanes and alkenes were collected. The data were assumed to correlate with temperature via the Filippov correlation. Nonparametric regression with the Grace algorithm was used to develop the generalized correlation model. A spreadsheet program based on Microsoft Excel was used to plot the data and calculate the values of the coefficients. The results obtained were compared with the data found in Perry's Chemical Engineers' Handbook. The experimental data correlated with temperature over the range 273.15 to 673.15 K, with R² = 0.99. The developed correlation reproduced experimental data that were not included in the regression with an absolute average percent deviation (AAPD) of less than 7%. Thus the spreadsheet is quite accurate and produces reliable data.
Keywords: n-alkanes, n-alkenes, nonparametric, regression
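The abstract names the Filippov correlation and a spreadsheet regression without giving the functional form. A minimal sketch of the workflow, assuming a simple linear temperature dependence k = a + b·T as a stand-in for the actual Filippov form, and hypothetical conductivity data, shows how the reported R² and AAPD metrics would be computed:

```python
import numpy as np

# Hypothetical thermal-conductivity data (W/m.K) vs temperature (K);
# the real study used collected n-alkane/alkene measurements.
T = np.array([273.15, 323.15, 373.15, 423.15, 473.15])
k = np.array([0.145, 0.135, 0.125, 0.115, 0.105])

# Least-squares fit of a linear correlation k = a + b*T
# (a stand-in for the Filippov-type form used in the paper).
b, a = np.polyfit(T, k, 1)
k_pred = a + b * T

# Goodness of fit: coefficient of determination R^2
ss_res = np.sum((k - k_pred) ** 2)
ss_tot = np.sum((k - np.mean(k)) ** 2)
r2 = 1 - ss_res / ss_tot

# Absolute average percent deviation (AAPD), the paper's error metric
aapd = 100 * np.mean(np.abs((k - k_pred) / k))
print(r2, aapd)
```

On held-out data, AAPD computed the same way against the fitted coefficients gives the paper's reported validation measure.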
Procedia PDF Downloads 654
2012 Social Consequences of Male Migration on Women: An Evidence from Gujrat-Pakistan
Authors: Shahid Iqbal
Abstract:
It is observed that international migration has increased over time in countries all over the world, and Pakistan is no exception. It plays a pivotal role in the household economy and affects gender roles both positively and negatively in developing countries. A vast majority of males from Pakistan migrate to other countries in search of employment and income-generating activities, and their families left behind, particularly nuclear families, are subjected to different social problems. In this scenario, most of the responsibilities fall on the female partners, as they have to play the role of male as well as female for their children and household chores. Wives of some migrants feel loneliness, isolation, and a sense of insecurity. Keeping these realities in mind, this study aims to explore the social impact of husbands' absenteeism on the lives of the families left behind. In particular, wives' own experiences will be analyzed. This study will be carried out in the District Gujrat of Punjab, Pakistan. Since the study focuses on the social impact of male migration on families, all households that have at least one member abroad will be the potential respondents. Purposive sampling will be used to locate the respondents. Focus group discussions will be conducted as the tool for data collection. Women who have been taking care of their families in the absence of their husbands for the last three years will be approached. For the analysis, Interpretative Phenomenological Analysis (IPA) will be applied, and the researcher will explore how participants make sense of their personal experience and social world.
Keywords: social consequences, male migration, left behind, absenteeism, Pakistan
Procedia PDF Downloads 559
2011 Transcriptomic Analyses of Kappaphycus alvarezii under Different Wavelengths of Light
Authors: Vun Yee Thien, Kenneth Francis Rodrigues, Clemente Michael Vui Ling Wong, Wilson Thau Lym Yong
Abstract:
Transcriptomes associated with the process of photosynthesis have offered insights into the mechanism of gene regulation in terrestrial plants; however, limited information is available for macroalgae. This investigation aims to decipher the underlying mechanisms associated with photosynthesis in the red alga Kappaphycus alvarezii by performing a differential expression analysis on a de novo assembled transcriptome. The comparative analysis of gene expression was designed to examine how altered light qualities affect physiological mechanisms in the red alga. High-throughput paired-end RNA sequencing was applied to profile the transcriptome of K. alvarezii irradiated with different wavelengths of light (blue 455-492 nm, green 492-577 nm and red 622-780 nm) as compared to the full light spectrum; this resulted in more than 60 million reads per sample, which were assembled using Trinity and SOAPdenovo-Trans. The transcripts were annotated against the NCBI non-redundant (nr) protein, SwissProt, KEGG and COG databases with a cutoff E-value of 1e-5, and nearly 30% of transcripts were assigned functional annotations by BLAST searches. Differential expression analysis was performed using edgeR. The DEGs were designated to seven categories: BL (blue light) regulated, GL (green light) regulated, RL (red light) regulated, BL or GL regulated, BL or RL regulated, GL or RL regulated, and BL, GL or RL regulated. These DEGs were mapped to terms in the KEGG database and compared with the whole-transcriptome background to search for genes regulated by light quality. The outcomes of this study will enhance our understanding of the molecular mechanisms underlying light-induced responses in red algae.
Keywords: de novo transcriptome sequencing, differential gene expression, Kappaphycus alvarezii, red alga
Procedia PDF Downloads 509
2010 Preclinical Evidence of Pharmacological Effect from Medicinal Hemp
Authors: Muhammad nor Farhan Sa'At, Xin Y. Lim, Terence Y. C. Tan, Siti Hajar M. Rosli, Syazwani S. Ali, Ami F. Syed Mohamed
Abstract:
INTRODUCTION: Hemp (Cannabis sativa subsp. sativa), commonly used for industrial purposes, differs from marijuana by containing lower levels of delta-9-tetrahydrocannabinol, the principal psychoactive constituent of cannabis. Due to its non-psychoactive nature, there has been growing interest in hemp's therapeutic potential, which has been investigated through pre-clinical and clinical study modalities. OBJECTIVE: To provide an overview of the current landscape of hemp research through recent scientific findings specific to the pharmacological effects of the medicinal hemp plant and its derived compounds. METHODS: This review was conducted through a systematic search strategy according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) checklist on electronic databases including MEDLINE, OVID (OVFT, APC Journal Club, EBM Reviews), Cochrane Library Central and ClinicalTrials.gov. RESULTS: Of the 65 primary articles reviewed, 47 were pre-clinical studies related to medicinal hemp. Interestingly, the hemp derivatives showed several potential activities, such as anti-oxidative, anti-hypertensive, anti-inflammatory, anti-diabetic, anti-neuroinflammatory, anti-arthritic, anti-acne, and anti-microbial activities. Renal protective effects and estrogenic properties were also exhibited in vitro. CONCLUSION: Medicinal hemp possesses various pharmacological effects tested in vitro and in vivo. The information provided in this review could be used as a tool to strengthen the study design of future clinical trial research.
Keywords: preclinical, herbal medicine, hemp, cannabis
Procedia PDF Downloads 137
2009 Encephalon-An Implementation of a Handwritten Mathematical Expression Solver
Authors: Shreeyam, Ranjan Kumar Sah, Shivangi
Abstract:
Recognizing and solving handwritten mathematical expressions is a challenging task, particularly in segmenting and classifying individual characters. This project proposes a solution that uses a Convolutional Neural Network (CNN) and image processing techniques to accurately solve various types of equations, including arithmetic, quadratic, and trigonometric equations, as well as logical operations such as AND, OR, NOT, NAND, XOR, and NOR. The solution also provides a graphical output, allowing users to visualize equations and their solutions. In addition to equation solving, the platform, called CNNCalc, offers a comprehensive learning experience for students: it provides educational content, a quiz platform, and a coding platform for practicing programming skills in languages such as C, Python, and Java, and users can track their progress and work towards improving their skills. This all-in-one solution makes the learning process engaging and enjoyable. The proposed methodology includes horizontal compact projection analysis for segmentation and binarization, as well as connected component analysis and integrated connected component analysis for character classification. The compact projection algorithm compresses the horizontal projections to remove noise and obtain a clearer image, contributing to the accuracy of character segmentation. Experimental results demonstrate the accuracy and effectiveness of the proposed solution in solving a wide range of mathematical equations. With its comprehensive features, user-friendly interface, and accurate results, CNNCalc is poised to change the way students learn and solve mathematical equations.
Keywords: AI, ML, handwritten equation solver, mathematics, computer, CNNCalc, convolutional neural networks
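The projection analysis described above can be sketched in a few lines: in a binarized line image, columns with no ink pixels separate adjacent symbols. This is a simplified illustration of the idea, not the paper's compact projection algorithm itself:

```python
import numpy as np

def segment_by_projection(binary_img):
    """Split a binarized line image into character regions using the
    vertical projection profile: columns whose ink count is zero
    separate adjacent symbols."""
    profile = binary_img.sum(axis=0)          # ink pixels per column
    in_char, start, spans = False, 0, []
    for x, count in enumerate(profile):
        if count > 0 and not in_char:
            in_char, start = True, x          # character begins
        elif count == 0 and in_char:
            in_char = False
            spans.append((start, x))          # character ends
    if in_char:
        spans.append((start, len(profile)))
    return spans

# Toy image: two 3-column "characters" separated by a blank gap.
img = np.zeros((5, 10), dtype=int)
img[:, 1:4] = 1
img[:, 6:9] = 1
spans = segment_by_projection(img)
print(spans)  # [(1, 4), (6, 9)]
```

Each returned span can then be cropped and passed to the CNN classifier.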
Procedia PDF Downloads 124
2008 Nonlinear Model Predictive Control of Water Quality in Drinking Water Distribution Systems with DBPs Objectives
Authors: Mingyu Xie, Mietek Brdys
Abstract:
The paper develops a non-linear model predictive control (NMPC) scheme for water quality in drinking water distribution systems (DWDS), based on an advanced non-linear quality dynamics model that includes disinfection by-products (DBPs). Special attention is paid to analyzing the impact of the flow trajectories prescribed by the upper control level of the recently developed two-time-scale architecture for integrated quality and quantity control in DWDS. The new quality controller operates within this architecture in the fast time scale as the lower-level quality controller. The controller performance is validated by a comprehensive simulation study based on an example case-study DWDS.
Keywords: model predictive control, hierarchical control structure, genetic algorithm, water quality with DBPs objectives
Procedia PDF Downloads 318
2007 Control of a Stewart Platform for Minimizing Impact Energy in Simulating Spacecraft Docking Operations
Authors: Leonardo Herrera, Shield B. Lin, Stephen J. Montgomery-Smith, Ziraguen O. Williams
Abstract:
Three control algorithms, Proportional-Integral-Derivative, Linear-Quadratic-Gaussian, and Linear-Quadratic-Gaussian with shift, were applied in a computer simulation of a one-directional dynamic model of a Stewart platform. The goal was to compare the dynamic system responses under the three control algorithms and to minimize the impact energy when simulating spacecraft docking operations. Equations were derived for the control algorithms and for the input and output of the feedback control system. Using MATLAB, Simulink diagrams were created to represent the three control schemes, with a switch selector for conveniently changing among the controllers. The simulation demonstrated that the controller using the Linear-Quadratic-Gaussian-with-shift algorithm resulted in the lowest impact energy.
Keywords: controller, Stewart platform, docking operation, spacecraft
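The abstract does not give the model or weights, but the core of the LQG family of controllers it compares is a quadratic-cost state-feedback gain. A minimal sketch, assuming an illustrative one-directional double-integrator model (not the paper's actual Stewart platform dynamics), computes the discrete LQR gain by Riccati iteration and drives the state toward a soft contact:

```python
import numpy as np

# One-directional model (double integrator, discretized):
# state = [position, velocity]; all numbers are illustrative.
dt = 0.01
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
Q = np.diag([100.0, 1.0])   # penalize position error (impact) most
R = np.array([[0.1]])

# Solve the discrete algebraic Riccati equation by fixed-point iteration.
P = Q.copy()
for _ in range(5000):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

# Simulate the closed loop from an initial offset and approach velocity.
x = np.array([[1.0], [-0.5]])
for _ in range(2000):
    u = -K @ x
    x = A @ x + B @ u

print(np.linalg.norm(x))  # state driven near the origin
```

The full LQG controller would add a Kalman filter in front of this gain; comparing impact energy across PID, LQG, and LQG-with-shift then reduces to running the same simulation loop with each control law.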
Procedia PDF Downloads 53
2006 Portfolio Optimization with Reward-Risk Ratio Measure Based on the Mean Absolute Deviation
Authors: Wlodzimierz Ogryczak, Michal Przyluski, Tomasz Sliwinski
Abstract:
In problems of portfolio selection, the reward-risk ratio criterion is optimized to search for a risky portfolio offering the maximum increase of the mean return, in proportion to the risk measure increase, when compared to risk-free investments. In the classical Markowitz model, the risk is measured by the variance, thus representing the Sharpe ratio optimization and leading to quadratic optimization problems. Several Linear Programming (LP) computable risk measures have been introduced and applied in portfolio optimization; in particular, the Mean Absolute Deviation (MAD) measure has been widely recognized. The reward-risk ratio optimization with the MAD measure can be transformed into an LP formulation with the number of constraints proportional to the number of scenarios and the number of variables proportional to the total of the number of scenarios and the number of instruments. This may lead to LP models with a huge number of variables and constraints in the case of real-life financial decisions based on several thousand scenarios, thus decreasing their computational efficiency and making them hardly solvable by general LP tools. We show that the computational efficiency can then be dramatically improved by an alternative model based on the inverse risk-reward ratio minimization, taking advantage of LP duality. In the introduced LP model, the number of structural constraints is proportional to the number of instruments; the simplex method efficiency is therefore not seriously affected by the number of scenarios, which guarantees easy solvability. Moreover, we show that under a natural restriction on the target value, the MAD risk-reward ratio optimization is consistent with the second-order stochastic dominance rules.
Keywords: portfolio optimization, reward-risk ratio, mean absolute deviation, linear programming
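The transformation the abstract refers to can be illustrated with a standard Charnes-Cooper change of variables: maximizing (mean excess return)/MAD becomes minimizing MAD subject to the excess return being normalized to one. The sketch below uses hypothetical scenario returns and SciPy's `linprog`; it illustrates the scenario-based LP structure, not the authors' dual reformulation:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical scenario returns: T scenarios x n instruments.
rng = np.random.default_rng(0)
T, n = 200, 4
r = rng.normal([0.02, 0.015, 0.01, 0.012], 0.05, size=(T, n))
mu, r0 = r.mean(axis=0), 0.0           # scenario means, risk-free rate

# Charnes-Cooper transform of  max (mu'w - r0) / MAD(w):
# variables z = [x (n), t (1), d (T)], with w = x / t.
#   minimize (1/T) sum d
#   s.t. (mu - r0)'x = 1, sum x = t, d >= +/-(r - mu) x, z >= 0
c = np.concatenate([np.zeros(n + 1), np.ones(T) / T])
A_eq = np.zeros((2, n + 1 + T))
A_eq[0, :n] = mu - r0                  # normalized excess return
A_eq[1, :n] = 1.0
A_eq[1, n] = -1.0                      # sum x - t = 0
b_eq = np.array([1.0, 0.0])
D = r - mu                             # centered scenario matrix
A_ub = np.vstack([
    np.hstack([ D, np.zeros((T, 1)), -np.eye(T)]),   #  Dx - d <= 0
    np.hstack([-D, np.zeros((T, 1)), -np.eye(T)]),   # -Dx - d <= 0
])
b_ub = np.zeros(2 * T)
res = linprog(c, A_ub, b_ub, A_eq, b_eq,
              bounds=[(0, None)] * (n + 1 + T))
w = res.x[:n] / res.x[n]               # recover portfolio weights
print(w.round(3), w.sum())
```

Note the 2T inequality rows: this is exactly the scenario-proportional constraint count whose growth motivates the authors' alternative, instrument-proportional dual model.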
Procedia PDF Downloads 408
2005 A New Design Methodology for Partially Reconfigurable Systems-on-Chip
Authors: Roukaya Dalbouchi, Abdelkrin Zitouni
Abstract:
In this paper, we propose a novel design methodology for Dynamic Partial Reconfiguration (DPR) systems. This type of system has the property of being modifiable after its design and during its execution. The suggested design methodology is generic in terms of granularity, number of modules, and reconfigurable regions, and is suitable for any type of modern application. It is based on the interconnection of several design stages. The recommended methodology represents a guide for the design of DPR architectures that achieve a good reconfiguration/performance compromise. To validate the proposed methodology, we use a video watermarking application. The comparison results show that the proposed methodology supports all stages of DPR architecture design and is characterized by a high abstraction level. It provides a dynamically/partially reconfigurable architecture and guarantees hardware efficiency, flexibility of reconfiguration, and superior performance in terms of frequency and power consumption.
Keywords: dynamically reconfigurable system, block matching algorithm, partial reconfiguration, motion vectors, video watermarking
Procedia PDF Downloads 96
2004 Binarization and Recognition of Characters from Historical Degraded Documents
Authors: Bency Jacob, S.B. Waykar
Abstract:
Degradations in historical document images appear due to the aging of the documents. It is very difficult to understand and retrieve text from badly degraded documents, as there is large variation between the document foreground and background. Thresholding of such document images results either in broken characters or in the detection of false text. Numerous algorithms exist that can separate text and background efficiently in the textual regions of the document, but portions of the background are mistaken for text in areas that hardly contain any text. This paper presents a way to overcome these problems with a robust binarization technique that recovers the text from severely degraded document images and thereby increases the accuracy of optical character recognition systems. The proposed document recovery algorithm efficiently removes degradations from document images. We use the Otsu method, local thresholding, and global thresholding, and after binarization we train on and recognize the characters in the degraded documents.
Keywords: binarization, denoising, global thresholding, local thresholding, thresholding
Procedia PDF Downloads 345
2003 Comparative Methods for Speech Enhancement and the Effects on Text-Independent Speaker Identification Performance
Authors: R. Ajgou, S. Sbaa, S. Ghendir, A. Chemsa, A. Taleb-Ahmed
Abstract:
Speech enhancement algorithms aim to improve speech quality. In this paper, we review several speech enhancement methods and evaluate their performance based on Perceptual Evaluation of Speech Quality scores (PESQ, ITU-T P.862). All methods were evaluated in the presence of different kinds of noise using the TIMIT database and the NOIZEUS noisy speech corpus. The noise was taken from the AURORA database and includes suburban train, babble, car, exhibition hall, restaurant, street, airport and train station noise. Simulation results showed improved performance of the non-stationary noise tracking approach in comparison with the other methods in terms of the PESQ measure. Moreover, we evaluated the effects of the speech enhancement techniques on a speaker identification system based on an autoregressive (AR) model and Mel-frequency cepstral coefficients (MFCC).
Keywords: speech enhancement, PESQ, speaker recognition, MFCC
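As a concrete example of the class of methods reviewed, the classic spectral-subtraction baseline removes an estimated noise magnitude spectrum from each frame and keeps the noisy phase. This sketch uses a synthetic sine-in-noise signal and is a generic baseline, not one of the specific algorithms compared in the paper:

```python
import numpy as np

def spectral_subtraction(noisy, noise_est, frame=256, over=2.0, floor=0.02):
    """Basic magnitude spectral subtraction: subtract an over-estimated
    noise magnitude spectrum from each frame, keep the noisy phase."""
    out = np.zeros_like(noisy)
    noise_mag = np.abs(np.fft.rfft(noise_est[:frame]))
    for start in range(0, len(noisy) - frame + 1, frame):
        spec = np.fft.rfft(noisy[start:start + frame])
        mag = np.abs(spec) - over * noise_mag
        mag = np.maximum(mag, floor * np.abs(spec))   # spectral floor
        out[start:start + frame] = np.fft.irfft(
            mag * np.exp(1j * np.angle(spec)), frame)
    return out

# Synthetic test signal: a 440 Hz sine buried in white noise (8 kHz rate).
rng = np.random.default_rng(2)
n = 4096
clean = np.sin(2 * np.pi * 440 * np.arange(n) / 8000)
noise = 0.5 * rng.standard_normal(n)
enhanced = spectral_subtraction(clean + noise, noise)

def snr(ref, sig):
    return 10 * np.log10(np.sum(ref**2) / np.sum((sig - ref)**2))

print(snr(clean, clean + noise), snr(clean, enhanced))
```

In the paper's setup, the output of each enhancement method would instead be scored with PESQ against the clean reference and fed to the AR/MFCC speaker identification front end.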
Procedia PDF Downloads 424
2002 A Deep Learning Based Integrated Model For Spatial Flood Prediction
Authors: Vinayaka Gude Divya Sampath
Abstract:
This research introduces an integrated prediction model to assess the susceptibility of roads in a future flooding event. The model consists of a deep learning algorithm for forecasting gauge height data and the Flood Inundation Mapper (FIM) for spatial flooding. An optimal architecture for a Long Short-Term Memory (LSTM) network was identified for the gauge located on the Tangipahoa River at Robert, LA. Dropout was applied to the model to evaluate the uncertainty associated with the predictions. The estimates are then used along with FIM to identify the spatial flooding extent. Further geoprocessing in ArcGIS provides the susceptibility values for different roads. The model was validated against the devastating flood of August 2016. The paper discusses the challenges of generalizing the methodology to other locations and to various types of flooding. The developed model can be used by transportation departments and other emergency response organizations for effective disaster management.
Keywords: deep learning, disaster management, flood prediction, urban flooding
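The recurrence at the heart of the gauge-height forecaster can be written out explicitly. A pure-NumPy sketch of a single LSTM cell step, rolled over a toy stand-in series with random weights (the paper's trained architecture, layer sizes, and data are not reproduced here):

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One forward step of a standard LSTM cell. Gates are stacked
    row-wise in W, U, b as [input, forget, cell, output]."""
    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))
    z = W @ x + U @ h + b
    H = len(h)
    i = sigmoid(z[0:H])            # input gate
    f = sigmoid(z[H:2*H])          # forget gate
    g = np.tanh(z[2*H:3*H])        # candidate cell state
    o = sigmoid(z[3*H:4*H])        # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Roll the cell over a toy gauge-height sequence (random weights).
rng = np.random.default_rng(3)
n_in, n_hid = 1, 8
W = rng.normal(0, 0.3, (4 * n_hid, n_in))
U = rng.normal(0, 0.3, (4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
series = np.sin(np.linspace(0, 3, 24))   # stand-in gauge readings
for value in series:
    h, c = lstm_step(np.array([value]), h, c, W, U, b)
print(h.shape)
```

In the paper's setting, a linear head on the final hidden state `h` produces the gauge-height forecast, and repeating the forward pass with dropout active yields the uncertainty estimates mentioned in the abstract.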
Procedia PDF Downloads 149
2001 Comparison of Er:YAG Laser with Bur Prepared Cavities: A Systematic Review
Authors: Sarina Sahmeddini, Fahimeh Safarpour, Forough Pazhuheian
Abstract:
With the concepts of minimally invasive treatment and preventive dentistry gaining more and more recognition among dentists, many clinical trials have been published comparing the use of the erbium laser with traditional drilling for caries removal. However, the efficacy of the erbium laser is still controversial. The aim of this review is to compare the effects of tooth preparation by laser irradiation and conventional preparation by bur, to identify the best means of cavity preparation and reduction of recurrent caries. Randomized controlled trials, controlled clinical trials, and prospective and retrospective cohort studies were included in this review. The eligibility criteria included studies on human permanent teeth in which cavities were prepared in the cervical third and on proximal surfaces. Searches of PubMed, Google Scholar, and Scopus for Er:YAG laser and bur-prepared cavities were carried out. The studies' details were organized in four tables according to the groups: (1) microleakage; (2) morphological changes; (3) microhardness; and (4) bond strength. The initial search yielded 134 articles; 12 studies published from 2012 up to March 2020 were included in this review. According to the risk-of-bias evaluation, all studies were classified as high quality. Clinical implications: Er:YAG lasers with energy levels between 250 and 300 mJ can be proper alternatives to conventional burs, as minimally invasive instruments with no significant differences, or better results, in microleakage, microhardness, and bond strength compared with conventional burs. In conclusion, Er:YAG laser irradiation accompanied by phosphoric acid etching can reduce the chance of recurrent caries.
Keywords: lasers, drilling, caries, microleakage
Procedia PDF Downloads 134
2000 Utilizing AI Green Grader Scope to Promote Environmental Responsibility Among University Students
Authors: Tarek Taha Kandil
Abstract:
In higher education, the use of automated grading systems is on the rise, automating the assessment of students' work and providing practical feedback. Sustainable Grader Scope addresses the environmental impact of these computational tasks. The system uses an AI-powered algorithm and is designed to minimize grading-process emissions. It reduces carbon emissions through energy-efficient computing and carbon-conscious scheduling. Students submit their computational workloads to the system, which evaluates submissions using containers and a distributed infrastructure. A carbon-conscious scheduler manages workloads across global campuses, optimizing emissions using real-time carbon intensity data. This ensures the university stays within government-set emission limits while tracking and reducing its carbon footprint.
Keywords: sustainability, green graders, digital sustainable grader scope, environmental responsibility, higher education
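The carbon-conscious scheduling described above can be sketched as a greedy dispatcher: each grading job goes to the campus region with the lowest current carbon intensity that still has free capacity. The region names, intensities, and capacities below are illustrative assumptions, not values from the paper:

```python
def schedule_jobs(jobs, regions):
    """jobs: list of job ids; regions: dict name -> {"intensity": gCO2/kWh,
    "capacity": max concurrent jobs}. Returns a job -> region mapping."""
    load = {name: 0 for name in regions}
    placement = {}
    for job in jobs:
        # pick the least-carbon region that still has spare capacity
        candidates = [n for n in regions if load[n] < regions[n]["capacity"]]
        best = min(candidates, key=lambda n: regions[n]["intensity"])
        placement[job] = best
        load[best] += 1
    return placement

# Hypothetical campuses with real-time carbon intensities.
regions = {
    "campus-eu": {"intensity": 120, "capacity": 2},
    "campus-us": {"intensity": 380, "capacity": 3},
    "campus-asia": {"intensity": 520, "capacity": 3},
}
placement = schedule_jobs(["j1", "j2", "j3"], regions)
print(placement)  # j1, j2 -> campus-eu; j3 -> campus-us
```

A production scheduler would refresh the intensity figures from a live carbon-intensity feed and could additionally delay deferrable jobs to low-carbon hours.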
Procedia PDF Downloads 12
1999 Facets of an Upcoming Urban Industrial Hub: A Case Study of Gurgaon-Manesar
Authors: Raman Kumar Singh
Abstract:
Urbanization and economic growth are considered to be the most striking features of the past century. There is currently a radical demographic shift in progress worldwide, wherein people are moving from rural to urban areas at an increasing rate. The UN-Habitat report of 2005 indicates that by 2025, 61 per cent of the 5 billion world population will reside in urban areas, with about 85 per cent of the development process taking place in the urban hinterlands, widely referred to as the 'peri-urban', 'suburbs', 'urban fringe', 'city edge', 'metropolitan shadow', or 'urban sprawl'. In this context, the study is broadly concerned with understanding the development of the industrial hub in Gurgaon and its impact on the immediate neighbourhood. Studies have revealed that with increasing industrial development, the growth pattern changes rapidly: not only does the urban area grow, but the overall economy shifts from agrarian to non-agrarian, with a change in the occupational pattern of the people. This process is known as tertiarization, whereby tertiary activities increase relative to primary or secondary ones. The change in the occupational pattern creates a pull factor on the immediate neighbourhood, which triggers in-migration from rural areas as people come to the core urban area in search of better job opportunities and improved standards of living. But this gives way to the unplanned growth of the urban fringe and of the villages that accommodate the migrants, and in turn the pressure on socio-economic infrastructure increases. Therefore, it becomes increasingly necessary for government institutions and policy-level interventions to provide overall socio-economic growth alongside rapid industrial growth.
Keywords: policy intervention, urban morphology, urban industrial hub, livelihood transformation
Procedia PDF Downloads 377
1998 Energy Efficient Routing Protocol with Ad Hoc On-Demand Distance Vector for MANET
Authors: K. Thamizhmaran, Akshaya Devi Arivazhagan, M. Anitha
Abstract:
An important systematic issue that must be solved when implementing a data transmission algorithm in mobile ad hoc networks (MANETs) is how to save the energy of mobile nodes, which are battery-limited, while meeting the requirements of applications or users. While satisfying this energy-saving requirement, it is also necessary to achieve quality of service: in emergency work, data must be delivered on time. Achieving quality of service in MANETs is therefore equally important. To meet these requirements, we implement the proposed energy-aware routing protocol for mobile ad hoc networks, which saves energy at every node by efficiently selecting an energy-efficient path in the routing process by means of an enhanced AODV routing protocol.
Keywords: ad hoc networks, MANET, routing, AODV, EAODV
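The abstract does not specify the path-selection metric of the enhanced AODV. One common energy-aware choice, sketched here as an assumption, is a widest-path (max-min) selection: prefer the route whose weakest node has the most residual battery, so no single node is drained prematurely:

```python
import heapq

def max_bottleneck_route(graph, energy, src, dst):
    """Select the route whose weakest node has the most residual energy
    (a widest-path variant of energy-aware route selection).
    graph: node -> list of neighbors; energy: node -> residual battery.
    Returns (bottleneck_energy, path) or None if unreachable."""
    best = {src: energy[src]}
    heap = [(-energy[src], src, [src])]
    while heap:
        neg_bottleneck, node, path = heapq.heappop(heap)
        bottleneck = -neg_bottleneck
        if node == dst:
            return bottleneck, path
        for nxt in graph[node]:
            b = min(bottleneck, energy[nxt])
            if b > best.get(nxt, -1):
                best[nxt] = b
                heapq.heappush(heap, (-b, nxt, path + [nxt]))
    return None

# Toy topology: two routes S->D; the one through B avoids the
# nearly-drained node A. Energy values are assumed units.
graph = {"S": ["A", "B"], "A": ["D"], "B": ["D"], "D": []}
energy = {"S": 100, "A": 10, "B": 60, "D": 80}
result = max_bottleneck_route(graph, energy, "S", "D")
print(result)  # (60, ['S', 'B', 'D'])
```

In an AODV-style protocol, the same selection can be realized by carrying the minimum residual energy seen so far in the route-request packet and letting the destination answer the request with the best bottleneck value.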
Procedia PDF Downloads 372
1997 The Physically Handicapped in the City
Authors: Bekhemmas Youcef
Abstract:
The category of the disabled, like other social groups, is affected by a disability that reduces the fulfillment of their social roles or leads to their complete abandonment, and this cannot be understood without examining the psychological dimension and the behavioral patterns that express much of this complexity. Despite all that, this category has not yet received appropriate attention from specialized researchers, or even from officials. It is natural that people with disabilities have psychological and social requirements in order to regain their capabilities, or some of them; the environment in which they live also needs to be prepared so that they can integrate into society. Motor disability is one of the most common types of disability in the world, and it is constantly increasing, given the rise in its causes, such as traffic accidents. Motor disability often affects individuals psychologically, but it also affects their social surroundings, whether close or extended, and thus it shapes the limits and quality of their way of life, as well as assigning them roles as actors of a special kind within their societies. The methodology resembles the organizational framework for the production of any scientific knowledge, based on the premise that sociology is a project that aims to understand and interpret social reality scientifically. Given the nature of the subject studied, the reality of the disabled in the city, and in order to get closer to the daily life of the physically disabled within the urban centre, we adopted the qualitative approach, a choice that complies with the spirit of Weberian sociology, especially since Max Weber insists on the need to search for the meaning that the social actor gives to his behavior.
Through the results reached in this study, it was found that the city still suffers from several deficiencies at the level of equipment and urban planning needed to keep pace with the number of people with disabilities in the city.
Keywords: physically handicapped, the city
Procedia PDF Downloads 72
1996 Bipolar Impulse Noise Removal and Edge Preservation in Color Images and Video Using Improved Kuwahara Filter
Authors: Reji Thankachan, Varsha PS
Abstract:
Both image capturing devices and human visual systems are nonlinear; hence nonlinear filtering methods outperform their linear counterparts in many applications. Linear methods are unable to remove impulsive noise from images while preserving edges and fine details. In addition, linear algorithms are unable to remove signal-dependent or multiplicative noise from images. This paper presents an approach to denoise and smooth images and videos corrupted by bipolar impulse noise using an improved Kuwahara filter. It involves a two-stage algorithm: noise detection followed by filtering. Numerous simulations demonstrate that the proposed method outperforms the existing method by eliminating the painting-like flattening effect along the local feature direction while preserving edges, with improvements in PSNR and MSE.
Keywords: bipolar impulse noise, Kuwahara, PSNR, MSE, PDF
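For reference, the classic Kuwahara filter that the paper improves upon replaces each pixel with the mean of the least-variance corner window around it, which smooths flat regions without blurring edges. This is a baseline sketch of the standard filter, not the authors' improved bipolar-impulse variant:

```python
import numpy as np

def kuwahara(img, r=2):
    """Classic Kuwahara filter: for each pixel, examine the four
    (r+1)x(r+1) corner windows around it and output the mean of the
    window with the lowest variance, preserving edges."""
    h, w = img.shape
    pad = np.pad(img.astype(float), r, mode="edge")
    out = np.empty_like(img, dtype=float)
    for y in range(h):
        for x in range(w):
            py, px = y + r, x + r
            quads = [pad[py-r:py+1, px-r:px+1], pad[py-r:py+1, px:px+r+1],
                     pad[py:py+r+1, px-r:px+1], pad[py:py+r+1, px:px+r+1]]
            best = min(quads, key=np.var)     # least-variance window
            out[y, x] = best.mean()
    return out

# Step edge: left half 0, right half 100 -- the filter must not blur it.
img = np.zeros((8, 8))
img[:, 4:] = 100.0
result = kuwahara(img)
print(np.unique(result))  # [0., 100.] -- edge preserved
```

The improved variant adds the noise-detection stage first, so that impulse-corrupted pixels are excluded from the window statistics instead of dragging the quadrant means toward the extremes.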
Procedia PDF Downloads 500
1995 Decentralized Peak-Shaving Strategies for Integrated Domestic Batteries
Authors: Corentin Jankowiak, Aggelos Zacharopoulos, Caterina Brandoni
Abstract:
In a context of increasing stress put on the electricity network by the decarbonization of many sectors, energy storage is likely to be the key mitigating element, acting as a buffer between production and demand. In particular, the potential of storage is highest when it is connected close to the loads. Yet low-voltage storage struggles to penetrate the market at a large scale due to the novelty and complexity of the solution, and to the regulatory competitive advantage of fossil-fuel-based technologies. Strong and reliable numerical simulations are required to show the benefits of storage located near loads and to promote its development. The present study excludes aggregated control of storage: it is assumed that the storage units operate independently of one another without exchanging information, as is currently mostly the case. A computationally light battery model is presented in detail and validated by direct comparison with a domestic battery operating in real conditions. This model is then used to develop Peak-Shaving (PS) control strategies, as peak shaving is the decentralized service from which beneficial impacts are most likely to emerge. The aggregation of flatter, peak-shaved consumption profiles is likely to lead to flatter and arbitraged profiles at higher voltage layers. Furthermore, voltage fluctuations can be expected to decrease if spikes in individual consumption are reduced. The crucial part of achieving PS lies in the charging pattern: peaks depend on the switching on and off of appliances in the dwelling by the occupants and are therefore impossible to predict accurately. A performant PS strategy must therefore include a smart charge-recovery algorithm that ensures enough energy is present in the battery in case it is needed, without generating new peaks when charging the unit. Three categories of PS algorithms are introduced in detail.
First, algorithms using a constant threshold or power rate for charge recovery; then algorithms using the State of Charge (SOC) as a decision variable; and finally algorithms using a load forecast, the impact of whose accuracy is discussed, to generate PS. Performance metrics were defined in order to quantitatively evaluate the algorithms' operation with regard to peak reduction, total energy consumption, and self-consumption of domestic photovoltaic generation. The algorithms were tested on load profiles with a 1-minute granularity over a 1-year period, and their performance was assessed against these metrics. The results show that constant charging thresholds or powers are far from optimal: a single value is not likely to fit the variability of a residential profile. As could be expected, forecast-based algorithms show the highest performance; however, they depend on the accuracy of the forecast. On the other hand, SOC-based algorithms also present satisfying performance, making them a strong alternative when a reliable forecast is not available.
Keywords: decentralised control, domestic integrated batteries, electricity network performance, peak-shaving algorithm
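The SOC-based category above can be sketched as follows: the battery discharges whenever demand exceeds a threshold, and recharges at a rate scaled by how empty it is, so that charge recovery never creates a new peak. All numbers (capacity, threshold, rates) are illustrative assumptions, not the paper's values:

```python
def peak_shave(load, capacity=2.0, threshold=3.0, max_rate=1.5):
    """load: kW demand per time step. Returns the grid (net) profile
    seen upstream of the battery."""
    soc, grid = capacity / 2, []               # start half full (kWh)
    for demand in load:
        if demand > threshold:                 # shave the peak
            discharge = min(demand - threshold, max_rate, soc)
            soc -= discharge
            grid.append(demand - discharge)
        else:                                  # SOC-scaled charge recovery
            headroom = threshold - demand
            charge = min(headroom,             # never exceed the threshold
                         max_rate * (1 - soc / capacity),
                         capacity - soc)
            soc += charge
            grid.append(demand + charge)
    return grid

# Toy daily profile (kW) with two demand spikes above the threshold.
load = [1.0, 2.0, 4.0, 4.5, 2.0, 1.0, 3.5, 0.5]
grid = peak_shave(load)
print(max(load), max(grid))
```

Because the charge term shrinks as the SOC approaches capacity, recovery is gentle when the battery is nearly full and aggressive when it is empty, which is the behavior that lets SOC-based strategies approach forecast-based performance without any forecast.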
Procedia PDF Downloads 118
1994 Academic Leadership Succession Planning Practice in Nigeria Higher Education Institutions: A Case Study of Colleges of Education
Authors: Adie, Julius Undiukeye
Abstract:
This research investigated the practice of academic leadership succession planning in Nigerian higher education institutions, drawing on the lived experiences of the academic staff of the case study institutions. It is multi-case study research that adopts a qualitative research method. Ten participants (mainly academic staff) were used as the study sample. The study was guided by four research questions. Semi-structured interviews and archival information from official documents formed the sources of data. The data collected were analyzed using the Constant Comparative Technique (CCT) to generate empirical insights and facts on the subject of this paper. The following findings emerged from the data analysis: firstly, there was no formalized leadership succession plan in place in the institutions sampled for this study; secondly, despite the absence of a formal succession plan, the data indicate that academics believe succession planning is very significant for institutional survival; thirdly, existing practices of succession planning in the sampled institutions take the form of job seniority ranking, political processes and executive fiat, ad-hoc arrangements, and external hiring; and finally, the data revealed some barriers to the practice of succession planning, such as traditional higher education institutions’ characteristics (e.g. external talent search, shared governance, diversity, and equality in leadership appointment) and a lack of interest in leadership positions. Based on the research findings, some far-reaching recommendations were made, including the urgent need for the ‘formalization’ of leadership succession planning by the higher education institutions concerned, through the design of an official policy framework. Keywords: academic leadership, succession, planning, higher education
Procedia PDF Downloads 147
1993 Computer Aided Diagnosis Bringing Changes in Breast Cancer Detection
Authors: Devadrita Dey Sarkar
Abstract:
Regardless of the many technologic advances in the past decade, increased training and experience, and the obvious benefits of uniform standards, the false-negative rate in screening mammography remains unacceptably high. A computer-aided neural network classification of regions of suspicion (ROS) on digitized mammograms is presented in this abstract, employing features extracted by a new technique based on independent component analysis. CAD is a concept established by taking into account equally the roles of physicians and computers, whereas automated computer diagnosis is a concept based on computer algorithms only. With CAD, the performance of computers does not have to be comparable to or better than that of physicians, but needs to be complementary to it. In fact, a large number of CAD systems have been employed to assist physicians in the early detection of breast cancers on mammograms. A CAD scheme that makes use of lateral breast images has the potential to improve the overall performance in the detection of breast lumps. Because breast lumps can be detected reliably by computer on lateral breast mammograms, radiologists’ accuracy in the detection of breast lumps would be improved by the use of CAD, and thus early diagnosis of breast cancer would become possible. In the future, many CAD schemes could be assembled as packages and implemented as a part of PACS. For example, the package for breast CAD may include the computerized detection of breast nodules, as well as the computerized classification of benign and malignant nodules. In order to assist in the differential diagnosis, it would be possible to search for and retrieve images (or lesions) with these CAD systems, which would be a reliable and useful method for quantifying the similarity of a pair of images for visual comparison by radiologists. Keywords: CAD (computer-aided diagnosis), lesions, neural network, ROS (region of suspicion)
Procedia PDF Downloads 456
1992 Inferring the Ecological Quality of Seagrass Beds from Using Composition and Configuration Indices
Authors: Fabrice Houngnandan, Celia Fery, Thomas Bockel, Julie Deter
Abstract:
Making water cleaner and halting global biodiversity loss require indices to measure changes and evaluate the achievement of objectives. The endemic and protected seagrass species Posidonia oceanica is a biological indicator used to monitor the ecological quality of marine Mediterranean waters. One ecosystem index (EBQI), two biotic indices (PREI, BiPo), and several landscape indices, which measure the composition and configuration of P. oceanica seagrass at the population scale, have been developed. While the former are measured at monitoring sites, the landscape indices can be calculated for the entire seabed covered by this ecosystem. The present work investigates the link between these indices and the best scale to be used in order to maximize this link. We used data collected between 2014 and 2019 along the French Mediterranean coastline to calculate EBQI, PREI, and BiPo at 100 sites. From the P. oceanica seagrass distribution map, configuration and composition indices around these sites were determined for six different grid sizes (100 m x 100 m to 1000 m x 1000 m). Correlation analyses were first used to identify the grid size presenting the strongest and most significant link between the different types of indices. Finally, several models were compared on the basis of various metrics to identify the one that best explains the nature of the link between these indices. Our results showed a strong and significant link between the biotic indices, with the best correlations between biotic and landscape indices obtained within 600 m x 600 m grid cells. These results show that landscape indices can be used to monitor the health of seagrass beds at a large scale. Keywords: ecological indicators, decline, conservation, submerged aquatic vegetation
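The grid-size selection step can be sketched as follows: compute the Pearson correlation between a biotic index and a landscape index for each candidate grid size, and keep the size with the strongest absolute correlation. Function names and data layout are illustrative assumptions, not the study's code:

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def best_grid_size(biotic, landscape_by_grid):
    """biotic: a biotic index per site (e.g. PREI values).
    landscape_by_grid: {grid size in m: landscape index per site}.
    Returns the grid size maximizing |r|, plus all correlations."""
    scored = {g: pearson(biotic, vals)
              for g, vals in landscape_by_grid.items()}
    return max(scored, key=lambda g: abs(scored[g])), scored
```

On toy data where the 600 m layer tracks the biotic index exactly, that layer is selected with r = 1.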
Procedia PDF Downloads 134
1991 Coarse-Graining in Micromagnetic Simulations of Magnetic Hyperthermia
Authors: Razyeh Behbahani, Martin L. Plumer, Ivan Saika-Voivod
Abstract:
Micromagnetic simulations based on the stochastic Landau-Lifshitz-Gilbert equation are used to calculate dynamic magnetic hysteresis loops relevant to magnetic hyperthermia applications. With the goal of efficiently simulating room-temperature loops for large iron-oxide based systems at relatively slow sweep rates, on the order of 1 Oe/ns or less, a coarse-graining scheme is proposed and tested. The scheme is derived from a previously developed renormalization-group approach. Loops associated with nanorods, used as building blocks for larger nanoparticles that were employed in preclinical trials (Dennis et al., 2009 Nanotechnology 20 395103), serve as the model test system. The scaling algorithm is shown to produce nearly identical loops over several decades in the model grain sizes. Sweep-rate scaling involving the damping constant alpha is also demonstrated. Keywords: coarse-graining, hyperthermia, hysteresis loops, micromagnetic simulations
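For orientation, a deterministic macrospin sketch of the Landau-Lifshitz-Gilbert dynamics underlying such simulations; the stochastic thermal field, anisotropy and exchange terms, the field sweep, and the coarse-graining scheme itself are all omitted, and units and parameter values are illustrative:

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors."""
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def llg_relax(m, h, alpha=0.1, gamma=1.0, dt=0.01, steps=5000):
    """Euler integration of
    dm/dt = -gamma/(1+alpha^2) [m x h + alpha m x (m x h)]
    for a unit macrospin m in a constant effective field h."""
    pref = -gamma / (1.0 + alpha ** 2)
    for _ in range(steps):
        mxh = cross(m, h)
        mxmxh = cross(m, mxh)
        m = [m[i] + dt * pref * (mxh[i] + alpha * mxmxh[i])
             for i in range(3)]
        norm = math.sqrt(sum(c * c for c in m))
        m = [c / norm for c in m]   # keep |m| = 1
    return m
```

Starting perpendicular to the field, the damped precession gradually aligns the moment with the field direction.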
Procedia PDF Downloads 149
1990 The Effects of North Sea Caspian Pattern Index on the Temperature and Precipitation Regime in the Aegean Region of Turkey
Authors: Cenk Sezen, Turgay Partal
Abstract:
The North Sea Caspian Pattern Index (NCPI) refers to an atmospheric teleconnection between the North Sea and the North Caspian at the 500 hPa geopotential height level. The aim of this study is to search for effects of the NCP on annual and seasonal mean temperature, and on annual and seasonal precipitation totals, in the Aegean region of Turkey. The study uses 46 years of data obtained from nine meteorological stations. To determine the relationship between the NCPI and the climatic parameters, the Pearson correlation coefficient method was first utilized. According to the results of the analysis, most of the stations in the region have a high negative correlation with the NCPI in all seasons, especially in winter, in terms of annual and seasonal mean temperature (statistically significant at the 90% level). Besides, high negative correlation values between the NCPI and precipitation totals are observed during the winter season at most of the stations. Furthermore, the NCPI values were divided into two groups, NCPI(-) and NCPI(+), and mean temperature and precipitation total values grouped by the NCPI(-) and NCPI(+) phases were determined annually and seasonally. During NCPI(-), higher mean temperature values are observed in all seasons, particularly in winter, compared to the mean temperature values under the effect of NCPI(+). Similarly, during NCPI(-) the winter precipitation totals are higher than those under the effect of NCPI(+); however, no substantial changes were observed between the precipitation totals in the other seasons. As a result of this study, significant proof is obtained with regard to the influences of the NCP on the temperature and precipitation regime in the Aegean region of Turkey. Keywords: Aegean region, NCPI, precipitation, temperature
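The compositing step can be sketched as follows: observations are split by the sign of the NCPI, and the phase means are compared. The function name and the toy numbers below are illustrative assumptions, not the study's data:

```python
def composite_means(ncpi, values):
    """Split a climate series (e.g. winter mean temperatures) by NCPI
    phase and return the mean over NCPI(-) and over NCPI(+) cases."""
    neg = [v for i, v in zip(ncpi, values) if i < 0]
    pos = [v for i, v in zip(ncpi, values) if i >= 0]
    return sum(neg) / len(neg), sum(pos) / len(pos)
```

For example, with toy winter temperatures of 15.0 and 16.0 in NCPI(-) years and 10.0 and 9.0 in NCPI(+) years, the NCPI(-) composite mean comes out higher, mirroring the pattern reported above.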
Procedia PDF Downloads 284
1989 Computational Team Dynamics and Interaction Patterns in New Product Development Teams
Authors: Shankaran Sitarama
Abstract:
New Product Development (NPD) is invariably a team effort and involves effective teamwork. An NPD team has members from different disciplines coming together and working through the different phases, all the way from the conceptual design phase to production and product roll-out. Creativity and innovation are some of the key factors of successful NPD. Team members going through the different phases of NPD interact and work closely, yet challenge each other during the design phases to brainstorm ideas and later converge to work together. These two traits require the teams to have divergent and convergent thinking simultaneously, and there needs to be a good balance. The team dynamics invariably result in conflicts among team members. While some amount of conflict (ideational conflict) is desirable in NPD teams for the group to be creative, relational conflicts (discords among members) can be detrimental to teamwork. Team communication truly reflects these tensions and team dynamics. In this research, the email communication between members of NPD teams is considered for analysis. The emails are processed through latent semantic analysis (LSA) to analyze the content of communication, and through a semantic similarity analysis to arrive at a social network graph that depicts the communication amongst team members based on the content of communication. The amount of communication (content, not frequency of communication) defines the interaction strength between the members. A social network adjacency matrix is thus obtained for the team. Standard social network analysis techniques based on the Adjacency Matrix (AM) and the Dichotomized Adjacency Matrix (DAM), based on network density, yield network graphs and network metrics like centrality.
The social network graphs are then rendered for visual representation using a Metric Multi-Dimensional Scaling (MMDS) algorithm for node placement, and arcs connecting the nodes (representing team members) are drawn. The distance between nodes in the placement represents the tie strength between the members: stronger tie strengths render nodes closer. The overall visual representation of the social network graph provides a clear picture of the team’s interactions. This research reveals four distinct patterns of team interaction that are clearly identifiable in the visual representation of the social network graph and have a clearly defined computational scheme. The four computational patterns of team interaction defined are the Central Member Pattern (CMP), Subgroup and Aloof member Pattern (SAP), Isolate Member Pattern (IMP), and Pendant Member Pattern (PMP). Each of these patterns has a team dynamics implication in terms of the conflict level in the team. For instance, the isolate member pattern clearly points to a near breakdown in communication with the member, and hence a possibly high conflict level, whereas the subgroup or aloof member pattern points to a non-uniform information flow in the team and a moderate level of conflict. These pattern classifications of teams are then compared and correlated to the real level of conflict in the teams, as indicated by the team members through an elaborate self-evaluation, team reflection, and feedback form, and the results show a good correlation. Keywords: team dynamics, team communication, team interactions, social network analysis (SNA), new product development, latent semantic analysis (LSA), NPD teams
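A toy sketch of the pipeline from email content to adjacency matrix and pattern detection. Plain bag-of-words cosine similarity stands in for the full LSA step, and the mean off-diagonal value stands in for the density-based dichotomization cut; all names and the data layout are illustrative assumptions:

```python
import math
from collections import Counter

def cosine(a, b):
    """Content similarity between two texts (bag-of-words cosine)."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def team_network(docs):
    """docs: each member's aggregated email text. Returns the adjacency
    matrix (AM), its dichotomized form (DAM), and isolate members."""
    n = len(docs)
    am = [[cosine(docs[i], docs[j]) if i != j else 0.0
           for j in range(n)] for i in range(n)]
    off = [am[i][j] for i in range(n) for j in range(n) if i != j]
    cut = sum(off) / len(off)               # density-style cut value
    dam = [[1 if am[i][j] >= cut else 0 for j in range(n)]
           for i in range(n)]
    isolates = [i for i in range(n) if sum(dam[i]) == 0]  # IMP candidates
    return am, dam, isolates
```

A member whose mail shares no content with the rest of the team surfaces as an isolate, i.e. an Isolate Member Pattern candidate.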
Procedia PDF Downloads 71
1988 Ankle Fracture Management: A Unique Cross Departmental Quality Improvement Project
Authors: Langhit Kurar, Loren Charles
Abstract:
Introduction: In light of the recently published BOAST 12 (August 2016) guidance on the management of ankle fractures, the project aimed to highlight key discrepancies throughout the care trajectory, from admission to the point of discharge, at a district general hospital. A wide breadth of data covering three key domains (accident and emergency, radiology, and orthopaedic surgery) was subsequently stratified, and recommendations on note documentation and outpatient follow-up were made. Methods: A retrospective twelve-month audit was conducted reviewing the results of ankle fracture management in 37 patients. The inclusion criterion was all patients seen at the Darent Valley Hospital (DVH) emergency department with radiographic evidence of an ankle fracture. The exclusion criterion was all patients managed solely by nursing staff or having sustained a purely ligamentous injury. Medical notes, including discharge summaries, and the PACS online radiographic tool were used for data extraction. Results: Cross-examination of the A & E domain revealed limited awareness of the recent BOAST 12 publication, including the requirements to document skin integrity and neurovascular assessment. This had direct implications, as it would have changed the surgical plan for acutely compromised patients. The majority of results obtained from the radiographic domain were satisfactory, with appropriate X-rays taken in over 95% of cases. However, due to time pressures within A & E, patients were often left in a backslab without a post-manipulation X-ray. Poorly reduced fractures were subsequently left for a long period, resulting in swollen ankles and a delay to surgical intervention. This had knock-on implications for prolonged inpatient stay, resulting in hospital-acquired co-morbidity, including pressure sores. Discussion: The audit has highlighted several areas for improvement throughout the care trajectory, from review in the emergency department to follow-up as an outpatient.
This has prompted the creation of an algorithm to ensure patients with significant fractures presenting to the emergency department are seen promptly and treatment is expedited as per the recent guidance. This includes the timing of X-rays taken in A & E. Re-audit has shown significant improvement in both documentation at the time of presentation and appropriate follow-up strategies. Within the orthopaedic domain, we are in the process of creating an ankle fracture pathway to ensure imaging and weight-bearing status are made clear to the consulting clinicians in an outpatient setting. Significance/Clinical Relevance: As a result of the ankle fracture algorithm, we have adapted the BOAST 12 guidance to shape an intrinsic pathway that not only improves patient management within the emergency department but also creates a standardised format for follow-up. Keywords: ankle, fracture, BOAST, radiology
Procedia PDF Downloads 180
1987 Collagen Scaffold Incorporated with Macrotyloma uniflorum Plant Extracts as a Burn/Wound Dressing Material, in Vitro and in Vivo Evaluation
Authors: Thangavelu Muthukumar, Thotapalli Parvathaleswara Sastry
Abstract:
Collagen is the most abundantly available connective tissue protein and is used as a biomaterial for various biomedical applications. Presently, fish wastes are disposed of improperly, causing serious environmental pollution and resulting in offensive odour. Fish scales are a promising source of Type I collagen. Medicinal plants have been used since time immemorial for the treatment of various skin ailments and dermatological disorders, especially cuts, wounds, and burns. Developing biomaterials with wound-healing properties from natural sources within the reach of the common man is the need of the hour, particularly in developing and third-world countries. With these objectives in view, we have developed a wound dressing material containing fish scale collagen (FSC) incorporated with Macrotyloma uniflorum plant extract (PE). The wound dressing composite was characterized for its physicochemical properties using conventional methods. SEM images revealed that the composite has a fibrous and porous surface, which helps in the transportation of oxygen as well as in absorbing wound fluids. The biomaterial has shown 95% biocompatibility, has the required mechanical strength, and has exhibited antimicrobial properties. This biomaterial was used as a wound dressing material on experimental wounds in rats. The healing pattern was evaluated by macroscopic observations, planimetric studies, and biochemical and histopathological observations. The results showed a faster healing pattern in the wounds treated with CSPE compared to the other composites used in this study and the untreated control. These experiments clearly suggest that CSPE can be used as a wound/burn dressing material. Keywords: collagen, wound dressing, Macrotyloma uniflorum, burn dressing
Procedia PDF Downloads 420
1986 Using Artificial Intelligence Technology to Build the User-Oriented Platform for Integrated Archival Service
Authors: Lai Wenfang
Abstract:
This study describes how artificial intelligence (AI) technology can be used to build a user-oriented platform for integrated archival services. The platform will be launched in 2020 by the National Archives Administration (NAA) in Taiwan. With the progression of information and communication technology (ICT), the NAA has built many systems to provide archival services. In order to cope with new challenges, such as new ICT, artificial intelligence, or blockchain, the NAA will use natural language processing (NLP) and machine learning (ML) techniques to build a training model and propose suggestions based on the data sent to the platform. The NAA expects that the platform will not only automatically inform the sending agencies’ staff which records catalogues violate the transfer or destruction rules, but will also use the model to find details hidden in the catalogues and suggest to NAA staff whether the records should be kept or not, shortening the auditing time. The platform keeps all the users’ browse trails, so that it can predict which kinds of archives a user could be interested in, recommend search terms through visualization, and inform users of newly arrived archives. In addition, according to the Archives Act, NAA staff must spend a lot of time marking or removing personal data, classified data, etc. before archives are provided. To upgrade the archives access service process, the platform will use text recognition patterns to black out such data automatically; the staff only need to correct the errors and upload the corrected version, and as the platform learns, its accuracy will become higher. In short, the purpose of the platform is to advance government digital transformation and implement the vision of a service-oriented smart government. Keywords: artificial intelligence, natural language processing, machine learning, visualization
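As an illustration of the automatic black-out step, a minimal regex sketch; the patterns below are assumed token formats for demonstration only, not the recognition models the NAA would actually train:

```python
import re

# Assumed personal-data formats, for illustration only; a production
# system would rely on trained NLP/recognition models, not fixed regexes.
PATTERNS = [
    re.compile(r"\b[A-Z][12]\d{8}\b"),   # national-ID-like token (assumed)
    re.compile(r"\b09\d{2}-?\d{6}\b"),   # mobile-number-like token (assumed)
]

def black_out(text, mask="X"):
    """Replace each match with a same-length run of the mask character,
    so staff can still spot and correct redaction errors before release."""
    for pat in PATTERNS:
        text = pat.sub(lambda m: mask * len(m.group()), text)
    return text
```

Same-length masking preserves the document layout, which makes the staff review-and-correct loop described above easier.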
Procedia PDF Downloads 177
1985 Identification of Spam Keywords Using Hierarchical Category in C2C E-Commerce
Authors: Shao Bo Cheng, Yong-Jin Han, Se Young Park, Seong-Bae Park
Abstract:
Consumer-to-Consumer (C2C) e-commerce has been growing at a very high speed in recent years. Since identical or nearly identical kinds of products compete with one another through keyword search in C2C e-commerce, some sellers describe their products with spam keywords that are popular but not related to their products. Though such products get more chances to be retrieved and selected by consumers than those without spam keywords, the spam keywords mislead consumers and waste their time. This problem has been reported in many commercial services like eBay and Taobao, but there has been little research on solving it. As a solution to this problem, this paper proposes a method to classify whether the keywords of a product are spam or not. The proposed method assumes that a keyword for a given product is more reliable if the keyword is commonly observed in the specifications of products that are the same as, or of the same kind as, the given product. This is because the hierarchical category of a product is, in general, determined precisely by the seller of the product, and so is the specification of the product. Since higher layers of the hierarchical category represent more general kinds of products, a reliability degree is determined separately for each layer. Hence, reliability degrees from the different layers of a hierarchical category become features for keywords, and they are used together with features derived from specifications alone for the classification of the keywords. Support Vector Machines are adopted as the basic classifier using these features, since they are powerful and widely used in many classification tasks. In the experiments, the proposed method is evaluated on a gold-standard dataset from Yi-han-wang, a Chinese C2C e-commerce site, and is compared with a baseline method that does not consider the hierarchical category.
The experimental results show that the proposed method outperforms the baseline in F1-measure, which demonstrates that spam keywords are effectively identified using the hierarchical category in C2C e-commerce. Keywords: spam keyword, e-commerce, keyword features, spam filtering
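The layer-wise reliability degree described above can be sketched as follows; the data layout, names, and toy catalog are illustrative assumptions, not the paper's implementation:

```python
def reliability_features(keyword, category_path, catalog):
    """category_path: hierarchical category of the product, ordered from
    general to specific, e.g. ('electronics', 'phones', 'smartphones').
    catalog: (category_path, specification text) pairs for all products.
    Returns one reliability degree per layer: the fraction of products
    sharing the category prefix whose specification contains the keyword."""
    feats = []
    for depth in range(1, len(category_path) + 1):
        prefix = category_path[:depth]
        same = [spec for path, spec in catalog if path[:depth] == prefix]
        hits = sum(keyword in spec.lower().split() for spec in same)
        feats.append(hits / len(same) if same else 0.0)
    return feats
```

These per-layer degrees would then be fed, together with features from the specifications alone, into the SVM classifier; deeper (more specific) layers give sharper evidence about whether a keyword genuinely belongs to the product's kind.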
Procedia PDF Downloads 294
1984 Adaptive Dehazing Using Fusion Strategy
Authors: M. Ramesh Kanthan, S. Naga Nandini Sujatha
Abstract:
The goal of haze removal algorithms is to enhance and recover details of a scene from a foggy image. For enhancement, the proposed method focuses on two main categories: (i) image enhancement based on adaptive contrast histogram equalization, and (ii) an image edge-strengthened gradient model. In many circumstances, accurate haze removal algorithms are needed. The de-fog feature works through a complex algorithm that first determines the fog density of the scene and then analyses the obscured image before applying contrast and sharpness adjustments to the video in real time to produce the image. The fusion strategy is driven by the intrinsic properties of the original image and is highly dependent on the choice of the inputs and the weights. The haze-free output image is then reconstructed using the fusion methodology. In order to increase accuracy, an interpolation method is used in the output reconstruction. A promising retrieval performance is achieved, especially in particular examples. Keywords: single image, fusion, dehazing, multi-scale fusion, per-pixel, weight map
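The per-pixel weighted fusion at the core of the strategy can be sketched as follows, with grayscale images as nested lists; the derived inputs and weight maps are placeholders for the contrast-enhanced and edge-strengthened versions described above:

```python
def fuse(inputs, weights):
    """Per-pixel weighted fusion of derived inputs (e.g. a contrast-enhanced
    and an edge-strengthened version of the hazy image). Weight maps are
    normalized pixel-wise so that they sum to one before blending."""
    h, w = len(inputs[0]), len(inputs[0][0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = sum(wm[y][x] for wm in weights) or 1.0
            out[y][x] = sum(img[y][x] * wm[y][x] / total
                            for img, wm in zip(inputs, weights))
    return out
```

Each output pixel is a convex combination of the input pixels, so the result stays within the inputs' intensity range; multi-scale variants apply the same blend per pyramid level.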
Procedia PDF Downloads 465