Search results for: Scale-Invariant Feature Transform (SIFT)
2440 Modeling, Topology Optimization and Experimental Validation of Glass-Transition-Based 4D-Printed Polymeric Structures
Authors: Sara A. Pakvis, Giulia Scalet, Stefania Marconi, Ferdinando Auricchio, Matthijs Langelaar
Abstract:
In recent developments in the field of multi-material additive manufacturing, differences in material properties are exploited to create printed shape-memory structures, which are referred to as 4D-printed structures. New printing techniques allow for the deliberate introduction of prestresses in the specimen during manufacturing, and, in combination with the right design, this enables new functionalities. This research focuses on bi-polymer 4D-printed structures, where the transformation process is based on a heat-induced glass transition in one material lowering its Young’s modulus, combined with an initial prestress in the other material. Upon the decrease in stiffness, the prestress is released, which results in the realization of an essentially pre-programmed deformation. As the design of such functional multi-material structures is crucial but far from trivial, a systematic methodology to find the design of 4D-printed structures is developed, where a finite element model is combined with a density-based topology optimization method to describe the material layout. This modeling approach is verified by a convergence analysis and validated by comparing its numerical results to analytical and published data. Specific aspects that are addressed include the interplay between the definition of the prestress and the material interpolation function used in the density-based topology description, the inclusion of a temperature-dependent stiffness relationship to simulate the glass transition effect, and the importance of the consideration of geometric nonlinearity in the finite element modeling. The efficacy of topology optimization to design 4D-printed structures is explored by applying the methodology to a variety of design problems, both in 2D and 3D settings. Bi-layer designs composed of thermoplastic polymers are printed by means of the fused deposition modeling (FDM) technology. Acrylonitrile butadiene styrene (ABS) polymer undergoes the glass transition transformation, while thermoplastic polyurethane (TPU) is prestressed by means of the 3D-printing process itself. Tests inducing shape transformation in the printed samples through heating are performed to calibrate the prestress and validate the modeling approach by comparing the numerical results to the experimental findings. Using the experimentally obtained prestress values, more complex designs have been generated through topology optimization, and samples have been printed and tested to evaluate their performance. This study demonstrates that by combining topology optimization and 4D-printing concepts, stimuli-responsive structures with specific properties can be designed and realized.
Keywords: 4D-printing, glass transition, shape memory polymer, topology optimization
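As an illustration of the interplay between a density-based material interpolation and a temperature-dependent stiffness, here is a minimal Python sketch; the SIMP-style power-law interpolation is standard in density-based topology optimization, but the sigmoid glass-transition law, the moduli, and the Tg value are illustrative assumptions, not the authors' calibrated model.

```python
import numpy as np

def young_modulus(rho, T, E1=2.0e9, E2_glassy=1.5e9, E2_rubbery=5.0e6,
                  Tg=105.0, k=0.5, p=3.0, E_min=1.0e3):
    """SIMP-style two-material interpolation with a glass transition.

    rho: density design variable in [0, 1] (0 -> material 1, 1 -> material 2)
    T:   temperature in deg C; material 2 softens across its transition Tg
    (all parameter values are illustrative, not calibrated)
    """
    # Smooth drop of material 2's stiffness across the glass transition
    E2 = E2_rubbery + (E2_glassy - E2_rubbery) / (1.0 + np.exp(k * (T - Tg)))
    # Power-law (SIMP) penalization discourages intermediate densities
    return E_min + (1.0 - rho**p) * E1 + rho**p * E2

# Element stiffness of a mixed-density element below and above the transition
print(young_modulus(0.5, 25.0))    # glassy: both phases stiff
print(young_modulus(0.5, 130.0))   # rubbery: phase 2 has softened
```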
Procedia PDF Downloads 208
2439 Object-Based Image Analysis for Gully-Affected Area Detection in the Hilly Loess Plateau Region of China Using Unmanned Aerial Vehicle
Authors: Hu Ding, Kai Liu, Guoan Tang
Abstract:
The Chinese Loess Plateau suffers from serious gully erosion induced by natural and human causes. Gully feature detection, including the gully-affected area and its two-dimensional parameters (length, width, area, etc.), is a significant task not only for researchers but also for policy-makers. This study aims at gully-affected area detection in three catchments of the Chinese Loess Plateau, selected in Changwu, Ansai, and Suide, by using an unmanned aerial vehicle (UAV). The methodology includes a sequence of UAV data generation, image segmentation, feature calculation and selection, and random forest classification. Two experiments were conducted to investigate the influences of segmentation strategy and feature selection. Results showed that vertical and horizontal root-mean-square errors were below 0.5 and 0.2 m, respectively, which were ideal for the Loess Plateau region. The segmentation strategy adopted in this paper, which considers the topographic information, together with the optimal parameter combination, can improve the segmentation results. Besides, the overall extraction accuracies achieved in Changwu, Ansai, and Suide were 84.62%, 86.46%, and 93.06%, respectively, which indicated that the proposed method for detecting the gully-affected area is more objective and effective than traditional methods. This study demonstrated that UAV can bridge the gap between field measurement and satellite-based remote sensing, obtaining a balance in resolution and efficiency for catchment-scale gully erosion research.
Keywords: unmanned aerial vehicle (UAV), object-based image analysis, gully erosion, gully-affected area, Loess Plateau, random forest
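A minimal sketch of the classification step follows, using scikit-learn's random forest on a hypothetical per-segment feature table; the feature set, sizes, and labels are placeholders standing in for the real UAV-derived segments.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical per-segment feature table: spectral means, texture measures,
# and topographic attributes computed from the UAV orthomosaic and DSM
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))        # 1000 image segments x 8 features
y = rng.integers(0, 2, size=1000)     # 1 = gully-affected, 0 = unaffected

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print("overall accuracy:", accuracy_score(y_te, clf.predict(X_te)))
print("feature importances:", clf.feature_importances_.round(3))
```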
Procedia PDF Downloads 218
2438 Economic Expansion and Land Use Change in Thailand: An Environmental Impact Analysis Using Computable General Equilibrium Model
Authors: Supakij Saisopon
Abstract:
The process of economic development incurs spatial transformation. This spatial alteration also causes environmental impacts, leading to higher pollution. In the case of Thailand, there is still a lack of price-endogenous quantitative analysis incorporating relationships among economic growth, land-use change, and environmental impact. Therefore, this paper aimed at developing a Computable General Equilibrium (CGE) model with the capability of simulating such mutual effects. The developed CGE model has also incorporated the nested constant elasticity of transformation (CET) structure that describes the spatial redistribution mechanism between agricultural land and urban areas. The simulation results showed that a 1% decrease in the availability of agricultural land lowers the value-added of agriculture by 0.036%. Similarly, a 1% reduction in the availability of urban areas can decrease the value-added of the manufacturing and service sectors by 0.05% and 0.047%, respectively. Moreover, the outcomes indicate that increasing farming and urban areas induce higher volumes of solid waste, wastewater, and air pollution. Specifically, a 1% increase in the urban area can increase pollution as follows: (1) solid waste increases by 0.049%, (2) water pollution, indicated by the biochemical oxygen demand (BOD) value, increases by 0.051%, and (3) air pollution, indicated by the volumes of CO₂, N₂O, NOₓ, CH₄, and SO₂, increases within the range of 0.045%–0.051%. In the simulation exploring the sustainable development path, a 1% increase in agricultural land-use efficiency leads to shrinking demand for agricultural land. This is not the case for urban land: a 1% increase in urban land-use efficiency still results in increasing demand for land. Therefore, advanced clean production technology is necessary to align the increasing land-use efficiency with the lowered pollution density.
Keywords: CGE model, CET structure, environmental impact, land use
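To make the CET spatial redistribution mechanism concrete, here is a small Python sketch allocating a fixed land endowment between agricultural and urban use from the revenue-maximizing first-order condition on a CET frontier; the share parameter and the elasticity of transformation are illustrative values, not the calibrated ones of the Thai model.

```python
import numpy as np

def cet_allocation(p_agri, p_urban, land_total=1.0, delta=0.6, sigma_t=2.0):
    """Allocate a fixed land endowment between agricultural and urban use
    under a CET transformation frontier (illustrative parameters).

    sigma_t: elasticity of transformation between the two land uses
    """
    rho = (sigma_t + 1.0) / sigma_t          # CET exponent, rho > 1
    # First-order condition of revenue maximization on the frontier:
    # X_agri / X_urban = [p_agri*(1-delta) / (p_urban*delta)] ** (1/(rho-1))
    ratio = (p_agri * (1.0 - delta) / (p_urban * delta)) ** (1.0 / (rho - 1.0))
    # Scale so the bundle lies on the frontier G(X_a, X_u) = land_total
    x_u = land_total / (delta * ratio**rho + (1.0 - delta)) ** (1.0 / rho)
    x_a = ratio * x_u
    return x_a, x_u

# Rising urban land rents shift land out of agriculture along the frontier
print(cet_allocation(p_agri=1.0, p_urban=1.0))
print(cet_allocation(p_agri=1.0, p_urban=1.5))
```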
Procedia PDF Downloads 231
2437 Expanding the Therapeutic Utility of Curcumin
Authors: Azza H. El-Medany, Hanan H. Hagar, Omnia A. Nayel, Jamila H. El-Medany
Abstract:
In the search for drugs that can target the cancer cell micro-environment as well as halt malignant cellular transformation, the natural dietary phytochemical curcumin was assessed in a DMH-induced colorectal cancer rat model. The study enrolled 50 animals divided into a control group (n=10) and a DMH-induced colorectal cancer control group (n=20) (20 mg/kg body weight for 28 weeks) versus a curcumin-treated group (n=20) (160 mg/kg suspension daily, orally, for a further 8 weeks). Treatment with curcumin succeeded in significantly decreasing the percentage of aberrant crypt foci (ACF) and tended to normalize the histological changes in adenomatous and stromal cells induced by DMH. The drug also significantly elevated GSH and significantly reduced most of the accompanying biochemical elevations (namely MDA, TNF-α, TGF-β and COX2) observed in colonic carcinomatous tissue induced by DMH, thus succeeding in reverting MDA, COX2 and TGF-β back to near-normal levels, as justified by their being non-significantly altered compared to normal controls. The only exception was PAF, which was insignificantly altered by the drug. Taken together, it could be concluded that curcumin possesses the potential to halt some of the orchestrated cross-talk between cancerous transformation and its micro-environmental niche that contributes to cancer initiation, progression and metastasis in this experimental colon cancer model. Extending these merits of a drug with an already established safety profile awaits the final results of current ongoing clinical trials, before curcumin can be added to the new therapeutic armamentarium of anticancer therapy.
Keywords: curcumin, dimethylhydrazine, aberrant crypt foci, malondialdehyde, reduced glutathione, cyclooxygenase-2, tumour necrosis factor-alpha, transforming growth factor-beta, platelet activating factor
Procedia PDF Downloads 297
2436 Evaluation of the Role of Theatre for Development in Combating Climate Change in South Africa
Authors: Isaiah Phillip Smith, Sam Erevbenagie Usadolo, Pamela Theresa Tancsik
Abstract:
This paper is part of ongoing doctoral research that examines the role of Theatre for Development (TfD) in addressing climate change in the Mosuthu community in Reservoir Hills, Durban, South Africa. The context of the research underscores the pressing challenges facing South Africa, including drought, water shortages, land degradation, and civil unrest, which require innovative approaches to climate change mitigation. TfD, described as a dialogical form of theatre that allows communities to express themselves and contribute to development, emerges as a strategic medium for engaging communities in the process. The research problem focuses on the unexamined potential of TfD in promoting community involvement and critical awareness of climate change. The study objectives included assessing the community's understanding of climate change, exploring TfD's potential as a participatory tool, examining its role in community mobilization, and developing recommendations for its effective implementation. A review of relevant literature and preliminary investigations in the research community indicates that TfD is an effective medium for promoting societal transformation and engaging marginalized communities. Through culturally resonant narratives, TfD can instill a deeper understanding of environmental challenges, fostering empathy and motivating behavioural changes. By integrating community voices and cultural elements, TfD serves as a powerful catalyst for promoting climate change awareness and inspiring collective action within the South African context. This research contributes to the global discourse on innovative approaches to climate change awareness and action.
Keywords: TfD, climate change, community involvement, societal transformation, culture
Procedia PDF Downloads 57
2435 Cytotoxic Effect of Biologically Transformed Propolis on HCT-116 Human Colon Cancer Cells
Authors: N. Selvi Gunel, L. M. Oktay, H. Memmedov, B. Durmaz, H. Kalkan Yildirim, E. Yildirim Sozmen
Abstract:
Objective: Propolis, which consists of compounds accepted as antioxidant, antimicrobial, antiseptic, antibacterial, anti-inflammatory, anti-mutagenic, immune-modulating and cytotoxic, is frequently used in current therapeutic applications. However, some preparations result in allergic side effects, causing consumption to be restricted. Previously, our group succeeded in producing a new, less allergenic biotechnological product. In this study, we aim to optimize the production conditions of this biologically transformed propolis and determine the cytotoxic effects of the obtained new products on a colon cancer cell line (HCT-116). Method: Firstly, solid propolis samples were dissolved in water after weighing, grinding and sizing (sieve, 35 mesh) and subjected to 40 kHz/10 min ultrasonication. Samples were prepared by inoculation with Lactobacillus plantarum in two different proportions (2.5% and 3.5%). Chromatographic analyses of propolis were performed on a UPLC-MS/MS (Waters, Milford, MA) system. Results were analysed with the UPLC-MS/MS system MassLynx™ 4.1 software. HCT-116 cells were treated with propolis samples at 25-1000 µg/ml concentrations, and cytotoxicity was measured using the WST-8 assay at 24, 48, and 72 hours. Biologically transformed samples were compared with the non-transformed control group samples. Our experiment groups were formed as follows: untreated (group 1), propolis dissolved in water and ultrasonicated at 40 kHz/10 min (group 2), propolis dissolved in water, ultrasonicated at 40 kHz/10 min and inoculated with the 2.5% L. plantarum L1 strain (group 3), propolis dissolved in water, ultrasonicated at 40 kHz/10 min and inoculated with the 3.5% L. plantarum L3 strain (group 4). The obtained data were calculated with GraphPad Software V5 and analyzed by a two-way ANOVA test followed by a Bonferroni test. Results: The cytotoxic effect of the propolis samples on HCT-116 cells was evaluated. At the 1000 µg/ml concentration, there was a 7.21-fold increase in group 3 compared to group 2, and a 6.66-fold increase in group 3 compared to group 1, at the end of 24 hours. At the end of 48 hours, at the 500 µg/ml concentration, a 4.7-fold increase was determined in group 4 compared to group 3. Likewise, at the 750 µg/ml concentration, a 2.01-fold increase was determined in group 4 compared to group 3, and at the same concentration a 3.1-fold increase in group 4 compared to group 2. Also, at 72 hours, at the 750 µg/ml concentration, a 2.42-fold increase was determined in group 3 compared to group 2, and at the 1000 µg/ml concentration a 2.13-fold increase in group 4 compared to group 2. According to the cytotoxicity results, the group that was ultrasonicated at 40 kHz/10 min and inoculated with the 3.5% L. plantarum L3 strain had the highest cytotoxic effect. Conclusion: It is known that the bioavailability of propolis is halved within six months. The data obtained from our results indicated that biologically transformed propolis had a greater cytotoxic effect on colon cancer cells than the non-transformed group. Consequently, we suggest that L. plantarum transformation both reduces allergenicity and extends the bioavailability period by enhancing healthful polyphenols.
Keywords: bio-transformation, propolis, colon cancer, cytotoxicity
Procedia PDF Downloads 140
2434 A Methodology to Integrate Data in the Company Based on the Semantic Standard in the Context of Industry 4.0
Authors: Chang Qin, Daham Mustafa, Abderrahmane Khiat, Pierre Bienert, Paulo Zanini
Abstract:
Nowadays, companies are facing many challenges in the process of digital transformation, which can be a complex and costly undertaking. Digital transformation involves the collection and analysis of large amounts of data, which can create challenges around data management and governance. Furthermore, companies are also challenged to integrate data from multiple systems and technologies. Despite these pains, companies still pursue digitalization because, by embracing advanced technologies, they can improve efficiency, quality, decision-making, and customer experience while also creating new business models and revenue streams. This paper focuses on the issue that data is stored in data silos with different schemas and structures. The conventional approaches to addressing this issue involve utilizing data warehousing, data integration tools, data standardization, and business intelligence tools. However, these approaches primarily focus on the grammar and structure of the data and neglect the importance of semantic modeling and semantic standardization, which are essential for achieving data interoperability. In this work, the challenge of data silos in Industry 4.0 is addressed by developing a semantic modeling approach compliant with Asset Administration Shell (AAS) models as an efficient standard for communication in Industry 4.0. The paper highlights how our approach can facilitate the data mapping process and semantic lifting according to existing industry standards such as ECLASS and other industrial dictionaries. It also incorporates the Asset Administration Shell technology to model and map the company’s data and utilizes a knowledge graph for data storage and exploration.
Keywords: data interoperability in Industry 4.0, digital integration, industrial dictionary, semantic modeling
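A minimal sketch of the semantic-lifting idea follows, using rdflib to link an Asset Administration Shell property to a dictionary concept via a semanticId and then query across the graph; the namespaces, IRIs, and the ECLASS-style identifier are hypothetical placeholders, not the real AAS or ECLASS vocabularies.

```python
from rdflib import RDF, Graph, Literal, Namespace

# Hypothetical namespaces; real AAS and ECLASS IRIs depend on the deployment
AAS = Namespace("https://example.org/aas/")
ECLASS = Namespace("https://example.org/eclass/")

g = Graph()
shell = AAS["motor-4711"]
prop = AAS["motor-4711/ratedPower"]

# Model one submodel property of an Asset Administration Shell and lift it
# semantically by pointing its semanticId at a dictionary concept
g.add((shell, RDF.type, AAS.AssetAdministrationShell))
g.add((shell, AAS.hasProperty, prop))
g.add((prop, AAS.value, Literal(15.0)))
g.add((prop, AAS.semanticId, ECLASS["0173-1-example"]))

# Data in different silos mapped to the same concept becomes queryable together
query = """SELECT ?p ?v WHERE {
    ?p <https://example.org/aas/semanticId> ?concept ;
       <https://example.org/aas/value> ?v . }"""
for row in g.query(query):
    print(row.p, "->", row.v)
```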
Procedia PDF Downloads 94
2433 Ecological Impacts of Cage Farming: A Case Study of Lake Victoria, Kenya
Authors: Mercy Chepkirui, Reuben Omondi, Paul Orina, Albert Getabu, Lewis Sitoki, Jonathan Munguti
Abstract:
Globally, the decline in capture fisheries as a result of the growing population and increasing awareness of the nutritional benefits of white meat has led to the development of aquaculture. This is anticipated to meet the increasing call for more food for the human population, which is likely to increase further by 2050. Statistics show that more than 50% of the global future fish diet will come from aquaculture. Aquaculture began commercializing some decades ago; this is attributed to technological advancement from traditional to modern culture systems, including cage farming. Cage farming technology has been growing rapidly since its inception in Lake Victoria, Kenya. Currently, over 6,000 cages have been set up in Kenyan waters, and this offers an excellent opportunity for the realization of Kenya’s government strategy to eliminate food insecurity and malnutrition, create employment and promote a Blue Economy. However, as an open farming system, cage farming is likely to emit a large bulk of waste, hence altering the ecosystem integrity of the lake. This occurs through increased chlorophyll-a pigments, alteration of the plankton and macroinvertebrate communities, fish genetic pollution, and transmission of fish diseases and pathogens. Cage farming further increases the nutrient loads, leading to the production of harmful algal blooms, thus negatively affecting aquatic and human life. Despite the ecological transformation, cage farming provides a platform for the achievement of the Sustainable Development Goals of 2030, especially food security and nutrition. Therefore, there is a need for Integrated Multitrophic Aquaculture as part of the Blue Transformation for ecosystem monitoring.
Keywords: aquaculture, ecosystem, blue economy, food security
Procedia PDF Downloads 79
2432 Epilepsy Seizure Prediction by Effective Connectivity Estimation Using Granger Causality and Directed Transfer Function Analysis of Multi-Channel Electroencephalogram
Authors: Mona Hejazi, Ali Motie Nasrabadi
Abstract:
Epilepsy is a persistent neurological disorder that affects more than 50 million people worldwide. Hence, there is a necessity to introduce an efficient prediction model for making a correct diagnosis of the epileptic seizure and an accurate prediction of its type. In this study, we consider how the Effective Connectivity (EC) patterns obtained from intracranial Electroencephalographic (EEG) recordings reveal information about the dynamics of the epileptic brain and can be used to predict imminent seizures, as this will enable patients (and caregivers) to take appropriate precautions. We use this approach because we believe that effective connectivity begins to change near seizures, so seizures can be predicted according to this feature. Results are reported on the standard Freiburg EEG dataset, which contains data from 21 patients suffering from medically intractable focal epilepsy. Six channels of EEG from each patient are considered, and effective connectivity is estimated using the Directed Transfer Function (DTF) and Granger Causality (GC) methods. We concentrate on the standard deviation of effective connectivity over time, and feature changes in five brain frequency sub-bands (alpha, beta, theta, delta, and gamma) are compared. The performance obtained for the proposed scheme in predicting seizures is: the average prediction time is 50 minutes before seizure onset, the maximum sensitivity is approximately 80%, and the false positive rate is 0.33 FP/h. The DTF method is more suitable for predicting epileptic seizures, and generally the stronger results are observed in the gamma and beta sub-bands. The findings of this paper are significantly helpful for clinical applications, especially for exploitation in online portable devices.
Keywords: effective connectivity, Granger causality, directed transfer function, epilepsy seizure prediction, EEG
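For the Granger causality side of the estimation, a minimal sketch with statsmodels on two synthetic channels follows; the channel count, lag order, and coupling strength are illustrative, not the Freiburg data settings.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Two synthetic EEG-like channels in which x drives y with a two-sample lag
rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * x[t - 2] + 0.2 * y[t - 1] + rng.normal(scale=0.5)

# Column convention: the second column is tested as the Granger cause of the first
data = np.column_stack([y, x])
results = grangercausalitytests(data, maxlag=4, verbose=False)
for lag, (tests, _) in results.items():
    print(f"lag {lag}: ssr F-test p-value = {tests['ssr_ftest'][1]:.3g}")
```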
Procedia PDF Downloads 469
2431 Intelligent Materials and Functional Aspects of Shape Memory Alloys
Authors: Osman Adiguzel
Abstract:
Shape-memory alloys are a new class of functional materials with a peculiar property known as the shape memory effect: these alloys return to a previously defined shape on heating after deformation in the low-temperature product phase region. The origin of this phenomenon lies in the fact that the material changes its internal crystalline structure with changing temperature. The shape memory effect is based on martensitic transitions, which govern the remarkable changes in the internal crystalline structure of materials. Martensitic transformation, a solid-state phase transformation, occurs thermally in the material on cooling from the high-temperature parent phase region. This transformation is governed by changes in the crystalline structure of the material. Shape memory alloys cycle between original and deformed shapes at the bulk level on heating and cooling, and can be used as thermal actuators or temperature-sensitive elements due to this property. Martensitic transformations usually occur with the cooperative movement of atoms by means of lattice invariant shears. In the thermally induced case, the ordered parent phase structures turn into twinned structures with this movement in a crystallographic manner. The twinned martensite turns into detwinned or oriented martensite by stressing the material in the low-temperature martensitic phase condition. The detwinned martensite turns into the parent phase structure on first heating (first cycle), and the parent phase structures turn into the twinned and detwinned structures, respectively, in the irreversible and reversible memory cases. On the other hand, shape memory materials are very important and useful in many interdisciplinary fields such as medicine, pharmacy, bioengineering, metallurgy and many engineering fields. The choice of material, as well as of the actuator and sensor to combine with the host structure, is essential to developing the main materials and structures. Copper-based alloys exhibit this property in the metastable beta-phase region, which has bcc-based structures in the high-temperature parent phase field; on cooling, following two ordering reactions, these structures martensitically turn into layered complex structures with lattice twinning. The martensitic transition occurs as self-accommodated martensite with inhomogeneous shears, lattice invariant shears which occur in two opposite <110>-type directions on the {110}-type plane of the austenite matrix, which is the basal plane of martensite. This kind of shear can be called the {110}<110>-type mode and gives rise to the formation of layered structures, like 3R, 9R or 18R, depending on the stacking sequences on the close-packed planes of the ordered lattice. In the present contribution, X-ray diffraction and transmission electron microscopy (TEM) studies were carried out on two copper-based alloys with the following chemical compositions by weight: Cu-26.1%Zn-4%Al and Cu-11%Al-6%Mn. X-ray diffraction profiles and electron diffraction patterns reveal that both alloys exhibit superlattice reflections inherited from the parent phase due to the displacive character of the martensitic transformation. X-ray diffractograms taken over a long time interval show that the locations and intensities of diffraction peaks change with the aging time at room temperature. In particular, some of the successive peak pairs providing a special relation between Miller indices come close to each other.
Keywords: shape memory effect, martensite, twinning, detwinning, self-accommodation, layered structures
Procedia PDF Downloads 426
2430 Classification on Statistical Distributions of a Complex N-Body System
Authors: David C. Ni
Abstract:
Contemporary models for N-body systems are based on temporal, two-body, and mass-point representations of Newtonian mechanics. Other mainstream models include 2D and 3D Ising models based on local neighborhoods of lattice structures. In quantum mechanics, theories of collective modes address superconductivity and long-range quantum entanglement. However, these models still mainly target specific phenomena with a set of designated parameters. We are therefore motivated to develop a new construction directly from complex-variable N-body systems based on the extended Blaschke functions (EBF), which represent a non-temporal and nonlinear extension of the Lorentz transformation on the complex plane, the normalized momentum space. A point on the complex plane represents a normalized state of particle momenta observed from a reference frame in the theory of special relativity. There are only two key parameters for modelling: normalized momentum and nonlinearity. An algorithm similar to the Jenkins-Traub method is adopted for solving the EBF iteratively. Through iteration, the solution sets show a form of σ + i [-t, t], where σ and t are real numbers, and [-t, t] shows various distributions, such as 1-peak, 2-peak, and 3-peak distributions, some of which are analogous to the canonical distributions. The results of the numerical analysis demonstrate continuum-to-discreteness transitions, evolutional invariance of distributions, phase transitions with conjugate symmetry, etc., which manifest the construction as a potential candidate for the unification of statistics. We hereby classify the observed distributions on the finite convergent domains. Continuous and discrete distributions both exist and are predictable for given partitions in different regions of the parameter pair. We further compare these distributions with canonical distributions and address the impacts on existing applications.
Keywords: Blaschke, Lorentz transformation, complex variables, continuous, discrete, canonical, classification
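Since the paper's extended Blaschke functions are not reproduced here, the following Python sketch only illustrates the general idea of iterating a Blaschke-type map on the unit disk of normalized momenta and inspecting where the iterates settle; the factors and starting points are arbitrary choices, not the authors' EBF parameters.

```python
import numpy as np

def blaschke(z, a):
    """Classical Blaschke product; the paper's extended form (EBF) adds a
    nonlinearity parameter that is not reproduced here."""
    out = np.ones_like(z, dtype=complex)
    for ak in a:
        out *= (z - ak) / (1.0 - np.conj(ak) * z)
    return out

# Iterate the map from a few normalized momentum states inside the unit disk
a = np.array([0.3 + 0.2j, -0.5j])                    # arbitrary factors, |a_k| < 1
z = np.array([0.1 + 0.1j, 0.4 - 0.2j, -0.3 + 0.5j])  # arbitrary starting states
for _ in range(200):
    z = blaschke(z, a)
print(z)  # inspect where the iterates settle, e.g. sigma + i*[-t, t] patterns
```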
Procedia PDF Downloads 309
2429 Reading High Rise Residential Development in Istanbul on the Theory of Globalization
Authors: Tuba Sari
Abstract:
One of the major transformations caused by the industrial revolution, technological developments and globalization is undoubtedly the acceleration of the urbanization process. Globalization, in particular, is one of the major factors that trigger this transformation. In this context, as a result of the global metropolitan city system, multifunctional high-rise structural forms are becoming an undeniable fact of the world’s leading metropolises, as a manifestation of prestige and power, with different life choices and easy access to services of the technological era. The scope of the research covers five different urban centers in Istanbul where high-rise housing has been increasing dramatically since the 2000s. Therefore, the research considers the multi-centered urban residential pattern being created by high-rise housing structures in the city. The methodology of the research is based on two main issues: one is related to the sampling method for high-rise housing projects in Istanbul, while the other is based on the model of semantics. Within the framework of the research hypothesis, the aim is to show that the character of vertical intensive structuring in Istanbul is based on the seeking of different forms and images in expressive quality, considering the production of existing high-rise buildings in residential areas in recent years. With respect to the rising discourse of the 'World City' in the globalizing world, it is very important to state the place of Istanbul among other developing world metropolises. From the perspective of the 'World City' discourse, Istanbul has various projects concerning globalization, international finance companies, cultural activities, mega projects, etc. In brief, the aim of this research is to examine the transformation forms of high-rise housing development in Istanbul within the frame of developing world cities, and to search for and analyze the discourse and image related to these projects.
Keywords: globalization, high-rise, housing, image
Procedia PDF Downloads 284
2428 Rethinking Riba in an Agency Theoretic Framework: Islamic Banking and Finance beyond Sophistry
Authors: Muhammad Arsalan
Abstract:
The efficiency of a financial intermediation system is assessed by its ability to achieve allocative efficiency, asset transformation, and the subsequent economic development. Islamic Banking and Finance (IBF) was conceived to serve as an alternate financial intermediation system adherent to the injunctions of Islam. A critical appraisal of the state of contemporary IBF reveals that it neither fulfills the aspirations of Islamic rhetoric nor is efficient in terms of asset transformation and economic development. This paper is an intuitive pursuit to explore the economic rationale of the established principles of IBF, and the reasons for the persistent divergence of IBF being accused of ruses and sophistry. Disentangling the varying viewpoints, the underdevelopment of IBF has been attributed to the misinterpretation of Riba, which has been explicated through a narrow fiqhi and legally deterministic approach. It presents a critical account of how the incorrect conceptualization of the key injunction on Riba steered the flawed institutionalization of an Islamic financial intermediation system. It also emphasizes the wrong interpretation of the ontological and epistemological sources of Islamic Law (primarily Riba), which explains the perennial economic underdevelopment of the Muslim world. Deeming ‘a collaborative and dynamic Ijtihad’ the elixir, this paper insists on the exigency of redefining Riba, i.e., a definition that incorporates the modern modes of economic cooperation and the contemporary financial intermediation ecosystem. Finally, Riba has been articulated in an agency theoretic framework to eschew expropriation of wealth and assure protection of property rights, aimed at realizing the twin goals of a) Shari’ah adherence in true spirit, and b) financial and economic development of the Muslim world.
Keywords: agency theory, financial intermediation, Islamic banking and finance, ijtihad, economic development, Riba, information asymmetry
Procedia PDF Downloads 139
2427 Predicting Football Player Performance: Integrating Data Visualization and Machine Learning
Authors: Saahith M. S., Sivakami R.
Abstract:
In the realm of football analytics, particularly focusing on predicting football player performance, the ability to forecast player success accurately is of paramount importance for teams, managers, and fans. This study introduces an elaborate examination of predicting football player performance through the integration of data visualization methods and machine learning algorithms. The research entails the compilation of an extensive dataset comprising player attributes, followed by data preprocessing, feature selection, model selection, and model training to construct predictive models. The analysis within this study will involve delving into feature significance using methodologies like SelectKBest and Recursive Feature Elimination (RFE) to pinpoint pertinent attributes for predicting player performance. Various machine learning algorithms, including Random Forest, Decision Tree, Linear Regression, Support Vector Regression (SVR), and Artificial Neural Networks (ANN), will be explored to develop predictive models. The evaluation of each model's performance, utilizing metrics such as Mean Squared Error (MSE) and R-squared, will be executed to gauge its efficacy in predicting player performance. Furthermore, this investigation will encompass a top-player analysis to recognize the top-performing players based on the anticipated overall performance scores. Nationality analysis will entail scrutinizing the player distribution based on nationality and investigating potential correlations between nationality and player performance. Positional analysis will concentrate on examining the player distribution across various positions and assessing the average performance of players in each position. Age analysis will evaluate the influence of age on player performance and identify any discernible trends or patterns associated with player age groups. The primary objective is to predict a football player's overall performance accurately based on their individual attributes, leveraging data-driven insights to enrich the comprehension of player success on the field. By amalgamating data visualization and machine learning methodologies, the aim is to furnish valuable tools for teams, managers, and fans to effectively analyze and forecast player performance. This research contributes to the progression of sports analytics by showcasing the potential of machine learning in predicting football player performance and offering actionable insights for diverse stakeholders in the football industry.
Keywords: football analytics, player performance prediction, data visualization, machine learning algorithms, random forest, decision tree, linear regression, support vector regression, artificial neural networks, model evaluation, top player analysis, nationality analysis, positional analysis
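A compact scikit-learn sketch of the pipeline described above follows; synthetic regression data stands in for the player-attribute table, and the feature counts and model settings are arbitrary choices, not the study's configuration.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import RFE, SelectKBest, f_regression
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a player-attribute table (pace, passing, age, ...)
# with an overall performance score as the target
X, y = make_regression(n_samples=2000, n_features=30, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Two feature-selection routes: univariate SelectKBest and model-based RFE
kbest = SelectKBest(f_regression, k=10).fit(X_tr, y_tr)
rfe = RFE(LinearRegression(), n_features_to_select=10).fit(X_tr, y_tr)
print("SelectKBest picks:", np.flatnonzero(kbest.get_support()))
print("RFE picks:", np.flatnonzero(rfe.get_support()))

# Fit one of the candidate models and report MSE / R-squared
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(kbest.transform(X_tr), y_tr)
pred = model.predict(kbest.transform(X_te))
print("MSE:", mean_squared_error(y_te, pred), "R^2:", r2_score(y_te, pred))
```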
Procedia PDF Downloads 38
2426 Numerical Calculation and Analysis of Fine Echo Characteristics of Underwater Hemispherical Cylindrical Shell
Authors: Hongjian Jia
Abstract:
A finite-length cylindrical shell with a spherical cap is a typical engineering approximation model of actual underwater targets. Research on the omni-directional acoustic scattering characteristics of this target model can provide a favorable basis for the detection and identification of actual underwater targets. The elastic resonance characteristics of the target are the result of the combined effect of the target length, shell-thickness ratio and materials. Under the conditions of different materials and geometric dimensions, the coincidence resonance characteristics of the target show obvious differences. Aiming at this problem, this paper obtains the omni-directional acoustic scattering field of the underwater hemispherical cylindrical shell by numerical calculation and studies, in turn, the influence of the target's geometric parameters (length, shell-thickness ratio) and material parameters on its coincidence resonance characteristics. The study found that the formant interval is not a stable value and changes with the incident angle. Among them, the formant interval is less affected by the target length and shell-thickness ratio and is significantly affected by the material properties, which is an effective feature for classifying and identifying targets of different materials. A quadratic polynomial is utilized to fit the relationship between the formant interval and the angle. The results show that the three fitting coefficients of the stainless steel and aluminum targets are significantly different, which can be used as effective feature parameters to characterize the target materials.
Keywords: hemispherical cylindrical shell, fine echo characteristics, geometric and material parameters, formant interval
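The quadratic fit of the formant interval against the incident angle can be reproduced in a few lines; the angle/interval values below are hypothetical placeholders for the computed scattering results.

```python
import numpy as np

# Hypothetical formant-interval measurements versus incident angle (deg);
# the real values would come from the computed omni-directional echo fields
angle = np.array([0.0, 15.0, 30.0, 45.0, 60.0, 75.0, 90.0])
interval = np.array([410.0, 398.0, 372.0, 341.0, 318.0, 305.0, 301.0])

# Quadratic fit interval(theta) ~ c2*theta^2 + c1*theta + c0; the three
# coefficients then serve as material-discriminating features
c2, c1, c0 = np.polyfit(angle, interval, deg=2)
print("fit coefficients:", c2, c1, c0)
print("residuals:", (interval - np.polyval([c2, c1, c0], angle)).round(2))
```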
Procedia PDF Downloads 109
2425 Iris Recognition Based on the Low Order Norms of Gradient Components
Authors: Iman A. Saad, Loay E. George
Abstract:
The iris pattern is an important biological feature of the human body; it has become a very hot topic in both research and practical applications. In this paper, an algorithm is proposed for iris recognition, and a simple, efficient and fast method is introduced to extract a set of discriminatory features using a first-order gradient operator applied to grayscale images. The gradient-based features are robust, to a certain extent, against the variations that may occur in the contrast or brightness of iris image samples; these variations mostly occur due to lighting differences and camera changes. First, the iris region is located; after that, it is remapped to a rectangular area of size 360x60 pixels. Also, a new method is proposed for detecting eyelash and eyelid points; it depends on statistical image analysis to mark the eyelash and eyelid points as noise. In order to cover the feature localization (variation), the rectangular iris image is partitioned into N overlapped sub-images (blocks); then, from each block, a set of different average directional gradient density values is calculated to be used as the texture feature vector. The applied gradient operators are taken along the horizontal, vertical and diagonal directions. The low-order norms of the gradient components were used to establish the feature vector. A Euclidean distance-based classifier was used as the matching metric for determining the degree of similarity between the feature vector extracted from the tested iris image and the template feature vectors stored in the database. Experimental tests were performed using 2639 iris images from the CASIA V4-Interval database; the attained recognition accuracy reached up to 99.92%.
Keywords: iris recognition, contrast stretching, gradient features, texture features, Euclidean metric
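A minimal sketch of the block-wise directional gradient features and Euclidean matching follows; the block layout, overlap, and random test images are assumptions for illustration, not the paper's exact parameters.

```python
import numpy as np

def gradient_feature_vector(iris_rect, n_blocks=6, overlap=0.5):
    """Average absolute gradient densities (horizontal, vertical, diagonal)
    over overlapping blocks of a remapped 360x60 iris strip; block layout
    and overlap are assumed, not the paper's exact settings."""
    h, w = iris_rect.shape
    bw = w // n_blocks                              # block width
    step = max(int(bw * (1.0 - overlap)), 1)        # slide with overlap
    feats = []
    for x0 in range(0, w - bw + 1, step):
        block = iris_rect[:, x0:x0 + bw].astype(float)
        gx = np.abs(np.diff(block, axis=1)).mean()            # horizontal
        gy = np.abs(np.diff(block, axis=0)).mean()            # vertical
        gd = np.abs(block[1:, 1:] - block[:-1, :-1]).mean()   # diagonal
        feats += [gx, gy, gd]
    return np.array(feats)

# Euclidean matching of a probe against stored templates
rng = np.random.default_rng(0)
probe = gradient_feature_vector(rng.integers(0, 256, (60, 360)))
templates = [gradient_feature_vector(rng.integers(0, 256, (60, 360)))
             for _ in range(5)]
dists = [np.linalg.norm(probe - t) for t in templates]
print("best match: template", int(np.argmin(dists)))
```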
Procedia PDF Downloads 335
2424 Cas9-Assisted Direct Cloning and Refactoring of a Silent Biosynthetic Gene Cluster
Authors: Peng Hou
Abstract:
Natural products produced by marine bacteria serve as an immense reservoir for anti-infective drugs and therapeutic agents. Nowadays, heterologous expression of gene clusters of interest has been widely adopted as an effective strategy for natural product discovery. Briefly, the heterologous expression flowchart would be: biosynthetic gene cluster identification, pathway construction and expression, and product detection. However, gene cluster capture using the traditional transformation-associated recombination (TAR) protocol is inefficient (0.5% positive colony rate). To make things worse, most of these putative new natural products are only predicted by bioinformatics analysis such as antiSMASH, and their corresponding natural product biosynthetic pathways are either not expressed or expressed at very low levels under laboratory conditions. Those setbacks have inspired us to focus on seeking new technologies to efficiently edit and refactor biosynthetic gene clusters. Recently, two cutting-edge techniques have attracted our attention: CRISPR-Cas9 and Gibson Assembly. By now, we have tried to pretreat Brevibacillus laterosporus strain genomic DNA with CRISPR-Cas9 nucleases that specifically generated breaks near the gene cluster of interest. This trial resulted in an increase in the efficiency of gene cluster capture (9%). Moreover, using Gibson Assembly to add/delete certain operons and tailoring enzymes regardless of end compatibility, the silent construct (~80 kb) was successfully refactored into an active one, yielding a series of expected analogs. With the appearance of these novel molecular tools, we are confident that the development of a mature, high-throughput pipeline for DNA assembly, transformation, product isolation and identification would no longer be a daydream for marine natural product discovery.
Keywords: biosynthesis, CRISPR-Cas9, DNA assembly, refactor, TAR cloning
Procedia PDF Downloads 282
2423 Facilitating Written Biology Assessment in Large-Enrollment Courses Using Machine Learning
Authors: Luanna B. Prevost, Kelli Carter, Margaurete Romero, Kirsti Martinez
Abstract:
Writing is an essential scientific practice, yet, in several countries, increasing university science class sizes limit the use of written assessments. Written assessments allow students to demonstrate their learning in their own words and permit faculty to evaluate students’ understanding. However, the time and resources required to grade written assessments prohibit their use in large-enrollment science courses. This study examined the use of machine learning algorithms to automatically analyze student writing and provide timely feedback to faculty about students' writing in biology. Written responses to questions about matter and energy transformation were collected from large-enrollment undergraduate introductory biology classrooms. Responses were analyzed using the LightSide text mining and classification software. Cohen’s kappa was used to measure agreement between the LightSide models and human raters. Predictive models achieved agreement with human coding of 0.7 Cohen’s kappa or greater. The models captured that, when writing about matter-energy transformation at the ecosystem level, students focused primarily on the concepts of heat loss, recycling of matter, and conservation of matter and energy. Models were also produced to capture writing about processes such as decomposition and biochemical cycling. The models created in this study can be used to provide automatic feedback about students' understanding of these concepts to biology faculty who desire to use formative written assessments in larger-enrollment biology classes but do not have the time or personnel for manual grading.
Keywords: machine learning, written assessment, biology education, text mining
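Outside LightSide, the same workflow (train a text classifier on human-coded responses, then check model-human agreement with Cohen's kappa) can be sketched with scikit-learn; the toy responses and codes below are invented stand-ins for the classroom data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

# Invented stand-ins for student responses and human codes
# (1 = response mentions conservation/recycling of matter or energy)
texts = ["energy is lost as heat", "matter is recycled by decomposers",
         "plants make their own food", "mass is conserved in the ecosystem"] * 50
codes = [1, 1, 0, 1] * 50

X_tr, X_te, y_tr, y_te = train_test_split(texts, codes, random_state=0)
vec = TfidfVectorizer(ngram_range=(1, 2))
clf = LogisticRegression(max_iter=1000).fit(vec.fit_transform(X_tr), y_tr)
pred = clf.predict(vec.transform(X_te))

# Model-human agreement, analogous to the study's 0.7-kappa threshold
print("Cohen's kappa:", cohen_kappa_score(y_te, pred))
```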
Procedia PDF Downloads 281
2422 Kou Jump Diffusion Model: An Application to the S&P 500, Nasdaq 100 and Russell 2000 Index Options
Authors: Wajih Abbassi, Zouhaier Ben Khelifa
Abstract:
The present research points towards the empirical validation of three option valuation models: the ad-hoc Black-Scholes model as proposed by Berkowitz (2001), the constant elasticity of variance model of Cox and Ross (1976), and the Kou jump-diffusion model (2002). Our empirical analysis has been conducted on a sample of 26,974 options written on three indexes, the S&P 500, Nasdaq 100 and the Russell 2000, that were negotiated during the year 2007, just before the sub-prime crisis. We start by presenting the theoretical foundations of the models of interest. Then we use the trust-region-reflective algorithm to estimate the structural parameters of these models from the cross-section of option prices. The empirical analysis shows the superiority of the Kou jump-diffusion model. This superiority arises from the ability of this model to portray the behavior of market participants and to be closest to the true distribution that characterizes the evolution of these indices. Indeed, the double-exponential distribution covers three interesting properties: the leptokurtic feature, the memoryless property, and the psychological aspect of market participants. Numerous empirical studies have shown that markets tend to have both overreaction and underreaction to good and bad news, respectively. Despite these advantages, there are not many empirical studies based on this model, partly because its probability distribution and option valuation formula are rather complicated. This paper is the first to have used the technique of nonlinear curve-fitting through the trust-region-reflective algorithm and the cross-section of option prices to estimate the structural parameters of the Kou jump-diffusion model.
Keywords: jump-diffusion process, Kou model, leptokurtic feature, trust-region-reflective algorithm, US index options
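For readers unfamiliar with the model, a Monte Carlo sketch of the Kou double-exponential jump diffusion under the risk-neutral measure follows, with the usual jump compensator; the parameters are illustrative, not the values estimated from the 2007 cross-section, and the scheme draws at most one jump per step (a reasonable approximation for small time steps).

```python
import numpy as np

def simulate_kou(S0=100.0, r=0.05, sigma=0.16, lam=1.0, p=0.4, eta1=10.0,
                 eta2=5.0, T=0.5, n_steps=126, n_paths=50_000, seed=0):
    """Terminal prices under the Kou (2002) double-exponential jump diffusion;
    parameters are illustrative, and at most one jump is drawn per step."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    # Jump compensator kappa = E[e^Y] - 1 (requires eta1 > 1)
    kappa = p * eta1 / (eta1 - 1.0) + (1.0 - p) * eta2 / (eta2 + 1.0) - 1.0
    drift = (r - 0.5 * sigma**2 - lam * kappa) * dt
    logS = np.full(n_paths, np.log(S0))
    for _ in range(n_steps):
        dW = rng.normal(scale=np.sqrt(dt), size=n_paths)
        jumped = rng.poisson(lam * dt, size=n_paths) > 0
        up = rng.random(n_paths) < p        # upward jump with probability p
        size = np.where(up, rng.exponential(1.0 / eta1, n_paths),
                        -rng.exponential(1.0 / eta2, n_paths))
        logS += drift + sigma * dW + size * jumped
    return np.exp(logS)

ST = simulate_kou()
K = 100.0
print("MC call price ~", np.exp(-0.05 * 0.5) * np.maximum(ST - K, 0.0).mean())
```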
Procedia PDF Downloads 429
2421 Fake News Detection Based on Fusion of Domain Knowledge and Expert Knowledge
Authors: Yulan Wu
Abstract:
The spread of fake news on social media has posed significant societal harm to the public and the nation, with its threats spanning various domains, including politics, economics, health, and more. News on social media often covers multiple domains, and existing models studied by researchers and relevant organizations often perform well on datasets from a single domain. However, when these methods are applied to social platforms with news spanning multiple domains, their performance significantly deteriorates. Existing research has attempted to enhance the detection performance on multi-domain datasets by adding single-domain labels to the data. However, these methods overlook the fact that a news article typically belongs to multiple domains, leading to the loss of domain knowledge information contained within the news text. To address this issue, research has found that news records in different domains often use different vocabularies to describe their content. In this paper, we propose a fake news detection framework that combines domain knowledge and expert knowledge. First, it utilizes an unsupervised domain discovery module to generate a low-dimensional vector for each news article, representing domain embeddings, which can retain the multi-domain knowledge of the news content. Then, a feature extraction module uses the discovered domain embeddings to guide multiple experts in extracting news knowledge for the total feature representation. Finally, a classifier is used to determine whether the news is fake or not. Experiments show that this approach can improve multi-domain fake news detection performance while reducing the cost of manually labeling domain labels.
Keywords: fake news, deep learning, natural language processing, multiple domains
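A minimal PyTorch sketch of the gating idea, a domain embedding softly weighting several experts over a shared text representation, follows; the dimensions, expert count, and two-class head are assumptions for illustration, not the authors' architecture.

```python
import torch
import torch.nn as nn

class DomainGatedExperts(nn.Module):
    """Sketch: a soft domain embedding gates several experts over a shared
    text representation (dimensions and expert count are assumed)."""
    def __init__(self, text_dim=768, domain_dim=8, n_experts=4, hidden=128):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(text_dim, hidden), nn.ReLU())
             for _ in range(n_experts)])
        self.gate = nn.Sequential(nn.Linear(domain_dim, n_experts),
                                  nn.Softmax(dim=-1))
        self.classifier = nn.Linear(hidden, 2)   # fake vs. real

    def forward(self, text_emb, domain_emb):
        weights = self.gate(domain_emb)                       # (B, n_experts)
        stacked = torch.stack([e(text_emb) for e in self.experts], dim=1)
        fused = (weights.unsqueeze(-1) * stacked).sum(dim=1)  # gated mixture
        return self.classifier(fused)

model = DomainGatedExperts()
logits = model(torch.randn(16, 768), torch.randn(16, 8).softmax(-1))
print(logits.shape)   # torch.Size([16, 2])
```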
Procedia PDF Downloads 73
2420 The Evolution Characteristics of Urban Ecological Patterns in Parallel Range-Valley Areas, China
Authors: Wen Feiming
Abstract:
As the ecological barrier of the Yangtze River, the ecological security of the Parallel Range-Valley area is very important. However, the unique geomorphic features aggravate the contradiction between man and land, resulting in the encroachment of ecological space. In recent years, relevant research has focused on the single fields of land science, ecology and landscape ecology, making it difficult to systematically reflect the distribution regularities and evolution trends of ecological patterns in the process of urban development. Therefore, from the perspective of "Production-Living-Ecological space", using spatial analysis methods such as Remote Sensing (RS) and Geographic Information Systems (GIS), this paper analyzes the evolution characteristics and driving factors of the ecological pattern of mountain towns in the Parallel Range-Valley region in terms of land use structure, change rate, transformation relationship, and spatial correlation. It is concluded that the ecological pattern of mountain towns presents a trend from expansion and diffusion to agglomeration, and that the dynamic spatial transfer shifts from artificial transformation to natural origin, while the driving effect analysis shows the significant characteristics of terrain attraction and construction barriers. Finally, combining the evolution characteristics and the driving mechanism, the evolution modes of "mountain area - concentrated growth", "trough area - diffusion attenuation" and "flat area - concentrated attenuation" are summarized, and differentiated zoning and stratification ecological planning strategies are proposed, in order to provide a theoretical basis for the sustainable development of mountain towns in parallel range-valley areas.
Keywords: parallel range-valley, ecological pattern, evolution characteristics, driving factors
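The land-use transformation relationship can be quantified with a transition matrix between two classified rasters, as in the short sketch below; the class codes and random rasters are placeholders for the real classified maps.

```python
import numpy as np

# Two classified land-use rasters of the same area at different dates;
# class codes are illustrative: 0 = ecological, 1 = agricultural, 2 = built-up
rng = np.random.default_rng(0)
lu_t1 = rng.integers(0, 3, size=(200, 200))
lu_t2 = rng.integers(0, 3, size=(200, 200))

# Transition matrix T[i, j] = number of cells moving from class i to class j
n_classes = 3
T = np.zeros((n_classes, n_classes), dtype=int)
np.add.at(T, (lu_t1.ravel(), lu_t2.ravel()), 1)
print(T)
print("share of ecological land converted:", 1.0 - T[0, 0] / T[0].sum())
```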
Procedia PDF Downloads 104
2419 Relationship between Cinema and Culture: Reel and Real Life in India
Authors: Prachi Chavda
Abstract:
The world, as of today, is smaller than it was for those who lived a few decades ago. The Internet, media and telecommunications have impacted the world like never before. Culture is the pillar upon which a society mushrooms. A culture develops with human creativity over the years and also by the exchange and intermixing of ideas and ways of life across different civilizations, and one of the influential mediums for this exchange and intermixing of ideas is cinema. Cinema has been a wonderful as well as important medium of communication since it emerged. Change is the rule of thumb of life, and so it has been for Indian cinema. As society has evolved from time to time, so have the stories of Indian cinema and its characters; hence, cinema directly affects Indian culture, as it has been a very strong mediator of information exchange. This paper tries to discuss deeply how Indian cinema (reel life) and Indian culture (real life) have been influencing each other, resulting in constant modification of both. Moreover, the research deals with the issue through examples of how, as an outcome, movies impact Indian culture both positively and negatively. Therefore, cinema spreads the wave of change in the cultural settings of society. The paper also tries to shed light on the psychology of the youth of India. Today, children and youth greatly admire the ostentatious materialistic display of outfits and the style of the actors in the movies. Also, movies bearing romanticism and showcasing contentious issues like pre-marital sex, live-in relationships, homosexuality, etc., though without highlighting them extensively, have indeed inspired the common people. Pros and cons always exist. Such revelation of issues certainly gives a spark in the minds of those who are in their formative years, and its effect is seen with the passage of time. Thus, we can speak of the emergence of cinema as a strong tool of social change, as well as of culture as a triggering factor for transformation in cinema. As a finding, we can say that the culture and cinema of India are influencing factors for each other. Cinema and culture are two sides of a coin, each responsible for the evolution of the other.
Keywords: cinema, culture, influence, transformation
Procedia PDF Downloads 396
2418 Numerical Investigation of Entropy Signatures in Fluid Turbulence: Poisson Equation for Pressure Transformation from Navier-Stokes Equation
Authors: Samuel Ahamefula Mba
Abstract:
Fluid turbulence is a complex and nonlinear phenomenon that occurs in various natural and industrial processes. Understanding turbulence remains a challenging task due to its intricate nature. One approach to gaining insights into turbulence is through the study of entropy, which quantifies the disorder or randomness of a system. This research presents a numerical investigation of entropy signatures in fluid turbulence. The aim of the work is to develop a numerical framework to describe and analyse fluid turbulence in terms of entropy. The framework decomposes the turbulent flow field into different scales, ranging from large energy-containing eddies to small dissipative structures, thus establishing a correlation between entropy and other turbulence statistics. This entropy-based framework provides a powerful tool for understanding the underlying mechanisms driving turbulence and its impact on various phenomena. The work necessitates the derivation of the Poisson equation for pressure from the Navier-Stokes equation and the use of Chebyshev finite-difference techniques to resolve it effectively. For the mathematical analysis, we consider bounded domains with smooth solutions and non-periodic boundary conditions. To address this, a hybrid computational approach combining direct numerical simulation (DNS) and Large Eddy Simulation with Wall Models (LES-WM) is utilized to perform extensive simulations of turbulent flows. The potential impact ranges from industrial process optimization to improved prediction of weather patterns.
Keywords: turbulence, Navier-Stokes equation, Poisson pressure equation, numerical investigation, Chebyshev-finite difference, hybrid computational approach, large eddy simulation with wall models, direct numerical simulation
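As a simplified illustration of the pressure Poisson step, here is a uniform-grid finite-difference sketch (a plain stand-in for the Chebyshev discretization) that builds the right-hand side from a 2D incompressible velocity field and relaxes it with Jacobi iteration; the grid size, boundary conditions, and synthetic field are assumptions.

```python
import numpy as np

def pressure_poisson_rhs(u, v, dx, rho=1.0):
    """RHS of the incompressible pressure Poisson equation
    lap(p) = -rho * (u_x**2 + 2*u_y*v_x + v_y**2), via central differences."""
    u_x = np.gradient(u, dx, axis=1); u_y = np.gradient(u, dx, axis=0)
    v_x = np.gradient(v, dx, axis=1); v_y = np.gradient(v, dx, axis=0)
    return -rho * (u_x**2 + 2.0 * u_y * v_x + v_y**2)

def solve_poisson(f, dx, n_iter=5000):
    """Jacobi relaxation with homogeneous Dirichlet walls on a uniform grid."""
    p = np.zeros_like(f)
    for _ in range(n_iter):
        p[1:-1, 1:-1] = 0.25 * (p[1:-1, 2:] + p[1:-1, :-2] + p[2:, 1:-1] +
                                p[:-2, 1:-1] - dx**2 * f[1:-1, 1:-1])
    return p

# Synthetic divergence-free velocity field on a 64x64 grid
n = 64
dx = 1.0 / (n - 1)
xx, yy = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
u = np.sin(np.pi * xx) * np.cos(np.pi * yy)
v = -np.cos(np.pi * xx) * np.sin(np.pi * yy)
p = solve_poisson(pressure_poisson_rhs(u, v, dx), dx)
print("pressure field range:", p.min(), p.max())
```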
Procedia PDF Downloads 94
2417 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings
Authors: Gaelle Candel, David Naccache
Abstract:
t-SNE is an embedding method that the data science community has widely used. It helps with two main tasks: displaying results by coloring items according to the item class or feature value, and, for forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are the structure preservation property and the answer to the crowding problem, where all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, where the cluster area is proportional to its size in number, and relationships between clusters are materialized by closeness on the embedding. This algorithm is non-parametric: the transformation from a high- to a low-dimensional space is described but not learned. Two initializations of the algorithm would lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together. However, this process is costly, as the complexity of t-SNE is quadratic, and it would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of data. While this approach is highly scalable, points could be mapped at the exact same position, making them indistinguishable. This type of model would be unable to adapt to new outliers or concept drift. This paper presents a methodology to reuse an embedding to create a new one, where cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the support embedding match. The embedding-with-support process can be repeated more than once with the newly obtained embedding. The successive embeddings can be used to study the impact of one variable over the dataset distribution or to monitor changes over time. This method has the same complexity as t-SNE per embedding, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity would be reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing one to observe the birth, evolution, and death of clusters. The proposed approach facilitates identifying significant trends and changes, which empowers the monitoring of high-dimensional datasets’ dynamics.
Keywords: concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning
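scikit-learn's TSNE accepts an array as init, which gives a simple way to warm-start a new embedding from an earlier one; the nearest-neighbor initialization below is only a rough stand-in for the paper's support-matching cost, not its actual optimization.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
from sklearn.neighbors import NearestNeighbors

X, _ = load_digits(return_X_y=True)
X_old, X_new = X[:800], X[800:1600]

# Reference embedding of the first batch
emb_old = TSNE(n_components=2, init="pca", random_state=0).fit_transform(X_old)

# Warm-start the second batch at the position of each point's nearest
# neighbor in the first batch (tiny noise separates duplicated positions)
nn = NearestNeighbors(n_neighbors=1).fit(X_old)
_, idx = nn.kneighbors(X_new)
rng = np.random.default_rng(0)
init_new = emb_old[idx[:, 0]] + rng.normal(scale=1e-4, size=(len(X_new), 2))
emb_new = TSNE(n_components=2, init=init_new, random_state=0).fit_transform(X_new)
print(emb_new.shape)   # clusters land near their counterparts in emb_old
```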
Procedia PDF Downloads 144
2416 The Positive Effects of Processing Instruction on the Acquisition of French as a Second Language: An Eye-Tracking Study
Authors: Cecile Laval, Harriet Lowe
Abstract:
Processing Instruction is a psycholinguistic pedagogical approach drawing insights from the Input Processing Model which establishes the initial innate strategies used by second language learners to connect form and meaning of linguistic features. With the ever-growing use of technology in Second Language Acquisition research, the present study uses eye-tracking to measure the effectiveness of Processing Instruction in the acquisition of French and its effects on learner’s cognitive strategies. The experiment was designed using a TOBII Pro-TX300 eye-tracker to measure participants’ default strategies when processing French linguistic input and any cognitive changes after receiving Processing Instruction treatment. Participants were drawn from lower intermediate adult learners of French at the University of Greenwich and randomly assigned to two groups. The study used a pre-test/post-test methodology. The pre-tests (one per linguistic item) were administered via the eye-tracker to both groups one week prior to instructional treatment. One group received full Processing Instruction treatment (explicit information on the grammatical item and on the processing strategies, and structured input activities) on the primary target linguistic feature (French past tense imperfective aspect). The second group received Processing Instruction treatment except the explicit information on the processing strategies. Three immediate post-tests on the three grammatical structures under investigation (French past tense imperfective aspect, French Subjunctive used for the expression of doubt, and the French causative construction with Faire) were administered with the eye-tracker. The eye-tracking data showed the positive change in learners’ processing of the French target features after instruction with improvement in the interpretation of the three linguistic features under investigation. 100% of participants in both groups made a statistically significant improvement (p=0.001) in the interpretation of the primary target feature (French past tense imperfective aspect) after treatment. 62.5% of participants made an improvement in the secondary target item (French Subjunctive used for the expression of doubt) and 37.5% of participants made an improvement in the cumulative target feature (French causative construction with Faire). Statistically there was no significant difference between the pre-test and post-test scores in the cumulative target feature; however, the variance approximately tripled between the pre-test and the post-test (3.9 pre-test and 9.6 post-test). This suggests that the treatment does not affect participants homogenously and implies a role for individual differences in the transfer-of-training effect of Processing Instruction. The use of eye-tracking provides an opportunity for the study of unconscious processing decisions made during moment-by-moment comprehension. The visual data from the eye-tracking demonstrates changes in participants’ processing strategies. Gaze plots from pre- and post-tests display participants fixation points changing from focusing on content words to focusing on the verb ending. This change in processing strategies can be clearly seen in the interpretation of sentences in both primary and secondary target features. This paper will present the research methodology, design and results of the experimental study using eye-tracking to investigate the primary effects and transfer-of-training effects of Processing Instruction. 
It will then provide evidence of the cognitive benefits of Processing Instruction in Second Language Acquisition and offer suggestions for the teaching of grammar in a second language.
Keywords: eye-tracking, language teaching, processing instruction, second language acquisition
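The shift in fixation behaviour described above can be quantified from raw gaze data. The following is a minimal Python sketch, not the study's actual analysis: the Fixation record, the aoi_share helper, the area-of-interest coordinates and the sample fixations are all hypothetical stand-ins for a real eye-tracker export.

```python
# Illustrative sketch only: AOI definitions, data format and values are
# hypothetical, not taken from the study.
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float      # screen x-coordinate (px)
    y: float      # screen y-coordinate (px)
    dur: float    # fixation duration (ms)

def aoi_share(fixations, aoi):
    """Proportion of total fixation time spent inside one AOI rectangle."""
    x0, y0, x1, y1 = aoi
    inside = sum(f.dur for f in fixations if x0 <= f.x <= x1 and y0 <= f.y <= y1)
    total = sum(f.dur for f in fixations)
    return inside / total if total else 0.0

# Hypothetical AOIs over a displayed sentence: the verb-ending region vs.
# a content word (e.g., a temporal adverb) that learners initially rely on.
VERB_ENDING = (420.0, 300.0, 480.0, 330.0)
CONTENT_WORD = (120.0, 300.0, 260.0, 330.0)

pre = [Fixation(140, 310, 250), Fixation(200, 312, 300), Fixation(430, 315, 90)]
post = [Fixation(430, 310, 280), Fixation(455, 312, 260), Fixation(150, 315, 110)]

for label, trial in (("pre", pre), ("post", post)):
    print(label, "verb ending:", round(aoi_share(trial, VERB_ENDING), 2),
          "content word:", round(aoi_share(trial, CONTENT_WORD), 2))
```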
Procedia PDF Downloads 280
2415 Use of Real Time Ultrasound for the Prediction of Carcass Composition in Serrana Goats
Authors: Antonio Monteiro, Jorge Azevedo, Severiano Silva, Alfredo Teixeira
Abstract:
The objective of this study was to compare carcass and in vivo real-time ultrasound (RTU) measurements and their capacity to predict the composition of Serrana goats up to 40% of maturity. Twenty-one females (11.1 ± 3.97 kg) and twenty-one males (15.6 ± 5.38 kg) were used for in vivo measurements with a 5 MHz probe (ALOKA 500V scanner) at the 9th-10th and 10th-11th thoracic vertebrae (uT910 and uT1011, respectively), at the 1st-2nd, 3rd-4th, and 4th-5th lumbar vertebrae (uL12, uL34 and uL45, respectively), and at the 3rd-4th sternebrae. RTU images were recorded of the Longissimus thoracis et lumborum (LTL) muscle depth (EM), width (LM), perimeter (PM) and area (AM), of the subcutaneous fat thickness above the LTL (SFD), and of the depth of the sternum tissues between the 3rd-4th sternebrae (EEST). All RTU images were analyzed using the ImageJ software. After slaughter, the carcasses were stored at 4 ºC for 24 h. The carcasses were then split, and the left half was entirely dissected into muscle, dissected fat (subcutaneous plus intermuscular fat) and bone. Prior to the dissection, measurements equivalent to those obtained in vivo with RTU were recorded. Correlation and regression analyses were performed using Statistica 5. Carcass composition was predicted by a stepwise regression procedure from live weight and the RTU measurements, with and without transformation of the variables to the same dimension. The RTU and carcass measurements, except for the SFD measurements, showed high correlations (r > 0.60, P < 0.001). The RTU measurements together with live weight were able to predict carcass composition in muscle (R2 = 0.99, P < 0.001), subcutaneous fat (R2 = 0.41, P < 0.001), intermuscular fat (R2 = 0.84, P < 0.001), dissected fat (R2 = 0.71, P < 0.001) and bone (R2 = 0.94, P < 0.001). Transformation of the variables allowed a slight increase in precision, except for the prediction of subcutaneous fat, but at the cost of a larger number of variables. In vivo RTU measurements, five of them combined with live weight, can thus be applied to predict kid goat carcass composition.
Keywords: carcass, goats, real time, ultrasound
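As a rough illustration of the stepwise prediction step, the sketch below uses scikit-learn's forward sequential selection in place of the Statistica stepwise procedure; the random data stands in for the 42 animals, the column names merely mirror the RTU labels above, and the trait variable is synthetic.

```python
# Rough sketch of the stepwise prediction step, not the authors' Statistica
# procedure: forward selection of predictors for one carcass trait.
import numpy as np
import pandas as pd
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
cols = ["live_weight", "uT910", "uT1011", "uL12", "uL34", "uL45",
        "EM", "LM", "PM", "AM", "SFD", "EEST"]
X = pd.DataFrame(rng.normal(size=(42, len(cols))), columns=cols)
# Fake "muscle weight" target driven by two predictors plus noise.
y = X["live_weight"] * 0.4 + X["AM"] * 0.3 + rng.normal(scale=0.1, size=42)

# Forward stepwise selection retains a small subset of predictors, echoing
# the "five RTU measurements plus live weight" models reported above.
sfs = SequentialFeatureSelector(LinearRegression(), n_features_to_select=6,
                                direction="forward").fit(X, y)
chosen = X.columns[sfs.get_support()]
model = LinearRegression().fit(X[chosen], y)
print("selected:", list(chosen), " R2:", round(model.score(X[chosen], y), 3))
```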
Procedia PDF Downloads 261
2414 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach
Authors: Gong Zhilin, Jing Yang, Jian Yin
Abstract:
The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The proposed model encapsulates five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw data (input) is pre-processed to enhance its quality by alleviating missing data, noisy data and null values. The pre-processed data are class-imbalanced in nature and are therefore handled with the K-means clustering-based SMOTE model. From the class-balanced data, the most relevant features are extracted: improved Principal Component Analysis (PCA) features, statistical features (mean, median, standard deviation) and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most optimal features are selected with the Self-improved Arithmetic Optimization Algorithm (SI-AOA), a conceptual improvement of the standard Arithmetic Optimization Algorithm. The ensemble classifier combines deep learning models: Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and an optimized Quantum Deep Neural Network (QDNN). The LSTM and CNN are trained with the selected optimal features, and their outcomes enter as input to the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is fine-tuned with the SI-AOA.
Keywords: credit card, data mining, fraud detection, money transactions
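The imbalance-handling phase can be approximated with off-the-shelf tools. The sketch below assumes that the paper's K-means clustering-based SMOTE corresponds to the KMeansSMOTE variant in the imbalanced-learn library and runs it on synthetic data; the clustering parameters are illustrative and would need tuning on real transactions.

```python
# Sketch of the imbalance-handling step only; synthetic data, illustrative
# parameters, and an assumed correspondence to imbalanced-learn's KMeansSMOTE.
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from sklearn.datasets import make_classification
from imblearn.over_sampling import KMeansSMOTE

# Synthetic stand-in for transaction features: ~5% fraud vs ~95% legitimate.
X, y = make_classification(n_samples=5000, n_features=10, weights=[0.95],
                           random_state=0)
print("class counts before:", np.bincount(y))

# K-means first partitions the feature space; SMOTE then oversamples the
# minority (fraud) class inside clusters where it is represented.
smote = KMeansSMOTE(kmeans_estimator=MiniBatchKMeans(n_clusters=5, n_init=3,
                                                     random_state=0),
                    cluster_balance_threshold=0.01, k_neighbors=2,
                    random_state=0)
X_bal, y_bal = smote.fit_resample(X, y)
print("class counts after:", np.bincount(y_bal))
```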
Procedia PDF Downloads 131
2413 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, on the basis of which a web application for structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research, and evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of its structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, and it forms the basis of our application. The reliability assessment process includes the following stages, which are reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition by a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of logical elements with probabilistic elements in the ODNF, yielding a reliability estimation polynomial and a quantitative reliability value; 6) calculation of the "weights" of the system elements. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created that quantitatively assesses the reliability of systems with a complex structure. As a result, structural analysis of systems and the design of systems with optimal structure can be carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, "weights" of elements
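To make stage 5 concrete, the following toy sketch evaluates the reliability of a hypothetical system with shortest success paths {a, b} and {a, c} by summing over disjoint element states, which for small systems is equivalent to evaluating the polynomial obtained from the ODNF; it is not the paper's web application, and the paths and element reliabilities are made up.

```python
# Toy sketch of the reliability-polynomial stage; paths and probabilities
# are hypothetical, not from the paper.
from itertools import product

paths = [{"a", "b"}, {"a", "c"}]       # hypothetical shortest success paths
p = {"a": 0.9, "b": 0.8, "c": 0.7}     # assumed element reliabilities

def reliability(paths, p):
    """Exact reliability by enumerating disjoint element states; for small
    systems this equals evaluating the ODNF polynomial."""
    elems = sorted(set().union(*paths))
    total = 0.0
    for state in product([0, 1], repeat=len(elems)):
        up = {e for e, s in zip(elems, state) if s}
        if any(path <= up for path in paths):   # system works in this state
            prob = 1.0
            for e, s in zip(elems, state):
                prob *= p[e] if s else 1 - p[e]
            total += prob
    return total

# Matches the hand-orthogonalized form ab OR a(not b)c:
# pa*pb + pa*(1-pb)*pc = 0.72 + 0.126 = 0.846
print(reliability(paths, p))
```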
Procedia PDF Downloads 66
2412 Faster Pedestrian Recognition Using Deformable Part Models
Authors: Alessandro Preziosi, Antonio Prioletti, Luca Castangia
Abstract:
Deformable part models (DPM) achieve high precision in pedestrian recognition, but all publicly available implementations are too slow for real-time applications. We implemented a deformable part model algorithm fast enough for real-time use by exploiting information about the camera position and orientation. This implementation is both faster and more precise than alternative DPM implementations. These results are obtained by computing convolutions in the frequency domain and using lookup tables to speed up feature computation. This approach is almost an order of magnitude faster than the reference DPM implementation, with no loss in precision. Knowing the position of the camera with respect to the horizon, it is also possible to prune many hypotheses based on their size and location. The range of acceptable sizes and positions is set by looking at the statistical distribution of bounding boxes in labelled images. With this approach there is no need to compute the entire feature pyramid: for example, higher-resolution features are only needed near the horizon. This results in a 5% increase in mean average precision and a twofold increase in speed. Furthermore, to reduce misdetections involving small pedestrians near the horizon, input images are supersampled near the horizon. Supersampling the image at 1.5 times the original scale results in an increase in precision of about 4%. The implementation was tested against the public KITTI dataset, obtaining an 8% improvement in mean average precision over the best-performing DPM-based method. By allowing for a small loss in precision, computational time can easily be brought down to our target of 100 ms per image, yielding a solution that is faster and still more precise than all publicly available DPM implementations.
Keywords: autonomous vehicles, deformable part model, dpm, pedestrian detection, real time
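The frequency-domain convolution at the core of this speed-up can be sketched as follows; the feature-map shape, filter size and random data are illustrative rather than taken from the paper's implementation.

```python
# Sketch of the frequency-domain trick: scoring a HOG-like feature map
# against one DPM part filter via FFT instead of direct sliding windows.
# Shapes and data are illustrative.
import numpy as np
from scipy.signal import fftconvolve

feature_map = np.random.rand(120, 160, 31)   # HOG features: H x W x 31 bins
part_filter = np.random.rand(6, 6, 31)       # one DPM part filter

# Correlation = convolution with a kernel flipped along every axis; summing
# over the 31 feature channels yields one response map for this filter.
score = fftconvolve(feature_map, part_filter[::-1, ::-1, ::-1],
                    mode="valid")[..., 0]
print(score.shape)  # (115, 155): filter response at each placement
```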
Procedia PDF Downloads 281
2411 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models
Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin
Abstract:
Coordinating service interactions is a vital part of developing distributed applications that are built up as networks of autonomous participants (e.g., software components, web services, online resources) and involve collaboration between a diverse number of participant services on different providers. The complexity of coordinating service interactions reflects how important appropriate techniques and approaches are for designing and coordinating the interaction between participant services, so that the overall goal of the collaboration is achieved. The objective of this research is to develop the capability of steering a complex service interaction towards a desired outcome. Therefore, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using the service choreographies approach and focuses on a declarative approach, advocating an Object Management Group (OMG) standard, the Semantics of Business Vocabulary and Rules (SBVR). This model, namely the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which are particularly useful for coordinating service interactions. The generated SBVR model is then formulated and transformed into an Alloy model, which is verified with the Alloy Analyzer. The transformation of SBVR into Alloy makes it possible to automatically generate the corresponding coordination of service interactions (service choreography), producing an immediate execution instance that satisfies the constraints of the specification and verifying whether a specific request can be realised in the generated choreography.
Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR
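As a simplified illustration of what deontic rules buy, the sketch below validates an interaction trace against obligation and prohibition rules directly in Python; the Rule format, message names and trace are hypothetical, whereas the paper performs this verification via the SBVR-to-Alloy transformation.

```python
# Illustrative sketch only: checking deontic rules over a recorded sequence
# of service interactions. Rule format and trace are hypothetical.
from dataclasses import dataclass

@dataclass
class Rule:
    kind: str       # "obligation" or "prohibition"
    trigger: str    # message that activates the rule
    target: str     # message that must (or must not) follow

def check(trace, rules):
    """Return descriptions of every deontic rule violated by the trace."""
    violations = []
    for rule in rules:
        for i, msg in enumerate(trace):
            if msg != rule.trigger:
                continue
            follows = rule.target in trace[i + 1:]
            if rule.kind == "obligation" and not follows:
                violations.append(f"obligation violated: {rule.target} "
                                  f"must follow {rule.trigger}")
            if rule.kind == "prohibition" and follows:
                violations.append(f"prohibition violated: {rule.target} "
                                  f"after {rule.trigger}")
    return violations

rules = [Rule("obligation", "order", "confirm"),
         Rule("prohibition", "cancel", "ship")]
print(check(["order", "cancel", "ship"], rules))
# Reports both an unmet obligation and a breached prohibition.
```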
Procedia PDF Downloads 155