Search results for: twin therapeutic approach
11366 3D Object Retrieval Based on Similarity Calculation in 3D Computer Aided Design Systems
Authors: Ahmed Fradi
Abstract:
Recent technological advances in the acquisition, modeling, and processing of three-dimensional (3D) object data have led to the creation of models stored in huge databases, which are used in various domains such as computer vision, augmented reality, the game industry, medicine, CAD (computer-aided design), and 3D printing. At the same time, industry benefits from powerful modeling tools that enable designers to produce 3D models easily and quickly. This ease of acquisition and modeling makes it possible to create large 3D model databases, which then become difficult to navigate. The indexing of 3D objects therefore appears as a necessary and promising solution to manage this type of data, extract model information, retrieve an existing model, or calculate the similarity between 3D objects. The objective of the proposed research is to develop a framework allowing easy and fast access to 3D objects in a CAD model database, with a specific indexing algorithm to find objects similar to a reference model. Our main objectives are to study existing methods for calculating the similarity of 3D objects (essentially shape-based methods), specifying the characteristics of each method as well as the differences between them, and then to propose a new approach for indexing and comparing 3D models that is suitable for our case study and based on some of the previously studied methods. The proposed approach is finally illustrated by an implementation and evaluated in a professional context. Keywords: CAD, 3D object retrieval, shape based retrieval, similarity calculation
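As a hedged illustration of the shape-based similarity calculations discussed above (not the authors' indexing algorithm), the sketch below compares two point-sampled 3D models using the classic D2 shape distribution, a histogram of distances between randomly chosen surface points; Python with NumPy and synthetic point clouds are assumed.

```python
import numpy as np

def d2_descriptor(points, n_pairs=10000, bins=64, rng=None):
    """Histogram of distances between random point pairs (D2 shape distribution)."""
    rng = np.random.default_rng(rng)
    i = rng.integers(0, len(points), n_pairs)
    j = rng.integers(0, len(points), n_pairs)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    hist, _ = np.histogram(d / d.max(), bins=bins, range=(0.0, 1.0), density=True)
    return hist / hist.sum()

def similarity(desc_a, desc_b):
    """Simple similarity score: 1 minus half the L1 distance between normalized histograms."""
    return 1.0 - 0.5 * np.abs(desc_a - desc_b).sum()

# Toy example: compare a sphere-like cloud with itself and with a stretched copy.
rng = np.random.default_rng(0)
sphere = rng.normal(size=(2000, 3))
sphere /= np.linalg.norm(sphere, axis=1, keepdims=True)
ellipsoid = sphere * np.array([1.0, 1.0, 3.0])
print(similarity(d2_descriptor(sphere, rng=1), d2_descriptor(sphere, rng=2)))
print(similarity(d2_descriptor(sphere, rng=1), d2_descriptor(ellipsoid, rng=2)))
```

In such a scheme, the descriptors would be precomputed and stored as the index of the CAD database, and retrieval would rank stored models by their similarity score to the query descriptor.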
Procedia PDF Downloads 262
11365 Rank-Based Chain-Mode Ensemble for Binary Classification
Authors: Chongya Song, Kang Yen, Alexander Pons, Jin Liu
Abstract:
In the field of machine learning, the ensemble has been employed as a common methodology to improve the performance upon multiple base classifiers. However, the true predictions are often canceled out by the false ones during consensus due to a phenomenon called “curse of correlation”, which is represented as the strong interferences among the predictions produced by the base classifiers. In addition, the existing practices are still not able to effectively mitigate the problem of imbalanced classification. Based on the analysis of our experimental results, we conclude that the two problems are caused by some inherent deficiencies in the approach of consensus. Therefore, we create an enhanced ensemble algorithm which adopts a designed rank-based chain-mode consensus to overcome the two problems. In order to evaluate the proposed ensemble algorithm, we employ the well-known benchmark data set NSL-KDD (the improved version of the KDDCup99 dataset produced by the University of New Brunswick) to make comparisons between the proposed and 8 common ensemble algorithms. Particularly, each compared ensemble classifier uses the same 22 base classifiers, so that the differences in the improvements in accuracy and reliability over the base classifiers can be truly revealed. As a result, the proposed rank-based chain-mode consensus proves to be a more effective ensemble solution than the traditional consensus approach, outperforming the 8 ensemble algorithms by 20% on almost all compared metrics, which include accuracy, precision, recall, F1-score and area under the receiver operating characteristic curve. Keywords: consensus, curse of correlation, imbalance classification, rank-based chain-mode ensemble
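For context only, a minimal sketch of the traditional consensus baseline that the rank-based chain-mode method is measured against (base classifiers averaged into a single prediction), together with the five metrics listed above; scikit-learn, three base classifiers, and a synthetic imbalanced dataset stand in for the 22 base classifiers and NSL-KDD.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Imbalanced binary problem standing in for NSL-KDD (normal vs. attack).
X, y = make_classification(n_samples=4000, n_features=20, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

base = [LogisticRegression(max_iter=1000), GaussianNB(), RandomForestClassifier(random_state=0)]
probs = np.mean([clf.fit(X_tr, y_tr).predict_proba(X_te)[:, 1] for clf in base], axis=0)
pred = (probs >= 0.5).astype(int)  # traditional consensus: average the base classifiers

for name, score in [("accuracy", accuracy_score(y_te, pred)),
                    ("precision", precision_score(y_te, pred)),
                    ("recall", recall_score(y_te, pred)),
                    ("F1", f1_score(y_te, pred)),
                    ("ROC AUC", roc_auc_score(y_te, probs))]:
    print(f"{name}: {score:.3f}")
```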
Procedia PDF Downloads 138
11364 Integrated Coastal Management for the Sustainable Development of Coastal Cities: The Case of El-Mina, Tripoli, Lebanon
Authors: G. Ghamrawi, Y. Abunnasr, M. Fawaz, S. Yazigi
Abstract:
Coastal cities are constantly exposed to environmental degradation and economic regression fueled by rapid and uncontrolled urban growth as well as continuous resource depletion. This is the case of the City of Mina in Tripoli (Lebanon), where lack of awareness to preserve social, ecological, and historical assets, coupled with the increasing development pressures, are threatening the socioeconomic status of the city residents, the quality of life and accessibility to the coast. To address these challenges, a holistic coastal urban design and planning approach was developed to analyze the environmental, political, legal, and socioeconomic context of the city. This approach aims to investigate the potential of balancing urban development with the protection and enhancement of cultural, ecological, and environmental assets under an integrated coastal zone management approach (ICZM). The analysis of Mina's different sectors adopted several tools that include direct field observation, interviews with stakeholders, analysis of available data, historical maps, and previously proposed projects. The findings from the analysis were mapped and graphically represented, allowing the recognition of character zones that become the design intervention units. Consequently, the thesis proposes an urban, city-scale intervention that identifies 6 different character zones (the historical fishing port, Abdul Wahab island, the abandoned Port Said, Hammam el Makloub, the sand beach, and the new developable area) and proposes context-specific design interventions that capitalize on the main characteristics of each zone. Moreover, the intervention builds on the institutional framework of ICZM as well as other studies previously conducted for the coast and adopts nature-based solutions with hybrid systems for providing better environmental design solutions for developing the coast. This enables the realization of an all-inclusive, well-connected shoreline with easy and free access towards the sea; a developed shoreline with an active local economy, and an improved urban environment.Keywords: blue green infrastructure, coastal cities, hybrid solutions, integrated coastal zone management, sustainable development, urban planning
Procedia PDF Downloads 156
11363 Measuring the Extent of Equalization in Fiscal Transfers in India: An Index-Based Approach
Authors: Ragini Trehan, D.K. Srivastava
Abstract:
In the post-planning era, India’s fiscal transfers from the central to state governments are solely determined by the Finance Commissions (FCs). While in some of the well-established federations such as Australia, Canada, and Germany, equalization serves as the guiding principle of fiscal transfers and is constitutionally mandated, in India, it is not explicitly mandated, and FCs attempt to implement it indirectly by a combination of a formula-based share in the divisible pool of central taxes supplemented by a set of grants. In this context, it is important to measure the extent of equalization that is achieved through FC transfers with a view to improving the design of such transfers. This study uses an index-based methodology for measuring the degree of equalization achieved through FC-transfers covering the period from FC12 to the first year of FC15 spanning from 2005-06 to 2020-21. The ‘Index of Equalization’ shows that the extent of equalization has remained low in the range of 30% to 37% for the four Commission periods under review. The highest degree of equalization at 36.7% was witnessed in the FC12 period and the lowest equalization at 29.5% was achieved during the FC15(1) period. The equalizing efficiency of recommended transfers also shows a consistent fall from 11.4% in the FC12 period to 7.5% by the FC15 (1) period. Further, considering progressivity in fiscal transfers as a special case of equalizing transfers, this study shows that the scheme of per capita total transfers when determined using the equalization approach is more progressive and is characterized by minimal deviations as compared to the profile of transfers recommended by recent FCs.Keywords: fiscal transfers, index of equalization, equalizing efficiency, fiscal capacity, expenditure needs, finance Commission, tax effort
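The abstract does not state the formula of the Index of Equalization, so the following is only a generic, illustrative sketch of the underlying idea: measuring how much transfers reduce the dispersion of per-capita fiscal capacity across states. The dispersion measure (coefficient of variation) and the figures are assumptions, not the authors' methodology.

```python
import numpy as np

def dispersion(x):
    """Coefficient of variation of per-capita resources across states."""
    return x.std() / x.mean()

# Hypothetical per-capita own fiscal capacity and per-capita transfers for five states.
capacity = np.array([100.0, 80.0, 60.0, 45.0, 30.0])
transfers = np.array([5.0, 10.0, 18.0, 25.0, 32.0])

before, after = dispersion(capacity), dispersion(capacity + transfers)
print(f"equalization achieved: {100 * (1 - after / before):.1f}% reduction in dispersion")
```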
Procedia PDF Downloads 74
11362 Systems Approach on Thermal Analysis of an Automatic Transmission
Authors: Sinsze Koo, Benjin Luo, Matthew Henry
Abstract:
In order to increase the performance of an automatic transmission, the automatic transmission fluid must be warmed up to an optimal operating temperature. In a conventional vehicle, cold starts result in friction losses in the gearbox and engine. The stop-and-go nature of city driving dramatically affects the warm-up of engine oil and automatic transmission fluid and delays the time needed to reach an optimal operating temperature. This temperature phenomenon impacts both engine and transmission performance and also increases fuel consumption and CO2 emissions. The aim of this study is to develop know-how of the thermal behavior in order to identify thermal impacts and functional principles in automatic transmissions. Thermal behavior was studied using one-dimensional thermal and flow transport models and simulations developed in GT-Suite. A powertrain of a conventional vehicle was modeled in order to emphasize the thermal phenomena occurring in the various components and how they impact automatic transmission performance. The simulation demonstrates the thermal model of a transmission fluid cooling system and its component parts during warm-up after a cold start. The results of these analyses will support future designs of transmission systems and components in an attempt to obtain better fuel efficiency and transmission performance. Therefore, these thermal analyses could identify ways to improve existing thermal management techniques with prioritization on fuel efficiency. Keywords: thermal management, automatic transmission, hybrid, and systematic approach
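A minimal lumped-capacitance sketch of the warm-up behaviour described above: transmission fluid heated by friction losses after a cold start while losing heat to the ambient. All parameter values are illustrative assumptions, not results of the GT-Suite model.

```python
import numpy as np

# Lumped-capacitance sketch of ATF warm-up after a cold start: heat input from gearbox
# losses minus convective loss to ambient. Parameter values are illustrative assumptions.
m_fluid, cp = 8.0, 2000.0        # kg of ATF, J/(kg K)
UA = 15.0                        # overall heat-loss coefficient to ambient, W/K
T_amb, T_opt = 5.0, 80.0         # ambient and target operating temperature, deg C
P_loss = 2500.0                  # friction/churning losses heating the fluid, W

T, dt, t = T_amb, 1.0, 0.0
while T < T_opt and t < 3600.0:
    dTdt = (P_loss - UA * (T - T_amb)) / (m_fluid * cp)
    T += dTdt * dt
    t += dt
print(f"fluid reaches {T:.1f} deg C after {t / 60:.1f} min")
```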
Procedia PDF Downloads 377
11361 Design and Optimization of Open Loop Supply Chain Distribution Network Using Hybrid K-Means Cluster Based Heuristic Algorithm
Authors: P. Suresh, K. Gunasekaran, R. Thanigaivelan
Abstract:
Radio frequency identification (RFID) technology has been attracting considerable attention with the expectation of improved supply chain visibility for consumer goods, apparel, and pharmaceutical manufacturers, as well as retailers and government procurement agencies. It is also expected to improve the consumer shopping experience by making it more likely that the products they want to purchase are available. Recent announcements from some key retailers have brought interest in RFID to the forefront. A modified K-Means cluster based heuristic approach, a hybrid Genetic Algorithm (GA)-Simulated Annealing (SA) approach, a hybrid K-Means cluster based heuristic-GA, and a hybrid K-Means cluster based heuristic-GA-SA are proposed for the open loop supply chain network problem. The study incorporated a uniform crossover operator and a combined crossover operator in the GAs for solving the open loop supply chain distribution network problem. The algorithms are tested on 50 randomly generated data sets and compared with each other. The results of the numerical experiments show that the hybrid K-Means cluster based heuristic-GA-SA, when tested on the 50 randomly generated data sets, shows superior performance to the other methods for solving the open loop supply chain distribution network problem. Keywords: RFID, supply chain distribution network, open loop supply chain, genetic algorithm, simulated annealing
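Not the authors' hybrid GA-SA algorithm, but a sketch of the K-Means clustering step that such cluster-based heuristics typically start from: grouping customer locations so that each cluster can be served from one candidate distribution centre. Scikit-learn and synthetic coordinates and demands are assumed.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
customers = rng.uniform(0, 100, size=(200, 2))   # (x, y) coordinates of demand points
demand = rng.uniform(1, 10, size=200)            # demand at each point

k = 5  # number of candidate distribution centres
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(customers)

# Candidate depot sites = cluster centroids; each customer is served by its cluster's depot.
total_cost = 0.0
for c in range(k):
    members = km.labels_ == c
    dists = np.linalg.norm(customers[members] - km.cluster_centers_[c], axis=1)
    total_cost += float((dists * demand[members]).sum())  # demand-weighted travel cost
print(f"depots:\n{km.cluster_centers_.round(1)}\napprox. distribution cost: {total_cost:.0f}")
```

In the hybrid schemes named above, an assignment like this would serve as the initial solution that the GA and SA stages then refine.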
Procedia PDF Downloads 165
11360 An Assessment of the Role of Actors in the Medical Waste Management Policy-Making Process of Bangladesh
Authors: Md Monirul Islam, Shahaduz Zaman, Mosarraf H. Sarker
Abstract:
Context: Medical waste management (MWM) is a critical sector in Bangladesh due to its impact on human health and the environment. There is a need to assess the current policies and identify the role of policy actors in the policy formulation and implementation process. Research Aim: The study aimed to evaluate the role of policy actors in the medical waste management policy-making process in Bangladesh, identify policy gaps, and provide actionable recommendations for improvement. Methodology: The study adopted a qualitative research method and conducted key informant interviews. The data collected were analyzed using the thematic coding approach through Atlas.ti software. Findings: The study found that policies are formulated at higher administrative levels and implemented in a top-down approach. Higher-level institutions predominantly contribute to policy development, while lower-level institutions focus on implementation. However, due to negligence, ignorance, and lack of coordination, medical waste management receives insufficient attention from the actors. The study recommends the need for immediate strategies, a comprehensive action plan, regular policy updates, and inter-ministerial meetings to enhance medical waste management practices and interventions. Theoretical Importance: The research contributes to evaluating the role of policy actors in medical waste management policymaking and implementation in Bangladesh. It identifies policy gaps and provides actionable recommendations for improvement. Data Collection: The study used key informant interviews as the data collection method. Thirty-six participants were interviewed, including influential policymakers and representatives of various administrative spheres. Analysis Procedures: The data collected was analyzed using the inductive thematic analysis approach. Question Addressed: The study aimed to assess the role of policy actors in medical waste management policymaking and implementation in Bangladesh. Conclusion: In conclusion, the study provides insights into the current medical waste management policy in Bangladesh, the role of policy actors in policy formulation and implementation, and the need for improved strategies and policy updates. The findings of this study can guide future policy-making efforts to enhance medical waste management practices and interventions in Bangladesh.Keywords: key informant, medical waste management, policy maker, qualitative study
Procedia PDF Downloads 81
11359 Numerical Simulation of Lifeboat Launching Using Overset Meshing
Authors: Alok Khaware, Vinay Kumar Gupta, Jean Noel Pederzani
Abstract:
Lifeboat launching from a marine vessel or offshore platform is one of the important areas of research in offshore applications. With the advancement of computational fluid dynamics (CFD) technology to solve fluid-induced motions coupled with a Six Degree of Freedom (6DOF) rigid body dynamics solver, it is now possible to predict the motion of the lifeboat precisely in different challenging conditions. Traditionally, a dynamic remeshing approach is used to solve this kind of problem, but the remeshing approach has some bottlenecks in maintaining good mesh quality in transient moving-mesh cases. In the present study, an overset method with higher-order interpolation is used to simulate a lifeboat launched from an offshore platform into calm water, and the volume of fluid (VOF) method is used to track the free surface. An overset mesh consists of a set of overlapping component meshes, which allows complex geometries to be meshed with less effort. A good quality mesh with local refinement is generated at the beginning of the simulation and stays unchanged throughout the simulation. Overset mesh accuracy depends on a precise interpolation technique; the present study includes a robust and accurate least squares interpolation method, and the results obtained with the overset mesh show good agreement with experiments. Keywords: computational fluid dynamics, free surface flow, lifeboat launching, overset mesh, volume of fluid
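A hedged sketch of the least squares interpolation idea mentioned above: fitting a linear polynomial to donor-cell values of one component mesh and evaluating it at a receptor (fringe) cell of the overlapping mesh. The donor locations and the pressure-like field are made up for illustration.

```python
import numpy as np

def least_squares_interpolate(donor_xyz, donor_values, receptor_xyz):
    """Fit phi(x, y, z) = a + b*x + c*y + d*z to the donor cells, evaluate at the receptor."""
    A = np.hstack([np.ones((len(donor_xyz), 1)), donor_xyz])
    coeffs, *_ = np.linalg.lstsq(A, donor_values, rcond=None)
    return float(np.concatenate(([1.0], receptor_xyz)) @ coeffs)

# Donor cell centres from the background mesh and a pressure field varying with depth.
donors = np.array([[0.0, 0.0, 0.00], [1.0, 0.0, 0.05], [0.0, 1.0, 0.20],
                   [1.0, 1.0, 0.32], [0.5, 0.5, 0.12]])
values = 101325.0 - 9810.0 * donors[:, 2]          # hydrostatic pressure of water (Pa)
print(least_squares_interpolate(donors, values, np.array([0.4, 0.6, 0.10])))
```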
Procedia PDF Downloads 277
11358 Prediction of Slaughter Body Weight in Rabbits: Multivariate Approach through Path Coefficient and Principal Component Analysis
Authors: K. A. Bindu, T. V. Raja, P. M. Rojan, A. Siby
Abstract:
The multivariate path coefficient approach was employed to study the effects of various production and reproduction traits on the slaughter body weight of rabbits. Information on 562 rabbits maintained at the university rabbit farm attached to the Centre for Advanced Studies in Animal Genetics and Breeding, Kerala Veterinary and Animal Sciences University, Kerala State, India was utilized. The manifest variables used in the study were age and weight of the dam, birth weight, litter size at birth and at weaning, and weight at the first, second and third months. A linear multiple regression analysis was performed with the slaughter weight as the dependent variable and the remaining traits as independent variables. The model explained 48.60 percent of the total variation present in the market weight of the rabbits. Even though the model was significant, the standardized beta coefficients for the independent variables, viz., age and weight of the dam, birth weight and litter sizes at birth and weaning, were less than one, indicating their negligible influence on the slaughter weight. However, the standardized beta coefficient of the second-month body weight was the largest, followed by that of the first-month weight, indicating their major role in determining the market weight. All the other factors exert their influence only indirectly, through these two variables. Hence it was concluded that the slaughter body weight can be predicted using the first- and second-month body weights. Principal components were also developed so as to achieve more accuracy in the prediction of the market weight of rabbits. Keywords: component analysis, multivariate, slaughter, regression
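A sketch of the kind of prediction equation the abstract arrives at: an ordinary least squares fit of slaughter weight on the first- and second-month body weights, with standardized beta coefficients. The data are synthetic and the coefficients illustrative; they are not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 562                                   # records, matching the number in the study
w1 = rng.normal(450, 60, n)               # first-month weight (g), hypothetical values
w2 = 2.1 * w1 + rng.normal(0, 80, n)      # second-month weight (g)
slaughter = 0.4 * w1 + 1.3 * w2 + rng.normal(0, 120, n)

X = np.column_stack([np.ones(n), w1, w2])
beta, *_ = np.linalg.lstsq(X, slaughter, rcond=None)

# Standardized betas: coefficients after scaling predictors and response to unit variance.
std_beta = beta[1:] * np.array([w1.std(), w2.std()]) / slaughter.std()
pred = X @ beta
r2 = 1 - ((slaughter - pred) ** 2).sum() / ((slaughter - slaughter.mean()) ** 2).sum()
print(f"standardized betas (month 1, month 2): {std_beta.round(3)},  R^2 = {r2:.3f}")
```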
Procedia PDF Downloads 165
11357 A Comparative Soft Computing Approach to Supplier Performance Prediction Using GEP and ANN Models: An Automotive Case Study
Authors: Seyed Esmail Seyedi Bariran, Khairul Salleh Mohamed Sahari
Abstract:
In multi-echelon supply chain networks, optimal supplier selection significantly depends on the accuracy of suppliers’ performance prediction. Different methods of multi-criteria decision making, such as ANN, GA, fuzzy methods, AHP, etc., have previously been used to predict supplier performance, but the “black-box” characteristic of these methods is still a major concern to be resolved. Therefore, the primary objective of this paper is to implement an artificial intelligence-based gene expression programming (GEP) model and compare its prediction accuracy with that of an ANN. A full factorial design with a 95% confidence interval is initially applied to determine the appropriate set of criteria for supplier performance evaluation. A train-test approach is then utilized separately for the ANN and the GEP. The training results are used to find the optimal network architecture, and the testing data determine the prediction accuracy of each method based on the measures of root mean square error (RMSE) and correlation coefficient (R²). The results of a case study conducted at Supplying Automotive Parts Co. (SAPCO), with more than 100 local and foreign supply chain members, revealed that, in comparison with the ANN, gene expression programming has a significant advantage in predicting supplier performance, judging by the respective RMSE and R-squared values. Moreover, using GEP, a mathematical function was also derived to resolve the issue of the ANN’s black-box structure in modeling the performance prediction. Keywords: Supplier Performance Prediction, ANN, GEP, Automotive, SAPCO
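A short sketch of the two comparison measures named above, RMSE and R², applied to hypothetical test-set predictions standing in for GEP and ANN outputs; the numbers are invented for illustration.

```python
import numpy as np

def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def r_squared(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Hypothetical supplier-performance scores on the test set.
observed = np.array([72.0, 85.0, 63.0, 90.0, 78.0, 55.0, 81.0, 69.0])
pred_gep = np.array([73.5, 83.0, 64.0, 88.5, 79.0, 57.0, 80.0, 70.5])
pred_ann = np.array([70.0, 88.0, 60.0, 93.0, 74.0, 59.0, 84.0, 66.0])

for name, pred in [("GEP", pred_gep), ("ANN", pred_ann)]:
    print(f"{name}: RMSE = {rmse(observed, pred):.2f}, R^2 = {r_squared(observed, pred):.3f}")
```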
Procedia PDF Downloads 419
11356 Management in Health Education Process among Spa Resorts in Poland
Authors: J. Wozniak-Holecka, T. Holecki, P. Romaniuk
Abstract:
Spa facilities are being perceived as the ways of healing treatment in Poland and are guaranteed within the public financing. The universal health insurance (National Health Fund, NFZ), and the disability prevention programme held by Social Insurance Institution (ZUS) are the main sources of financing spa facilities. The dominant public payer of spa services is the NFZ. The Social Insurance Institution covers the cost of health treatment realized in spa facilities as medical rehabilitation, in the field of disability prevention. Health services delivered in the spa resorts are characterized by complexity, and the combination of various methods, typical for health prevention, education, balneotherapy, and physiotherapy. Healing with natural methods, believed to enhance the therapeutic effect, is also involved in health spa treatment. Regardless of the type of facility, each form of spa treatment includes health promotion, health education, prevention at all levels, including rehabilitation. The aim of the study was to determine the optimal organization of health education process. Its efficiency strongly depends on the type of service provider and the funding institution (NFZ vs ZUS). It results from the use of different measures of the effectiveness, the quality and the evaluation of the process being assessed by funding institutions. The methods of the study include a comparative and descriptive quantitative and qualitative analysis. In the empirical part, a questionnaire had been developed. It was then distributed among spa personnel, responsible directly for the health promotion, and among patients who are beneficiaries of health services in spa centers. The quantitative part of the study was based on interviews carried with the use of the online survey (CAWI: Computer-Assisted Web Interview), telephone survey (CATI: Computer-Assisted Telephone Interview) and a conventional questionnaire (PAPI: Paper over Pencil Interview). As a result of the conducted research, it was found that the effectiveness of health education activities in spa resort facilities in Poland is higher when the services are organized using structured tools for managerial control. This applies to formalized procedures implemented by one of the dominant payers covering costs of services (ZUS) and involves the application of health education as one of the mandatory elements of treatment, subjected to the process of control during the course of spa therapy and evaluation after it is completed.Keywords: effectiveness, health education, public health system, spa treatment
Procedia PDF Downloads 142
11355 Gas Network Noncooperative Game
Authors: Teresa Azevedo PerdicoúLis, Paulo Lopes Dos Santos
Abstract:
The conceptualisation of the network optimisation problem as a noncooperative game sets up a holistic interactive approach that brings together different network features (e.g., compressor stations, sources, and pipelines, in the gas context) whose optimisation objectives are different, so that a single optimisation procedure becomes possible without having to feed results from diverse software packages into each other. A mathematical model of this type, where independent entities take action, offers the ideal modularity and subsequent problem decomposition with a view to designing a decentralised algorithm to optimise the operation and management of the network. In a game framework, compressor stations and sources are understood as players which communicate through network connectivity constraints, namely the pipeline model. That is, in a scheme similar to tâtonnement, the players choose their best settings and then interact to check for network feasibility. The degree of network infeasibility passed back informs the players about the ’quality’ of their settings, and this two-phase iterative scheme is repeated until a global optimum is obtained. Due to network transients, the optimisation needs to be assessed at different points of the control interval. For this reason, the proposed approach has two stages: (i) the first stage computes along the whole period of optimisation in order to fulfil the requirement just mentioned; (ii) the second stage is initialised with the solution found at the first stage and computes at the end of the period of optimisation to rectify that solution. The viability of the proposed scheme is demonstrated on an abstract prototype and three example networks. Keywords: connectivity matrix, gas network optimisation, large-scale, noncooperative game, system decomposition
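A toy, heavily simplified illustration of the two-phase tâtonnement-like scheme described above: each player computes its own best setting, the shared capacity constraint is checked, and the degree of infeasibility is fed back as a penalty until a feasible operating point is reached. The payoffs and the capacity value are assumptions, not the paper's gas network model.

```python
# Toy version of the two-phase iterative scheme: two "players" (e.g., a source and a
# compressor station) choose their own best settings, the shared pipeline capacity
# constraint is checked, and the degree of infeasibility is fed back as a penalty.

def best_response(gain, penalty):
    # Each player maximises a concave payoff gain*q - 0.2*q^2 - penalty*q over a grid.
    grid = [q / 10.0 for q in range(0, 101)]
    return max(grid, key=lambda q: gain * q - 0.2 * q * q - penalty * q)

capacity, penalty = 12.0, 0.0        # shared pipeline capacity (arbitrary units)
for it in range(100):
    q1 = best_response(gain=3.0, penalty=penalty)
    q2 = best_response(gain=2.0, penalty=penalty)
    infeasibility = max(0.0, q1 + q2 - capacity)   # the network reports the settings' "quality"
    if infeasibility == 0.0:
        break
    penalty += 0.5 * infeasibility
print(f"converged after {it + 1} iteration(s): settings ({q1}, {q2}), total flow {q1 + q2:.1f}")
```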
Procedia PDF Downloads 152
11354 The Role of the University of Zululand in Documenting and Disseminating Indigenous Knowledge, in KwaZulu-Natal, South Africa
Authors: Smiso Buthelezi, Petros Dlamini, Dennis Ocholla
Abstract:
The study assesses the University of Zululand's practices for documenting, sharing, and accessing indigenous knowledge. Two research objectives guided it: to determine how indigenous knowledge (IK) is developed at the University of Zululand and how indigenous knowledge (IK) is documented at the University of Zululand. The study adopted both interpretive and positivist research paradigms. Ultimately, qualitative and quantitative research methods were used. The qualitative research approach collected data from academic and non-academic staff members. Interviews were conducted with 18 academic staff members and 5 with support staff members. The quantitative research approach was used to collect data from indigenous knowledge (IK) theses and dissertations from the University of Zululand Institutional Repository between 2009-2019. The study results revealed that many departments across the University of Zululand were involved in creating indigenous knowledge (IK)-related content. The department of African Languages was noted to be more involved in creating IK-related content. Moreover, the documentation of the content related to indigenous knowledge (IK) at the University of Zululand is done frequently but is not readily known. It was found that the creation and documentation of indigenous knowledge by different departments faced several challenges. The common challenges are a lack of interest among indigenous knowledge (IK) owners in sharing their knowledge, the local language as a barrier, and a shortage of proper tools for recording and capturing indigenous knowledge (IK). One of the study recommendations is the need for an indigenous knowledge systems (IKS) policy to be in place at the University of Zululand.Keywords: knowledge creation, SECI model, information and communication technology., indigenous knowledge
Procedia PDF Downloads 112
11353 Effect of Total Body Irradiation for Metastatic Lymph Node and Lung Metastasis in Early Stage
Authors: Shouta Sora, Shizuki Kuriu, Radhika Mishra, Ariunbuyan Sukhbaatar, Maya Sakamoto, Shiro Mori, Tetsuya Kodama
Abstract:
Lymph node (LN) metastasis accounts for 20 - 30 % of all deaths in patients with head and neck cancer. Therefore, the control of metastatic lymph nodes (MLNs) is necessary to improve the life prognosis of patients with cancer. In a classical metastatic theory, tumor cells are thought to metastasize hematogenously through a bead-like network of lymph nodes. Recently, a lymph node-mediated hematogenous metastasis theory has been proposed, in which sentinel LNs are regarded as a source of distant metastasis. Therefore, the treatment of MLNs at the early stage is essential to prevent distant metastasis. Radiation therapy is one of the primary therapeutic modalities in cancer treatment. In addition, total body irradiation (TBI) has been reported to act as activation of natural killer cells and increase of infiltration of CD4+ T-cells to tumor tissues. However, the treatment effect of TBI for MLNs remains unclear. This study evaluated the possibilities of low-dose total body irradiation (L-TBI) and middle-dose total body irradiation (M-TBI) for the treatment of MLNs. Mouse breast cancer FM3A-Luc cells were injected into subiliac lymph node (SiLN) of MXH10/Mo/LPR mice to induce the metastasis to the proper axillary lymph node (PALN) and lung. Mice were irradiated for the whole body on 4 days after tumor injection. The L-TBI and M-TBI were defined as irradiations to the whole body at 0.2 Gy and 1.0 Gy, respectively. Tumor growth was evaluated by in vivo bioluminescence imaging system. In the non-irradiated group, tumor activities on SiLN and PALN significantly increased over time, and the metastasis to the lung from LNs was confirmed 28 days after tumor injection. The L-TBI led to a tumor growth delay in PALN but did not control tumor growth in SiLN and metastasis to the lung. In contrast, it was found that the M-TBI significantly delayed the tumor growth of both SiLN and PALN and controlled the distant metastasis to the lung compared with non-irradiated and L-TBI groups. These results suggest that the M-TBI is an effective treatment method for MLNs in the early stage and distant metastasis from lymph nodes via blood vessels connected with LNs.Keywords: metastatic lymph node, lung metastasis, radiation therapy, total body irradiation, lymphatic system
Procedia PDF Downloads 181
11352 A Program of Data Analysis on the Possible State of the Antibiotic Resistance in Bangladesh Environment in 2019
Authors: S. D. Kadir
Abstract:
Background: Antibiotics have always been at the centre of the revolution of modern microbiology. Micro-organisms and their pathogenicity, resistant organisms, and the inappropriate use or overuse of various types of antibiotic agents have fuelled multidrug-resistant pathogenic organisms. Our review mainly focuses on the therapeutic state of antibiotic resistance and the possible roots behind its development in Bangladesh in 2019. Methodology: The systematic review progressed through a series of analyses of manuscripts published on Google Scholar, PubMed, and ResearchGate, and collected relevant information from established healthcare and diagnostic centres and their subdivisions all over Bangladesh. Our analysis of the possible state of antibiotic resistance was based on selected medical reports and on random assays of the extent of resistance to individual antibiotics in 2019. Results: 5 research articles and 50 medical report summaries were reviewed, and around 5 patients were interviewed during the estimation process. We prioritized research articles in which the analysis had been performed using the Kirby-Bauer method, which is preferred as it provides greater efficiency, lower cost, and greater convenience and simplicity in application. In most of the reviews, Clinical and Laboratory Standards Institute guidelines were strictly followed. Most of the reports indicate significant resistance to beta-lactam drugs, specifically the derivatives of penicillins and cephalosporins (rare use of first-generation cephalosporins, overuse of second- and third-generation cephalosporins, and misuse of fourth-generation cephalosporins), which are responsible for almost 67 percent of the bacterial resistance. Moreover, approximately 20 percent of the resistance was due to drug efflux from the bacterial cell in the case of tetracyclines and sulphonamides and their derivatives. Conclusion: Approximately 90 percent of the antibiotic resistance is due to the usage of relative and true broad-spectrum antibiotics. The excessive usage of broad-spectrum antibiotics has led to the disruption of native bacteria and a cascade of antimicrobial resistance disturbing the surrounding environment, leading to a state of super-infection. Keywords: antibiotics, antibiotic resistance, Kirby Bauer method, microbiology
Procedia PDF Downloads 120
11351 Video Compression Using Contourlet Transform
Authors: Delara Kazempour, Mashallah Abasi Dezfuli, Reza Javidan
Abstract:
Video compression is used for channels with limited bandwidth and for storage devices with limited capacity. One of the most popular approaches in video compression is the use of different transforms. The discrete cosine transform is one such method, but it suffers from problems such as blocking, noise, and high distortion that adversely affect the compression ratio. The wavelet transform is another approach that is better than cosine transforms at balancing compression and quality, but its ability to capture curved contours is limited. Because of the importance of compression and the problems of the cosine and wavelet transforms, the contourlet transform has become popular in video compression. In the newly proposed method, we use the contourlet transform for video image compression. The contourlet transform preserves image details better than the previous transforms because it is multi-scale and directional, and it can capture discontinuities such as edges, so this approach loses less data than previous approaches. The contourlet transform captures the discrete spatial structure and is useful for representing two-dimensional smooth images; it produces compressed images with a high compression ratio along with texture and edge preservation. Finally, the results show that, for the majority of the images, the mean square error and peak signal-to-noise ratio of the new contourlet-based method are improved compared to the wavelet transform, but for most of the images, the mean square error and peak signal-to-noise ratio of the cosine transform are better than those of the contourlet-based method. Keywords: video compression, contourlet transform, discrete cosine transform, wavelet transform
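A sketch of the two quality measures the abstract compares, mean square error and peak signal-to-noise ratio, computed between an original frame and a reconstructed one; the frames are synthetic, and the contourlet transform itself is not implemented here.

```python
import numpy as np

def mse(original, reconstructed):
    return float(np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2))

def psnr(original, reconstructed, peak=255.0):
    err = mse(original, reconstructed)
    return float("inf") if err == 0 else 10.0 * np.log10(peak ** 2 / err)

# Stand-in 8-bit frame and a "compressed" version with quantization-like noise.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
recon = np.clip(frame.astype(np.int16) + rng.integers(-4, 5, size=frame.shape), 0, 255).astype(np.uint8)

print(f"MSE = {mse(frame, recon):.2f}, PSNR = {psnr(frame, recon):.2f} dB")
```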
Procedia PDF Downloads 444
11350 Disaster Management Approach for Planning an Early Response to Earthquakes in Urban Areas
Authors: Luis Reynaldo Mota-Santiago, Angélica Lozano
Abstract:
Determining appropriate measures to face earthquakes is a challenge for practitioners. In the literature, some analyses consider disaster scenarios while disregarding important field characteristics. Sometimes, software that allows estimating the number of victims and the infrastructure damage is used. Other times, historical information on previous events is used, or the scenario information is assumed to be available even though this is not usual in practice. Humanitarian operations start immediately after an earthquake strikes, and the first hours of relief efforts are important; local efforts are critical to assess the situation and deliver relief supplies to the victims. One preparation action is prepositioning stockpiles, most of them at central warehouses placed away from damage-prone areas, which requires large facilities and budgets. Usually, the decisions in the first 12 hours (the standard relief time, SRT) after the disaster are the location of temporary depots and the design of distribution paths. The motivation for this research was the delay in the reaction time of the early relief efforts, which generated the late arrival of aid to some areas after the 7.1 magnitude earthquake in Mexico City in 2017. Hence, a preparation approach for planning the immediate response to earthquake disasters is proposed, intended for local governments and considering their capabilities for planning and for responding during the SRT, in order to reduce the start-up time of immediate response operations in urban areas. The first steps are the generation and analysis of disaster scenarios, which allow estimating the relief demand before and in the early hours after an earthquake. The scenarios can be based on historical data and/or the seismic hazard analysis of an Atlas of Natural Hazards and Risk, as a way to address the limited or null available information. The following steps include the decision processes for: a) locating local depots (places for prepositioning stockpiles) and aid-giving facilities as close as possible to risk areas; and b) designing the vehicle paths for aid distribution (from local depots to the aid-giving facilities), which can be used at the beginning of the response actions. This approach allows speeding up the delivery of aid in the early moments of the emergency, which could reduce the suffering of the victims and allow additional time to integrate a broader and more streamlined response (according to new information) from national and international organizations into these efforts. The proposed approach is applied to two case studies in Mexico City. These areas were affected by the 2017 earthquake and had a limited aid response. The approach generates disaster scenarios in an easy way and plans a faster early response with a small quantity of stockpiles which can be managed in the early hours of the emergency by local governments. Considering long-term storage, the estimated quantities of stockpiles require a limited budget to maintain and a small storage space. These stockpiles are also useful for addressing other kinds of emergencies in the area. Keywords: disaster logistics, early response, generation of disaster scenarios, preparation phase
Procedia PDF Downloads 110
11349 The Functional-Engineered Product-Service System Model: An Extensive Review towards a Unified Approach
Authors: Nicolas Haber
Abstract:
The study addresses the design process of integrated product-service offerings as a measure of answering environmental sustainability concerns by replacing stand-alone physical artefacts with comprehensive solutions relying on functional results rather than conventional product sales. However, views regarding this transformation are dissimilar and differentiated: The study discusses the importance and requirements of product-service systems before analysing the theoretical studies accomplished in the extent of their design and development processes. Based on this, a framework, built on a design science approach, is proposed, where the distinct approaches from the literature are merged towards a unified structure serving as a generic methodology to designing product-service systems. Each stage of this model is then developed to present a holistic design proposal called the Functional Engineered Product-Service System (FEPSS) model. Product-service systems are portrayed as customisable solutions tailored to specific settings and defined circumstances. Moreover, the approaches adopted to guide the design process are diversified. A thorough analysis of the design strategies and development processes however, allowed the extraction of a design backbone, valid to varied situations and contexts whether they are product-oriented, use-oriented or result-oriented. The goal is to guide manufacturers towards an eased adoption of these integrated offerings, given their inherited environmental benefits, by proposing a robust all-purpose design process.Keywords: functional product, integrated product-service offerings, product-service systems, sustainable design
Procedia PDF Downloads 293
11348 Coarse-Grained Computational Fluid Dynamics-Discrete Element Method Modelling of the Multiphase Flow in Hydrocyclones
Authors: Li Ji, Kaiwei Chu, Shibo Kuang, Aibing Yu
Abstract:
Hydrocyclones are widely used to classify particles by size in industries such as mineral processing and chemical processing. The particles to be handled usually have a broad range of size distributions, and sometimes density distributions, which have to be properly considered, causing challenges in the modelling of hydrocyclones. The combined approach of Computational Fluid Dynamics (CFD) and the Discrete Element Method (DEM) offers a convenient way to model particle size/density distributions. However, its direct application to hydrocyclones is computationally prohibitive because there are billions of particles involved. In this work, a CFD-DEM model based on the concept of the coarse-grained (CG) model is developed to model the solid-fluid flow in a hydrocyclone. The DEM is used to model the motion of discrete particles by applying Newton’s laws of motion. Here, a particle assembly containing a certain number of particles with the same properties is treated as one CG particle. The CFD is used to model the liquid flow by numerically solving the locally averaged Navier-Stokes equations, combined with the Volume of Fluid (VOF) model to capture the air core. The results are analyzed in terms of fluid and solid flow structures, and particle-fluid, particle-particle and particle-wall interaction forces. Furthermore, the calculated separation performance is compared with measurements. The results obtained from the present study indicate that this approach can offer an alternative way to examine the flow and performance of hydrocyclones. Keywords: computational fluid dynamics, discrete element method, hydrocyclone, multiphase flow
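A hedged sketch of the DEM side of one coarse-grained time step: a single CG particle, standing for an assembly of real particles, advanced with Newton's second law under gravity, buoyancy, and a simple Stokes drag from the local fluid velocity. The property values, the drag law, and the CG scaling are illustrative assumptions; the full CFD coupling and VOF air-core capture are omitted.

```python
import numpy as np

# One coarse-grained (CG) particle standing in for an assembly of identical real particles.
d_real, rho_p, rho_f, mu = 50e-6, 2650.0, 1000.0, 1.0e-3   # particle/fluid properties (SI)
scale = 10.0                                               # CG ratio: d_cg = scale * d_real
d_cg = scale * d_real
m_cg = rho_p * np.pi / 6.0 * d_cg ** 3                     # mass of the CG particle

x = np.array([0.0, 0.0, 0.0])          # position (m)
v = np.array([0.0, 0.0, 0.0])          # velocity (m/s)
g = np.array([0.0, 0.0, -9.81])
u_fluid = np.array([0.05, 0.0, 0.02])  # local fluid velocity interpolated from the CFD cell

dt = 1e-4
for _ in range(1000):
    drag = 3.0 * np.pi * mu * d_cg * (u_fluid - v)         # simple Stokes drag on the CG particle
    buoyancy = -rho_f / rho_p * m_cg * g                   # Archimedes force (upward)
    a = g + (drag + buoyancy) / m_cg                       # Newton's second law
    v = v + a * dt
    x = x + v * dt
print(f"position after {1000 * dt:.2f} s: {x.round(4)}, velocity: {v.round(4)}")
```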
Procedia PDF Downloads 407
11347 Lagrangian Approach for Modeling Marine Litter Transport
Authors: Sarra Zaied, Arthur Bonpain, Pierre Yves Fravallo
Abstract:
The continuous supply of marine litter leads to its accumulation in the oceans, producing increasingly compact layers of waste. Its spatio-temporal distribution is never homogeneous and depends mainly on the hydrodynamic characteristics of the environment and on the size and location of the waste. As part of optimizing the collection of marine plastic waste, it is important to measure and monitor its evolution over time. For this reason, many research studies have been dedicated to describing the behavior of the waste in order to identify its accumulation in ocean areas. Several models have therefore been developed to understand the mechanisms that drive the accumulation and displacement of marine litter. These models are able to accurately simulate the drift of waste in order to study its behavior and stranding. However, these works aim to study waste behavior over a long period of time and not at the time of waste collection. This work investigates the transport of floating marine litter (FML) to provide basic information that can help in optimizing waste collection, by proposing a model for predicting its behavior during collection. The proposed study is based on a Lagrangian modeling approach that uses the main factors influencing the dynamics of the waste. The performance of the proposed method was assessed on real data collected from the Copernicus Marine Environment Monitoring Service (CMEMS). Evaluation results in the Java Sea (Indonesia) show that the proposed model can effectively predict the position and velocity of marine waste during collection. Keywords: floating marine litter, lagrangian transport, particle-tracking model, wastes drift
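A minimal sketch of the Lagrangian particle-tracking idea described above: floating litter particles advected by a surface current plus a windage term, integrated with explicit Euler steps. The velocity fields are synthetic stand-ins for the CMEMS data used in the study.

```python
import numpy as np

def current(x, y):
    """Synthetic surface current (m/s); in practice interpolated from CMEMS fields."""
    return np.stack([0.2 * np.sin(1e-4 * y) + 0.05, 0.1 * np.cos(1e-4 * x)], axis=-1)

wind = np.array([3.0, -1.0])   # constant wind (m/s), a stand-in for a reanalysis product
windage = 0.03                 # fraction of the wind speed felt by floating litter

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 50_000.0, size=(100, 2))   # 100 particles in a 50 km x 50 km box (m)
start = pos.copy()

dt, n_steps = 600.0, 144       # 10-minute steps over one day
for _ in range(n_steps):
    velocity = current(pos[:, 0], pos[:, 1]) + windage * wind
    pos = pos + velocity * dt  # explicit Euler advection of every particle

drift_km = np.linalg.norm(pos - start, axis=1) / 1000.0
print(f"mean drift after one day: {drift_km.mean():.2f} km")
```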
Procedia PDF Downloads 191
11346 Experimental and Computational Investigations on the Mitigation of Air Pollutants Using Pulsed Radio Waves
Authors: Gangadhara Siva Naga Venkata Krishna Satya Narayana Swamy Undi
Abstract:
Particulate matter (PM) pollution in ambient air is a major environmental health risk factor contributing to disease and mortality worldwide. Current air pollution control methods have limitations in reducing real-world ambient PM levels. This study demonstrates the efficacy of using pulsed radio wave technology as a distinct approach to lower outdoor particulate pollution. Experimental data were compared with computational models to evaluate the efficiency of pulsed waves in coagulating and settling PM. Results showed 50%+ reductions in PM2.5 and PM10 concentrations at the city scale, with particle removal rates exceeding gravity settling by over 3X. Historical air quality data further validated the significant PM reductions achieved in test cases. Computational analyses revealed the underlying coagulation mechanisms induced by the pulsed waves, supporting the feasibility of this strategy for ambient particulate control. The pulsed electromagnetic technology displayed robustness in sustainably managing PM levels across diverse urban and industrial environments. Findings highlight the promise of this advanced approach as a next-generation solution to mitigate particulate air pollution and associated health burdens globally. The technology's scalability and energy efficiency can help address a key gap in current efforts to improve ambient air quality.Keywords: particulate matter, mitigation technologies, clean air, ambient air pollution
Procedia PDF Downloads 51
11345 Digital Design and Practice of The Problem Based Learning in College of Medicine, Qassim University, Saudi Arabia
Authors: Ahmed Elzainy, Abir El Sadik, Waleed Al Abdulmonem, Ahmad Alamro, Homaidan Al-Homaidan
Abstract:
Problem-based learning (PBL) is an educational modality which stimulates critical and creative thinking. PBL has been practiced in the College of Medicine, Qassim University, Saudi Arabia, since 2002 with offline face-to-face activities. Therefore, crucial technological changes toward paperless work were needed. The aim of the present study was to design and implement the digitalization of the PBL activities and to evaluate its impact on students' and tutors’ performance. This approach promoted the involvement of all stakeholders once they became aware of the techniques for using online tools. IT support, learning resource facilities, and the required multimedia were prepared. Students’ and staff perception surveys reflected their satisfaction with these remarkable changes. The students were interested in the new digitalized materials and educational design, which facilitated the conduct of PBL sessions and provided sufficient time for discussion and peer sharing of knowledge. It also enabled tutors to supervise and track students’ activities on the Learning Management System. It could be concluded that introducing the digitalization of PBL activities improved students’ performance and engagement and enabled a better evaluation of PBL materials as well as prompt student and staff feedback. These positive findings encouraged the college to implement the digitalization approach in other educational activities, such as Team-Based Learning, as an additional opportunity for further development. Keywords: multimedia in PBL, online PBL, problem-based learning, PBL digitalization
Procedia PDF Downloads 120
11344 Feature Based Unsupervised Intrusion Detection
Authors: Deeman Yousif Mahmood, Mohammed Abdullah Hussein
Abstract:
The goal of a network-based intrusion detection system is to classify network traffic activities into two major categories: normal and attack (intrusive) activities. Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS), using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps improve the efficiency, performance and prediction rate of the proposed approach. This paper applies the unsupervised K-means clustering algorithm with information gain (IG) for feature selection and reduction to build a network intrusion detection system. For our experimental analysis, we have used the new NSL-KDD dataset, which is a modified version of the KDDCup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a 2-class classification has been implemented (Normal, Attack). The Weka framework, a Java-based open-source software package consisting of a collection of machine learning algorithms for data mining tasks, has been used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and it takes less learning time in comparison with using the full feature set of the dataset with the same algorithm. Keywords: information gain (IG), intrusion detection system (IDS), k-means clustering, Weka
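A hedged sketch of the described pipeline, with scikit-learn's mutual information estimator standing in for information gain and a synthetic dataset standing in for NSL-KDD: rank the features, keep the top ones, cluster with k-means into two groups (normal vs. attack), and score on the held-out 40%.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import accuracy_score
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for NSL-KDD: 41 features, 2 classes (normal / attack).
X, y = make_classification(n_samples=3000, n_features=41, n_informative=8, random_state=0)
X_train, X_test = X[:1800], X[1800:]
y_train, y_test = y[:1800], y[1800:]          # 60/40 split, as in the paper

# Information-gain-style (mutual information) feature ranking on the training set.
ig = mutual_info_classif(X_train, y_train, random_state=0)
top = np.argsort(ig)[::-1][:10]               # keep the 10 most informative features

scaler = StandardScaler().fit(X_train[:, top])
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(scaler.transform(X_train[:, top]))

# Map each cluster to the majority training label, then score the held-out data.
mapping = {c: int(np.round(y_train[km.labels_ == c].mean())) for c in (0, 1)}
pred = np.array([mapping[c] for c in km.predict(scaler.transform(X_test[:, top]))])
print(f"test accuracy: {accuracy_score(y_test, pred):.3f}")
```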
Procedia PDF Downloads 296
11343 Performative Acts Exhibited in Selected Ghanaian Newspaper Headlines
Authors: Charlotte Tetebea Asiamah
Abstract:
This paper sought to highlight the use of performative acts as exhibited in a Ghanaian newspaper, the Daily Graphic. The study discusses and analyzes thirty headlines featuring performative acts captured in the months of June and July 2024. The paper draws on J. L. Austin's and J. R. Searle's theory of speech acts. Although a lot has been done in the area of performative acts, there is still a gap as far as newspaper headlines are concerned. Establishing where performative acts stand in the domain of newspaper headlines will contribute to the discussion in the literature, thereby extending the scope of discourse on performative acts. Some of the questions for this study, among others, are: Do performative acts exhibited in newspaper headlines follow felicity conditions? Are the utterances explicitly stated or otherwise? A qualitative approach was used in gathering and analyzing the data, chosen in order to gain in-depth insight. The headlines were selected using the instrument of document analysis. Out of the numerous headlines, the researcher photographed over 60 headlines, after which thirty (30) headlines were carefully selected for the study. The 30 newspaper headlines were purposively selected based on the element of performativity in them that was related to the study. The findings show that performative acts are exhibited in the Ghanaian Daily Graphic newspaper headlines. The performative acts are expressed in all five categories of performative acts as discussed by J. R. Searle, and they were seen in all the categories of newspaper headlines, be it governance or politics, social, international news, or sports. It was also observed in the data that directives were the most used performative act. The performative acts found in the newspaper headlines helped to grab readers' attention and also served as a way of influencing how readers perceive an utterance made by an individual in the headlines. Keywords: explicit, headline, illocutionary, newspaper, performative
Procedia PDF Downloads 19
11342 Augmented Reality: New Relations with the Architectural Heritage Education
Authors: Carla Maria Furuno Rimkus
Abstract:
The technologies related to virtual reality and augmented reality, in combination with mobile technologies, are becoming more consolidated and more widely used each day. Increasing technological availability, along with the decrease in acquisition and maintenance costs, has favored the expansion of their use in the field of historic heritage. In this context, this article focuses on the potential of mobile applications for the dissemination of architectural heritage using Augmented Reality technology. From this perspective, it discusses the process of producing an application for mobile devices on the Android platform, which combines geometric modeling technologies with augmented reality (AR) and access to interactive multimedia content containing cultural, social and historic information on the historic building taken as the object of study: a block with a set of buildings built in the XVIII century, known as the "Quarteirão dos Trapiches", which was modeled in 3D, coated with the original texture of its facades, and displayed in AR. The paper then discusses methodological aspects of the development of this application regarding the process and the project development tools, and presents our considerations on the methodological aspects of developing an application for the Android system focused on the dissemination of architectural heritage, in order to encourage the tourist potential of the city in a sustainable way and to contribute to the digital documentation of the city's heritage, meeting the demands of tourists visiting the city and of the professionals who work on its preservation and restoration, including architects, historians, archaeologists, and museum specialists, among others. Keywords: augmented reality, architectural heritage, geometric modeling, mobile applications
Procedia PDF Downloads 478
11341 Non-Canonical Beclin-1-Independent Autophagy and Apoptosis in Cell Death Induced by Rhus coriaria in Human Colon HT-29 Cancer Cells
Authors: Rabah Iratni, Husain El Hasasna, Khawlah Athamneh, Halima Al Sameri, Nehla Benhalilou, Asma Al Rashedi
Abstract:
Background: Cancer therapies have witnessed great advances in the recent past, however, cancer continues to be a leading cause of death, with colorectal cancer being the fourth cause of cancer-related deaths. Colorectal cancer affects both sexes equally with poor survival rate once it metastasizes. Phytochemicals, which are plant derived compounds, have been on a steady rise as anti-cancer drugs due to the accumulation of evidences that support their potential. Here, we investigated the anticancer effect of Rhus coriaria on colon cancer cells. Material and Method: Human colon cancer HT-29 cell line was used. Protein expression and protein phosphorylation were examined using Western blotting. Transcription activity was measure using Quantitative RT-PCR. Human tumoral clonogenic assay was used to assess cell survival. Senescence was assessed by the senescence-associated beta-galactosidase assay. Results: Rhus coriaria extract (RCE) was found to significantly inhibit the viability and colony growth of human HT-29 colon cancer cells. RCE induced senescence and cell cycle arrest at G1 phase. These changes were concomitant with upregulation of p21, p16, downregulation of cyclin D1, p27, c-myc and expression of Senescence-associated-β-Galactosidase activity. Moreover, RCE induced non-canonical beclin-1independent autophagy and subsequent apoptotic cell death through activation of activation caspase 8 and caspase 7. The blocking of autophagy by 3-methyladenine (3-MA) or chloroquine (CQ) reduced RCE-induced cell death. Further, RCE induced DNA damage, reduced mutant p53 protein level and downregulated phospho-AKT and phospho-mTOR, events that preceded autophagy. Mechanistically, we found that RCE inhibited the AKT and mTOR pathway, a regulator of autophagy, by promoting the proteasome-dependent degradation of both AKT and mTOR proteins. Conclusion: Our findings provide strong evidence that Rhus coriaria possesses strong anti-colon cancer activity through induction of senescence and autophagic cell death, making it a promising alternative or adjunct therapeutic candidate against colon cancer.Keywords: autophagy, proteasome degradation, senescence, mTOR, apoptosis, Beclin-1
Procedia PDF Downloads 262
11340 New Environmental Culture in Algeria: Eco Design
Authors: S. Tireche, A. Tairi abdelaziz
Abstract:
Environmental damage has increased steadily in recent decades: depletion of natural resources, destruction of the ozone layer, the greenhouse effect, degradation of the quality of life, land use, etc. New principles have emerged, such as "prevention rather than cure" or "polluter pays"; although they fall within the principles of common sense, their practical implementation still remains fragmented. Among the avenues to be explored, one of the most promising is certainly the one that focuses on product design. Indeed, where better than during the design phase can future impacts on the environment be reduced at the source? And which choices, if not design choices, have the greatest influence on the environmental characteristics of products? The approach most widely recognized at the international level is Life Cycle Assessment (LCA), subject to international standardization (ISO 14040-14043). LCA provides a scientific and objective assessment of the potential impacts of a product or service, considering its entire life cycle. This approach makes it possible to minimize impacts at the source, in a logic of pollution prevention. It is widely preferable to the curative approach, currently dominant in industrial practice, which is driven mostly by the observation of pollution. The "product" approach aims to reduce the environmental impacts of a given product, taking into account all or part of its life cycle. Emerging tools, known as eco-design tools, are intended to establish an environmental profile of the product in order to improve its environmental performance. They require a sufficient quantity of information on the product for each phase of its life cycle: raw material extraction, manufacturing, distribution, usage, end of life (recycling, incineration or landfill) and all stages of transport. The assessment results indicate the sensitive points of the product studied, the points on which the developer must act. Keywords: eco design, impact, life cycle analysis (LCA), sustainability
Procedia PDF Downloads 427
11339 The Second Generation of Tyrosine Kinase Inhibitor Afatinib Controls Inflammation by Regulating NLRP3 Inflammasome Activation
Authors: Shujun Xie, Shirong Zhang, Shenglin Ma
Abstract:
Background: Chronic inflammation might lead to many malignancies, and inadequate resolution could play a crucial role in tumor invasion, progression, and metastases. A randomised, double-blind, placebo-controlled trial shows that IL-1β inhibition with canakinumab could reduce incident lung cancer and lung cancer mortality in patients with atherosclerosis. The process and secretion of proinflammatory cytokine IL-1β are controlled by the inflammasome. Here we showed the correlation of the innate immune system and afatinib, a tyrosine kinase inhibitor targeting epidermal growth factor receptor (EGFR) in non-small cell lung cancer. Methods: Murine Bone marrow derived macrophages (BMDMs), peritoneal macrophages (PMs) and THP-1 were used to check the effect of afatinib on the activation of NLRP3 inflammasome. The assembly of NLRP3 inflammasome was check by co-immunoprecipitation of NLRP3 and apoptosis-associated speck-like protein containing CARD (ASC), disuccinimidyl suberate (DSS)-cross link of ASC. Lipopolysaccharide (LPS)-induced sepsis and Alum-induced peritonitis were conducted to confirm that afatinib could inhibit the activation of NLRP3 in vivo. Peripheral blood mononuclear cells (PBMCs) from non-small cell lung cancer (NSCLC) patients before or after taking afatinib were used to check that afatinib inhibits inflammation in NSCLC therapy. Results: Our data showed that afatinib could inhibit the secretion of IL-1β in a dose-dependent manner in macrophage. Moreover, afatinib could inhibit the maturation of IL-1β and caspase-1 without affecting the precursors of IL-1β and caspase-1. Next, we found that afatinib could block the assembly of NLRP3 inflammasome and the ASC speck by blocking the interaction of the sensor protein NLRP3 and the adaptor protein ASC. We also found that afatinib was able to alleviate the LPS-induced sepsis in vivo. Conclusion: Our study found that afatinib could inhibit the activation of NLRP3 inflammasome in macrophage, providing new evidence that afatinib could target the innate immune system to control chronic inflammation. These investigations will provide significant experimental evidence in afatinib as therapeutic drug for non-small cell lung cancer or other tumors and NLRP3-related diseases and will explore new targets for afatinib.Keywords: inflammasome, afatinib, inflammation, tyrosine kinase inhibitor
Procedia PDF Downloads 118
11338 Restructuring the College Classroom: Scaffolding Student Learning and Engagement in Higher Education
Authors: Claire Griffin
Abstract:
Recent years have witnessed a surge in the use of innovative teaching approaches to support student engagement and higher-order learning within higher education. This paper seeks to explore the use of collaborative, interactive teaching and learning strategies to support student engagement in a final year undergraduate Developmental Psychology module. In particular, the use of the jigsaw method, in-class presentations and online discussion fora were adopted in a ‘lectorial’ style teaching approach, aimed at scaffolding learning, fostering social interdependence and supporting various levels of student engagement in higher education. Using the ‘Student Course Engagement Questionnaire’, the impact of such teaching strategies on students’ college classroom experience was measured, with additional qualitative student feedback gathered. Results illustrate the positive impact of the teaching methodologies on students’ levels of engagement, with positive implications emerging across the four engagement factors: skills engagement, emotional engagement, participation/interaction engagement and performance engagement. Thematic analysis on students’ qualitative comments also provided greater insight into the positive impact of the ‘lectorial’ teaching approach on students’ classroom experience within higher level education. Implications of the findings are presented in terms of informing effective teaching practices within higher education. Additional avenues for future research and strategy usage will also be discussed, in light of evolving practice and cutting edge literature within the field.Keywords: learning, higher education, scaffolding, student engagement
Procedia PDF Downloads 378
11337 6-Degree-Of-Freedom Spacecraft Motion Planning via Model Predictive Control and Dual Quaternions
Authors: Omer Burak Iskender, Keck Voon Ling, Vincent Dubanchet, Luca Simonini
Abstract:
This paper presents a Guidance and Control (G&C) strategy to approach and synchronize with potentially rotating targets. The proposed strategy generates and tracks a safe trajectory for space servicing missions, including tasks like approaching, inspecting, and capturing. The main objective of this paper is to validate the G&C laws using a Hardware-In-the-Loop (HIL) setup with realistic rendezvous and docking equipment. Throughout this work, the assumption of full relative state feedback is relaxed by onboard sensors that introduce realistic errors and delays, while the proposed closed-loop approach demonstrates robustness to these challenges. Moreover, the G&C blocks are unified via the Model Predictive Control (MPC) paradigm, and the coupling between translational and rotational motion is addressed via a dual quaternion based kinematic description. In this work, G&C is formulated as a convex optimization problem where constraints such as thruster limits and output constraints are explicitly handled. Furthermore, the Monte-Carlo method is used to evaluate the robustness of the proposed method to initial condition errors, the uncertainty of the target's motion and attitude, and actuator errors. A capture scenario is tested with the robotic test bench, whose onboard sensors estimate the position and orientation of a drifting satellite through camera imagery. Finally, the approach is compared with currently used robust H-infinity controllers and a guidance profile provided by the industrial partner. The HIL experiments demonstrate that the proposed strategy is a potential candidate for future space servicing missions because 1) the algorithm is real-time implementable, as convex programming offers deterministic convergence properties and guarantees a finite-time solution, 2) critical physical and output constraints are respected, 3) robustness to sensor errors and uncertainties in the system is proven, and 4) it couples translational motion with rotational motion. Keywords: dual quaternion, model predictive control, real-time experimental test, rendezvous and docking, spacecraft autonomy, space servicing
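A small sketch of the dual quaternion bookkeeping that couples rotation and translation in a single kinematic object: a unit dual quaternion is built from a rotation and a translation, and two poses are composed by dual quaternion multiplication. The poses are invented for illustration, and the MPC layer itself is not shown.

```python
import numpy as np

def q_mul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def pose_to_dq(axis, angle, translation):
    """Unit dual quaternion q = q_r + eps*q_d with q_d = 0.5 * t * q_r (t a pure quaternion)."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    q_r = np.concatenate([[np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis])
    t = np.concatenate([[0.0], translation])
    return q_r, 0.5 * q_mul(t, q_r)

def dq_mul(dq1, dq2):
    """(r1 + eps d1)(r2 + eps d2) = r1 r2 + eps (r1 d2 + d1 r2)."""
    r1, d1 = dq1
    r2, d2 = dq2
    return q_mul(r1, r2), q_mul(r1, d2) + q_mul(d1, r2)

# Compose the current chaser pose with a commanded relative pose increment.
chaser = pose_to_dq(axis=[0, 0, 1], angle=np.deg2rad(10), translation=[5.0, 0.0, 0.0])
step   = pose_to_dq(axis=[0, 0, 1], angle=np.deg2rad(2),  translation=[-0.1, 0.0, 0.0])
r, d = dq_mul(chaser, step)
t = 2.0 * q_mul(d, np.array([r[0], -r[1], -r[2], -r[3]]))   # translation = 2 * q_d * conj(q_r)
print("composed rotation quaternion:", r.round(4))
print("composed translation:", t[1:].round(4))
```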
Procedia PDF Downloads 146