Search results for: hybrid PSO-GA algorithm and mutual information
13020 Parametric Appraisal of Robotic Arc Welding of Mild Steel Material by Principal Component Analysis-Fuzzy with Taguchi Technique
Authors: Amruta Rout, Golak Bihari Mahanta, Gunji Bala Murali, Bibhuti Bhusan Biswal, B. B. V. L. Deepak
Abstract:
The use of industrial robots for performing welding operations is one of the chief signs of contemporary welding. Modeling the weld joint parameters and weld process parameters is one of the most crucial aspects of robotic welding. As the weld process parameters affect the weld joint parameters differently, a multi-objective optimization technique has to be utilized to obtain an optimal setting of the weld process parameters. In this paper, a hybrid optimization technique, i.e., Principal Component Analysis (PCA) combined with fuzzy logic, is proposed to obtain optimal settings of weld process parameters such as wire feed rate, welding current, gas flow rate, welding speed, and nozzle tip-to-plate distance. The weld joint parameters considered for optimization are the depth of penetration, yield strength, and ultimate strength. PCA is a very efficient multi-objective technique for converting correlated and dependent responses, such as the weld joint parameters, into uncorrelated and independent variables. In this approach, there is also no need to check the correlation among responses, as no individual weight is assigned to any response. A fuzzy inference engine can efficiently incorporate these aspects into its internal hierarchy, thereby overcoming various limitations of existing optimization approaches. Finally, the Taguchi method is used to obtain the optimal setting of the weld process parameters. It is concluded that the hybrid technique has its own advantages and can be used for quality improvement in industrial applications.
Keywords: robotic arc welding, weld process parameters, weld joint parameters, principal component analysis, fuzzy logic, Taguchi method
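As a minimal illustration of the PCA step described above, the Python sketch below standardizes three correlated weld-joint responses (depth of penetration, yield strength, ultimate strength) and converts them into uncorrelated principal-component scores. The trial values are invented placeholders, not the paper's data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Each row is one Taguchi trial; columns are the three correlated responses:
# depth of penetration (mm), yield strength (MPa), ultimate strength (MPa).
responses = np.array([
    [4.2, 260.0, 340.0],
    [4.8, 272.0, 355.0],
    [3.9, 255.0, 331.0],
    [5.1, 280.0, 362.0],
])

# Standardize, then decorrelate the responses into independent components.
scores = PCA().fit_transform(StandardScaler().fit_transform(responses))
print(scores)  # uncorrelated component scores, one row per trial
```

The component scores can then be aggregated into a single performance index (the role the fuzzy inference step plays in the abstract) before the Taguchi analysis is applied.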
Procedia PDF Downloads 179
13019 Collective Redress in Consumer Protection in South East Europe: Cross-National Comparisons, Issues of Commonality and Difference
Authors: Veronika Efremova
Abstract:
In recent decades, there have been significant developments in the European Union in the field of collective consumer redress. The South East European (SEE) countries covered by this paper, in line with their EU accession priorities and their duties under the Stabilisation and Association Agreements, have to harmonize their national laws with the relevant EU acquis for consumer protection (Chapter 28: Health and Consumer). In these countries, only minimal compliance has been achieved. SEE countries have introduced rudimentary collective redress mechanisms, with modest enforcement of collective redress and little case law. This paper is based on comprehensive interdisciplinary research conducted for the SEE countries on common principles for injunctive and compensatory collective redress mechanisms, emphasizing cross-national comparisons, underlining issues of commonality and difference, and aiming to develop recommendations for adequate enforcement of collective redress. SEE countries are characterized by a sectoral approach to regulating collective redress, contrary to the majority of EU Member States, which have adopted a horizontal approach. In most SEE countries, the laws recognize only injunctive, not compensatory, collective redress in consumer protection. All stakeholders responsible for the implementation of collective redress in SEE countries lack information and awareness of collective redress mechanisms and the way they function in practice. Therefore, specific actions are needed in these countries to make the whole system of collective redress for consumer protection operational and efficient. Taking into consideration the various designated stakeholders in collective redress in each SEE country, their mutual coordination and cooperation are needed in order to develop consumer protection systems and policies. By putting the national collective redress mechanisms into practice, effective access to justice for all consumers and the principle of the rule of law will be secured, and appropriate procedural guarantees to avoid abusive litigation will be ensured.
Keywords: collective redress mechanism, consumer protection, commonality and difference, South East Europe
Procedia PDF Downloads 220
13018 How Markets React to Corporate Disclosure: An Analysis Using a SEM Model
Authors: Helena Susana Afonso Alves, Natália Maria Rafael Canadas, Ana Maria Rodrigues
Abstract:
We examined the impact of governance rules on information asymmetry, using the turnover ratio and the bid-ask spread as proxies for information asymmetry. We used a SEM model and analyzed the indirect relations through the voluntary disclosure of information and organizational performance. We built a voluntary disclosure index based on the information firms provided in their annual reports and divided the governance characteristics into two constructs: directors’ and supervisors’ structures, and ownership structure. We concluded that the ownership structure exerts a direct influence on share price and share liquidity, whereas the directors’ and supervisors’ structures exert an indirect influence through organizational performance and the voluntary disclosure of information. The results also show that for firms with high levels of disclosure, the bid-ask spread is lower. However, in firms with a high ownership concentration, investors tend to face wider bid-ask spreads and trade less, which, in this case, reduces the liquidity of the stock. The failure to find a relationship between voluntary disclosure of information and the turnover ratio shows that the liquidity of shares is related more to the degree of shareholder concentration, and to the performance of the companies, than to access to information. Moreover, it is clear that the role information disclosure plays is mainly at the level of price formation.
Keywords: corporate governance, information asymmetry, voluntary disclosure, structural equation modelling, SEM
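For readers who want to see the shape of such a model, here is a hedged sketch using the semopy package; the variable names, path specification, and data file are illustrative assumptions, not the authors' actual measurement model or data.

```python
import pandas as pd
from semopy import Model

# Hypothetical structural part: governance constructs drive disclosure and
# performance, which in turn drive the information-asymmetry proxy.
DESC = """
disclosure ~ board_structure + ownership
performance ~ board_structure
bid_ask_spread ~ disclosure + performance + ownership
"""

data = pd.read_csv("governance_panel.csv")  # hypothetical dataset
model = Model(DESC)
model.fit(data)
print(model.inspect())  # path coefficients, standard errors, p-values
```

The indirect effect of board structure on the spread would then be read off as the product of the relevant path coefficients.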
Procedia PDF Downloads 516
13017 Clinical Advice Services: Using Lean Chassis to Optimize Nurse-Driven Telephonic Triage of After-Hour Calls from Patients
Authors: Eric Lee G. Escobedo-Wu, Nidhi Rohatgi, Fouzel Dhebar
Abstract:
It is challenging for patients to navigate through healthcare systems after hours. This leads to delays in care, patient/provider dissatisfaction, inappropriate resource utilization, readmissions, and higher costs. It is important to provide patients and providers with effective clinical decision-making tools that allow seamless connectivity and coordinated care. In August 2015, patient-centric Stanford Health Care established Clinical Advice Services (CAS) to provide clinical decision support after hours. CAS is founded on key Lean principles: value stream mapping, empathy mapping, waste walks, takt time calculations, standard work, plan-do-check-act cycles, and active daily management. At CAS, clinical assistants take the initial call and manage all non-clinical calls (e.g., appointments, directions, general information). If the patient has a clinical symptom, the CAS nurses take the call and utilize standardized clinical algorithms to triage the patient to home, clinic, urgent care, the emergency department, or 911. Nurses may also contact the on-call physician, based on the clinical algorithm, for further direction and consultation. Since August 2015, CAS has managed 228,990 calls from 26 clinical specialties. Reporting is built into the electronic health record for analysis and data collection. 65.3% of the after-hours calls are clinically related. The average clinical algorithm adherence rate has been 92%. An average of 9% of calls were escalated by CAS nurses to the physician on call, and an average of 5% of patients were triaged to the emergency department by CAS. Key learnings indicate that a seamless connectivity vision, cascading, multidisciplinary ownership of the problem, and synergistic enterprise improvements have contributed to this success while striving for continuous improvement.
Keywords: after-hours phone calls, clinical advice services, nurse triage, Stanford Health Care
Procedia PDF Downloads 174
13016 Scheduling Algorithm Based on Load-Aware Queue Partitioning in Heterogeneous Multi-Core Systems
Authors: Hong Kai, Zhong Jun Jie, Chen Lin Qi, Wang Chen Guang
Abstract:
Current scheduling algorithms suffer from inefficient global scheduling parallelism and from local scheduling parallelism that is prone to processor starvation. Regarding this issue, this paper proposes a load-aware queue partitioning scheduling strategy: queues are first allocated according to the number of processor cores, a load factor is calculated to specify the capacity of each load queue, and waiting nodes are assigned to the appropriate perceptual queues based on their precursor nodes and the communication-computation overhead. At the same time, real-time computation of the load factor effectively prevents any processor from being starved for a long time. Experimental comparison with two classical algorithms shows a measurable improvement in both performance metrics: scheduling length and task speedup ratio.
Keywords: load-aware, scheduling algorithm, perceptual queue, heterogeneous multi-core
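A toy sketch of the queue-allocation idea follows (Python). The least-loaded placement rule and the load-factor definition are simplifying assumptions for illustration, not the paper's exact formulas.

```python
import heapq

N_CORES = 4
tasks = [("t%d" % i, cost) for i, cost in enumerate([7, 3, 9, 2, 5, 8, 4, 6])]

# One perceptual queue per core, tracked as (accumulated load, core id).
queues = {c: [] for c in range(N_CORES)}
loads = [(0.0, c) for c in range(N_CORES)]
heapq.heapify(loads)

for name, cost in tasks:
    load, core = heapq.heappop(loads)     # least-loaded queue first
    queues[core].append(name)
    heapq.heappush(loads, (load + cost, core))

total = sum(load for load, _ in loads)
for load, core in sorted(loads, key=lambda x: x[1]):
    # load factor: this queue's share of the total load, recomputed per step
    print(core, queues[core], round(load / total, 2))
```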
Procedia PDF Downloads 145
13015 Constrained RGBD SLAM with a Prior Knowledge of the Environment
Authors: Kathia Melbouci, Sylvie Naudet Collette, Vincent Gay-Bellile, Omar Ait-Aider, Michel Dhome
Abstract:
In this paper, we handle the problem of real-time localization and mapping in an indoor environment assisted by a partial prior 3D model, using an RGBD sensor. The proposed solution relies on a feature-based RGBD SLAM algorithm to localize the camera and update the 3D map of the scene. To improve the accuracy and robustness of the localization, we propose to combine, in a local bundle adjustment process, geometric information provided by a prior coarse 3D model of the scene (e.g., generated from the 2D floor plan of the building) with RGBD data from a Kinect camera. The proposed approach is evaluated on a public benchmark dataset as well as on a real scene acquired by a Kinect sensor.
Keywords: SLAM, global localization, 3D sensor, bundle adjustment, 3D model
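The constrained adjustment can be illustrated in a heavily simplified form: the usual reprojection error is augmented with a point-to-plane term that pulls reconstructed points onto a plane of the coarse prior model. The intrinsics, the plane, the weight, and all point data below are assumptions, and only the 3D points are refined here.

```python
import numpy as np
from scipy.optimize import least_squares

K = np.array([[525.0, 0, 319.5], [0, 525.0, 239.5], [0, 0, 1]])  # Kinect-like intrinsics
plane = np.array([0.0, 0.0, 1.0, -3.0])        # prior wall plane: z = 3 m
pts = np.array([[0.2, 0.1, 2.9], [0.5, -0.3, 3.1], [-0.4, 0.2, 3.0]])
obs = (K @ pts.T).T
obs = obs[:, :2] / obs[:, 2:3]                 # synthetic pixel observations

def residuals(x, w_prior=0.5):
    p = pts + x.reshape(-1, 3)                 # refined 3D points
    proj = (K @ p.T).T
    reproj = (proj[:, :2] / proj[:, 2:3] - obs).ravel()
    d = p @ plane[:3] + plane[3]               # signed point-to-plane distances
    return np.concatenate([reproj, w_prior * d])

sol = least_squares(residuals, np.zeros(pts.size))
print(sol.x.reshape(-1, 3))  # corrections balancing both error terms
```

In the full system the camera poses are refined jointly with the points; the weight `w_prior` balances trust in the prior model against the RGBD measurements.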
Procedia PDF Downloads 414
13014 Information Extraction Based on Search Engine Results
Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk
Abstract:
Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts returned by search engines into useful information, which represents a new method of data gathering compared with traditional methods. Submitting queries for a large number of keywords by hand takes considerable time and effort, so we developed a user-interface program that searches automatically, taking multiple keywords at the same time, and left this program to collect the wanted data on its own. The collected raw data is then processed using mathematical and statistical methods to eliminate unwanted data and convert the remainder into usable data.
Keywords: search engines, information extraction, agent system
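A sketch of the multi-keyword batch-query loop the abstract describes is shown below; the endpoint URL and the response field are placeholders, not a real search-engine API.

```python
import requests

KEYWORDS = ["hybrid PSO-GA", "mutual information", "feature selection"]
SEARCH_URL = "https://example-search.test/api"   # hypothetical endpoint

counts = {}
for kw in KEYWORDS:
    r = requests.get(SEARCH_URL, params={"q": kw}, timeout=10)
    counts[kw] = r.json().get("total_results", 0)  # assumed response field

print(counts)  # raw hit counts, to be cleaned statistically afterwards
```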
Procedia PDF Downloads 430
13013 An Integrated Approach for Optimal Selection of Machining Parameters in Laser Micro-Machining Process
Authors: A. Gopala Krishna, M. Lakshmi Chaitanya, V. Kalyana Manohar
Abstract:
In the present analysis, laser micro-machining (LMM) of a silicon carbide (SiCp) reinforced Aluminum 7075 metal matrix composite (Al7075/SiCp MMC) was studied. Because of the intense heat generated while machining, a layer called the recast layer forms on the workpiece surface, and this layer is detrimental to the surface quality of the component. The recast layer needs to be as small as possible for precision applications. Therefore, the height of the recast layer and the depth of the groove, which are conflicting in nature, were considered as the significant manufacturing criteria that determine the merit of the machining process in LMM of the Al7075/10%SiCp composite. The present work formulates the depth of groove and the height of the recast layer in relation to the machining parameters using Response Surface Methodology (RSM), and the formulated mathematical models were correspondingly put to use for optimization. Since the effects of the machining parameters on the depth of groove and the height of the recast layer were contradictory, the problem was cast as a multi-objective optimization problem, and an evolutionary non-dominated sorting genetic algorithm (NSGA-II) was employed to optimize the models established by RSM. This algorithm was also used to obtain the Pareto-optimal set of solutions, which provides a detailed basis for selecting the optimal settings. Eventually, experiments were conducted to confirm the results obtained from RSM and NSGA-II.
Keywords: laser micro machining (LMM), depth of groove, height of recast layer, response surface methodology (RSM), non-dominated sorting genetic algorithm
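As an illustration of the RSM-plus-NSGA-II pipeline, the sketch below optimizes two quadratic response-surface stand-ins with pymoo's NSGA-II; the coefficients and the choice of three scaled machining variables are invented placeholders, not the paper's fitted models.

```python
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class LMMProblem(ElementwiseProblem):
    # x = three machining parameters, scaled to [0, 1]
    def __init__(self):
        super().__init__(n_var=3, n_obj=2, xl=0.0, xu=1.0)

    def _evaluate(self, x, out, *args, **kwargs):
        # Placeholder quadratic RSM models for the two responses
        depth  = 40 + 25*x[0] - 12*x[1] + 6*x[0]*x[1] - 8*x[1]**2
        recast = 5 + 9*x[0] - 4*x[2] + 3*x[0]**2
        out["F"] = [-depth, recast]   # maximize depth, minimize recast height

res = minimize(LMMProblem(), NSGA2(pop_size=50), ("n_gen", 100), seed=1)
print(res.F)  # Pareto front: (-depth of groove, height of recast layer)
```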
Procedia PDF Downloads 345
13012 LiTa2PO8-based Composite Solid Polymer Electrolytes for High-Voltage Cathodes in Lithium-Metal Batteries
Authors: Kumlachew Zelalem Walle, Chun-Chen Yang
Abstract:
Solid-state lithium metal batteries (SSLMBs) that contain polymer and ceramic solid electrolytes have received considerable attention as an alternative for substituting liquid electrolytes in lithium metal batteries (LMBs), offering high safety, excellent energy storage performance, and stability at elevated temperatures. Here, a novel fast Li-ion conducting material, LiTa₂PO₈ (LTPO), was synthesized, and the electrochemical performance of the as-prepared powder and of an LTPO-incorporated composite solid polymer electrolyte (LTPO-CPE) membrane was investigated. The as-prepared LTPO powder was homogeneously dispersed in polymer matrices, and a hybrid solid electrolyte membrane was synthesized via a simple solution-casting method. The room-temperature total ionic conductivities (σt) of the LTPO pellet and the LTPO-CPE membrane were 0.14 and 0.57 mS cm⁻¹, respectively. A coin battery with an NCM811 cathode was cycled at 1C between 2.8 and 4.5 V at room temperature, achieving a Coulombic efficiency of 99.3% with capacity retention of 74.1% after 300 cycles. Similarly, an LFP cathode also delivered excellent performance at 0.5C, with an average Coulombic efficiency of 100% and virtually no capacity loss (maximum specific capacity of 138 mAh g⁻¹ at the 27th cycle and 131.3 mAh g⁻¹ at the 500th). These results demonstrate the feasibility of the high Li-ion conductor LTPO as a filler, and the developed polymer/ceramic hybrid electrolyte has the potential to be a high-performance electrolyte for high-voltage cathodes, which may provide a fresh platform for developing more advanced solid-state electrolytes.
Keywords: Li-ion conductor, lithium-metal batteries, composite solid electrolytes, LiTa2PO8, high-voltage cathode
Procedia PDF Downloads 66
13011 Soil Parameters Identification around PMT Test by Inverse Analysis
Authors: I. Toumi, Y. Abed, A. Bouafia
Abstract:
This paper presents a methodology for identifying cohesive soil parameters that takes different constitutive equations into account. The procedure, applied to identify the parameters of the generalized Prager model associated with the Drucker-Prager failure criterion from a pressuremeter expansion curve, is based on an inverse analysis approach, which consists of minimizing a function representing the difference between the experimental curve and the simulated curve using a simplex algorithm. The model response on the pressuremeter path and its identification from experimental data lead to the determination of the friction angle, the cohesion, and the Young modulus. The effects of some parameters on the simulated curves and on the stress paths around the pressuremeter probe are presented. Comparisons between the parameters determined with the proposed method and those obtained by other means are also presented.
Keywords: cohesive soils, cavity expansion, pressuremeter test, finite element method, optimization procedure, simplex algorithm
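The inverse-analysis loop can be sketched with SciPy's Nelder-Mead simplex, as below; the closed-form "forward model" is a crude stand-in for the finite-element cavity-expansion simulation, and all curve values are invented.

```python
import numpy as np
from scipy.optimize import minimize

strain = np.linspace(0.001, 0.05, 20)
p_exp = 80 * np.log1p(400 * strain) + 30   # hypothetical measured curve (kPa)

def forward(E, c, phi):
    # stand-in for the finite-element pressuremeter simulation
    return E * strain / (1 + E * strain / (c + 60 * np.tan(np.radians(phi))))

def misfit(x):
    E, c, phi = x
    return np.sum((forward(E, c, phi) - p_exp) ** 2)

res = minimize(misfit, x0=[5000.0, 20.0, 25.0], method="Nelder-Mead")
print(res.x)  # identified (Young modulus kPa, cohesion kPa, friction angle deg)
```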
Procedia PDF Downloads 294
13010 Identification Algorithm of Critical Interface, Modelling Perils on Critical Infrastructure Subjects
Authors: Jiří. J. Urbánek, Hana Malachová, Josef Krahulec, Jitka Johanidisová
Abstract:
The paper deals with the investigation and modelling of crisis situations within organizations of critical infrastructure. Every crisis situation originates in the occurrence of an emergency event, especially in organizations of the energy critical infrastructure. Emergency events can be either expected, in which case crisis scenarios can be pre-prepared by the pertinent organizational crisis management authorities for coping with them, or unexpected (the Black Swan effect), without a pre-prepared scenario, requiring operational coping with the crisis situation. The forms, characteristics, behaviour, and utilization of crisis scenarios vary in quality, depending on the prevention and training processes of the real critical infrastructure organization; the aim is always better organizational security and continuity. The objective of this paper is to find and investigate critical/crisis zones and functions in models of critical situations in critical infrastructure organizations. The DYVELOP (Dynamic Vector Logistics of Processes) method is able to identify problematic critical zones and functions, displaying critical interfaces among the actors of crisis situations on DYVELOP maps called Blazons. To realize this ability, it is first necessary to derive and create an identification algorithm for critical interfaces. The locations of the critical interfaces are the flags of a crisis situation in a real critical infrastructure organization. In conclusion, the model of a critical interface is demonstrated for a real Czech energy critical infrastructure organization in a blackout peril environment. The Blazons require a live PowerPoint presentation for better comprehension of this paper's mission.
Keywords: algorithm, crisis, DYVELOP, infrastructure
Procedia PDF Downloads 409
13009 Knowledge Discovery and Data Mining Techniques in Textile Industry
Authors: Filiz Ersoz, Taner Ersoz, Erkin Guler
Abstract:
This paper addresses issues and techniques for the textile industry using data mining. Data mining was applied to garment-stitching data obtained from a textile company, using the CHAID algorithm, the CART algorithm, regression analysis, and artificial neural networks. Classification-based analyses were used, and a decision model for production per person, along with the variables affecting production, was derived by this method. The results show that as daily working time increases, production per person decreases: the relationship between total daily working time and production per person is negative, and it is the strongest relationship observed.
Keywords: data mining, textile production, decision trees, classification
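The sketch below illustrates the tree-based analysis with scikit-learn's CART-style regressor (standing in for the CHAID/CART tools named above); the data are generated to mirror the reported negative relationship between daily working time and production per person.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
hours = rng.uniform(6, 12, 200)                        # daily working time
production = 120 - 6 * hours + rng.normal(0, 4, 200)   # per-person output

tree = DecisionTreeRegressor(max_depth=2).fit(hours.reshape(-1, 1), production)
print(export_text(tree, feature_names=["daily_hours"]))
# The printed splits show output falling as daily_hours rises.
```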
Procedia PDF Downloads 349
13008 An Efficient Algorithm for Solving the Transmission Network Expansion Planning Problem Integrating Machine Learning with Mathematical Decomposition
Authors: Pablo Oteiza, Ricardo Alvarez, Mehrdad Pirnia, Fuat Can
Abstract:
To effectively combat climate change, many countries around the world have committed to the decarbonisation of their electricity systems, along with promoting the large-scale integration of renewable energy sources (RES). While this trend represents a unique opportunity to effectively combat climate change, achieving a sound and cost-efficient energy transition towards low-carbon power systems poses significant challenges for the multi-year Transmission Network Expansion Planning (TNEP) problem. The objective of the multi-year TNEP is to determine the necessary network infrastructure to supply the projected demand in a cost-efficient way, considering the evolution of the new generation mix, including the integration of RES. The rapid integration of large-scale RES increases the variability and uncertainty in power system operation, which in turn increases short-term flexibility requirements. To meet these requirements, flexible generating technologies such as energy storage systems must also be considered within the TNEP, along with proper models for capturing the operational challenges of future power systems. As a consequence, TNEP formulations are becoming more complex and difficult to solve, especially for application to realistic-sized power system models. To meet these challenges, there is an increasing need for efficient algorithms capable of solving the TNEP problem within reasonable computational time and resources. In this regard, a promising research area is the use of artificial intelligence (AI) techniques for solving large-scale mixed-integer optimization problems such as the TNEP. In particular, the use of AI along with mathematical optimization strategies based on decomposition has shown great potential. In this context, this paper presents an efficient algorithm for solving the multi-year TNEP problem. The algorithm combines AI techniques with Column Generation, a traditional decomposition-based mathematical optimization method. One of the challenges of using Column Generation for solving the TNEP problem is that the subproblems are of a mixed-integer nature, and therefore solving them requires significant amounts of time and resources. Hence, in this proposal we solve a linearly relaxed version of the subproblems and train a binary classifier that determines the values of the binary variables based on the results obtained from the linearized version. A key feature of the proposal is that the binary classifier is integrated into the optimization algorithm in such a way that the optimality of the solution can be guaranteed. The results of a case study based on the HRP 38-bus test system show that the binary classifier has an accuracy above 97% in estimating the values of the binary variables. Since the linearly relaxed version of the subproblems can be solved in significantly less time than its integer programming counterpart, integrating the binary classifier into the Column Generation algorithm allowed us to reduce the computational time required for solving the problem by 50%. The final version of this paper will contain a detailed description of the proposed algorithm, the AI-based binary classifier technique, and its integration into the CG algorithm. To demonstrate the capabilities of the proposal, we evaluate the algorithm in case studies with different scenarios, as well as on other power system models.
Keywords: integer optimization, machine learning, mathematical decomposition, transmission planning
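The learning step can be sketched as follows: a classifier maps features of the LP-relaxed subproblem solution to the 0/1 values of the binary variables. The feature set, labels, and model choice below are placeholders; the paper's key safeguard, verifying optimality inside the Column Generation loop, is only indicated by the final comment.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# e.g. columns: relaxed variable value, reduced cost, line loading, margin
X_train = rng.random((1000, 4))
y_train = (X_train[:, 0] + 0.1 * X_train[:, 2] > 0.55).astype(int)  # stand-in MIP labels

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

x_relaxed = rng.random((5, 4))      # features from a new relaxed subproblem
x_binary = clf.predict(x_relaxed)   # proposed 0/1 investment decisions
print(x_binary)  # these would be verified for optimality within the CG loop
```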
Procedia PDF Downloads 85
13007 Modern Hybrid of Older Black Female Stereotypes in Hollywood Film
Authors: Frederick W. Gooding, Jr., Mark Beeman
Abstract:
Nearly a century ago, the groundbreaking 1915 film ‘The Birth of a Nation’ popularized the way Hollywood made movies with its avant-garde, feature-length style. The movie's subjugating and demeaning depictions of African American women (and men) reflected popular racist beliefs held during the time of slavery and the early Jim Crow era. Although much has changed concerning race relations in the past century, American sociologist Patricia Hill Collins theorizes that the disparaging images of African American women originating in the era of plantation slavery are adaptable and endure as controlling images today. In this context, a comparative analysis of the successful contemporary film ‘Bringing Down the House,’ starring Queen Latifah, is relevant, as this 2004 film was designed purposely to defy and ridicule classic stereotypes of African American women. However, the film is still tied to the controlling images of the past, although in a modern hybrid form. Scholars of race and film have noted that the pervasive filmic imagery of the African American woman as the loyal mammy stereotype faded from the screen in the post-civil rights era in favor of more sexualized characters (i.e., the Jezebel trope). When scenes and dialogue are analyzed through the lens of sociological and critical race theory, the troubling persistence of African American controlling images in film stubbornly emerges in a movie like ‘Bringing Down the House.’ Thus, these controlling images, like racism itself, can adapt to new social and economic conditions. Although the classic controlling images appeared in the first feature-length film focusing on race relations a century ago, ‘The Birth of a Nation,’ this black-and-white rendition of the mammy figure was later updated in 1939 with the classic hit ‘Gone with the Wind’ in living color. These popular controlling images have loomed quite large in the minds of international audiences: ‘Gone with the Wind’ is still shown in American theaters today, and experts at the British Film Institute in 2004 rated ‘Gone with the Wind’ as the number one movie of all time in UK movie history based upon the total number of actual viewings. Critical analysis of character patterns demonstrates that images that appear superficially benign contribute to a broader and quite persistent pattern of marginalization in the aggregate. This approach allows experts and viewers alike to detect more subtle and sophisticated strands of racial discrimination that are ‘hidden in plain sight,’ despite numerous changes in a Hollywood industry that appears more voluminous and diverse than three or four decades ago. In contrast to white characters, non-white or minority characters are likely to be subtly compromised or marginalized relative to white characters if and when seen in mainstream movies, rather than be subjected to obvious and offensive racist tropes. The hybrid form of both the older Jezebel and Mammy stereotypes exhibited by lead actress Queen Latifah in ‘Bringing Down the House’ represents a more suave and sophisticated merging of imagery deemed problematic in the past as well as the present.
Keywords: African Americans, Hollywood film, hybrid, stereotypes
Procedia PDF Downloads 177
13006 Governance in the Age of Artificial Intelligence and E-Government
Authors: Mernoosh Abouzari, Shahrokh Sahraei
Abstract:
Electronic government is a way for governments to use new technology to provide people with convenient access to government information and services, to improve the quality of services, and to offer broad opportunities to participate in democratic processes and institutions. It makes it possible to use information technology to deliver government services to customers around the clock, which increases people's satisfaction and their participation in political and economic activities. The expansion of e-government services, and their movement towards intelligence, can re-establish the relationship between the government and citizens and among the elements and components of government. Electronic government is the result of the use of information and communication technology (ICT); implementing it at the government level creates tremendous changes in the efficiency and effectiveness of government systems and in the way services are provided, bringing widespread public satisfaction. The mainstream of electronic government services has now become tangible with the presence of artificial intelligence systems, and recent advances in artificial intelligence represent a revolution in the use of machines to support predictive decision-making and the classification of data. With deep learning tools, artificial intelligence can bring a significant improvement in the delivery of services to citizens and uplift the work of public service professionals, while also inspiring a new generation of technocrats to enter government. This smart revolution may set aside some functions of the government and change its components, and concepts such as governance, policymaking, and democracy will change in the face of artificial intelligence technology; the top-down position in governance may face serious changes. If governments delay in using artificial intelligence, the balance of power will change, and private companies, as pioneers in this field, will monopolize everything; the world order will come to depend on rich multinational companies, and algorithmic systems will in effect become the ruling systems of the world. It can be said that the current revolution in information technology and biotechnology has been started by engineers, large economic companies, and scientists who are rarely aware of the political complexities of their decisions and certainly do not represent anyone. Therefore, it seems that if liberalism, nationalism, or any other ideology wants to organize the world of 2050, it should not only rationalize the concepts of artificial intelligence and complex data algorithms but also weave them into a new and meaningful narrative. The changes caused by artificial intelligence in the political and economic order will therefore lead to a major change in the way all countries deal with the phenomenon of digital globalization. In this paper, while discussing the role and performance of e-government, we examine the efficiency and application of artificial intelligence in e-government and consider the resulting developments in the new world and in the concepts of governance.
Keywords: electronic government, artificial intelligence, information and communication technology, system
Procedia PDF Downloads 94
13005 Context-Aware Recommender System Using Collaborative Filtering, Content-Based Algorithm and Fuzzy Rules
Authors: Xochilt Ramirez-Garcia, Mario Garcia-Valdez
Abstract:
Contextual recommendations are implemented in recommender systems to improve user satisfaction: the recommender system makes accurate and suitable recommendations for a particular situation, achieving personalized recommendations. The context provides information relevant to the recommender system and is used as a filter for selecting items relevant to the user. This paper presents a context-aware recommender system that uses techniques based on collaborative filtering and content-based methods, as well as fuzzy rules, to recommend items within a context. The dataset used to test the system is from TripAdvisor. The accuracy of the recommendations was evaluated with the Mean Absolute Error.
Keywords: algorithms, collaborative filtering, intelligent systems, fuzzy logic, recommender systems
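A toy sketch of the blending idea: a collaborative-filtering score and a content-based score are combined, and a simple triangular fuzzy rule shifts weight toward the content score when the contextual match is strong. All scores, membership parameters, and weights are invented.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def hybrid_score(cf_score, cb_score, context_match):
    # Fuzzy rule: IF context match is HIGH THEN favour the content score.
    high = tri(context_match, 0.4, 1.0, 1.6)   # "high match" membership
    w = 0.3 + 0.4 * high                       # content weight in [0.3, 0.7]
    return w * cb_score + (1 - w) * cf_score

print(hybrid_score(cf_score=3.8, cb_score=4.5, context_match=0.9))
# Accuracy would then be measured with MAE against held-out ratings.
```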
Procedia PDF Downloads 422
13004 A Study on Information Structure in the Vajrachedika-Prajna-paramita Sutra and Translation Aspect
Authors: Yoon-Cheol Park
Abstract:
This research focuses on examining the information structures in the old Chinese character-Korean translation of the Vajrachedika-prajna-paramita sutra. The background of this research is the fact that no previous research has looked into the information structures in the target text of the Vajrachedika-prajna-paramita sutra until now. The existing research on Buddhist scripture translation mainly puts weight on message conveyance through literal and semantic translation methods. But message conveyance from one language to another needs to be delivered with an equivalent information structure. Thus, this research is intended to investigate the flow of old and new information in the target text of this Buddhist scripture, compared with the source text. Unlike other Buddhist scriptures, the Vajrachedika-prajna-paramita sutra is composed of conversational structures between Buddha and his disciple, Suboli. This implies that the information flow can be changed by utterance context and certain propositions. So, this research tries to analyze the flow of old and new information within the source and target texts. As a result of the analysis, this research discovers the following facts: firstly, there are differences in the information flow in message conveyance between old Chinese characters and Korean due to language features. The old Chinese character text develops an old-new information flow, while Korean exhibits a new-old information flow because of word order. Secondly, the source text of the Vajrachedika-prajna-paramita sutra includes abstruse terminology, jargon, and abstract words. These influence the target text and cause changes in the information flow, but the repetitive expression of these words provides the old information in the target text. Lastly, the Vajrachedika-prajna-paramita sutra offers an expository structure arising from the conversations between Buddha and Suboli. This means that the information flow is developed by explaining specific subjects and by paraphrasing unfamiliar phrases and expressions. From the results of the analysis above, this research verifies that the information structures in the target text of the Vajrachedika-prajna-paramita sutra are changed by specific subjects and terminology, are developed with a new-old information flow by repetitive expressions or word order, and reveal information structures familiar to the target culture. It also implies that the translation of the Vajrachedika-prajna-paramita sutra as a religious book needs message conveyance that takes the information structures of the two languages into account.
Keywords: abstruse terminologies, information structure, new and old information, old Chinese character-Korean translation
Procedia PDF Downloads 368
13003 A Distinct Method Based on Mamba-Unet for Brain Tumor Image Segmentation
Authors: Djallel Bouamama, Yasser R. Haddadi
Abstract:
Accurate brain tumor segmentation is crucial for diagnosis and treatment planning, yet it remains a challenging task due to the variability in tumor shapes and intensities. This paper introduces a distinct approach to brain tumor image segmentation by leveraging an advanced architecture known as Mamba-Unet. Building on the well-established U-Net framework, Mamba-Unet incorporates distinct design enhancements to improve segmentation performance. Our proposed method integrates a multi-scale attention mechanism and a hybrid loss function to effectively capture fine-grained details and contextual information in brain MRI scans. Using a comprehensive dataset of annotated brain MRI scans, we demonstrate that Mamba-Unet significantly enhances segmentation accuracy compared to conventional U-Net models. Quantitative evaluations reveal that Mamba-Unet surpasses traditional U-Net architectures and other contemporary segmentation models in terms of Dice coefficient, sensitivity, and specificity. The improvements are attributed to the method's ability to better manage class imbalance and resolve complex tumor boundaries. This work advances the state of the art in brain tumor segmentation and holds promise for improving clinical workflows and patient outcomes through more precise and reliable tumor detection.
Keywords: brain tumor classification, image segmentation, CNN, U-NET
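The "hybrid loss" can be illustrated as soft Dice combined with binary cross-entropy, a common pairing for class-imbalanced tumor masks; the mixing weight and tensor shapes below are assumptions, since the paper's exact loss is not reproduced here.

```python
import torch
import torch.nn.functional as F

def hybrid_loss(logits, target, alpha=0.5, eps=1e-6):
    # logits: (N, 1, H, W) raw scores; target: (N, 1, H, W) binary mask
    prob = torch.sigmoid(logits)
    inter = (prob * target).sum()
    dice = 1 - (2 * inter + eps) / (prob.sum() + target.sum() + eps)  # soft Dice
    bce = F.binary_cross_entropy_with_logits(logits, target)
    return alpha * dice + (1 - alpha) * bce

logits = torch.randn(2, 1, 64, 64)
target = (torch.rand(2, 1, 64, 64) > 0.7).float()
print(hybrid_loss(logits, target))
```

The Dice term directly rewards overlap with the (small) tumor region, which is what helps against class imbalance; the cross-entropy term keeps gradients well behaved per pixel.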
Procedia PDF Downloads 34
13002 Identification of Nonlinear Systems Using Radial Basis Function Neural Network
Authors: C. Pislaru, A. Shebani
Abstract:
This paper uses the radial basis function neural network (RBFNN) for system identification of nonlinear systems. Five nonlinear systems are used to examine the performance of the RBFNN in modeling nonlinear systems: a dual-tank system, a single-tank system, a DC motor system, and two academic models. The feed-forward method is considered in this work for modelling the nonlinear dynamic models. The K-Means clustering algorithm is used to select the centers of the radial basis function network because it is reliable, offers fast convergence, and can handle large data sets; the least mean square method is used to adjust the weights of the output layer, and the Euclidean distance is used to set the widths of the Gaussian functions.
Keywords: system identification, nonlinear systems, neural networks, radial basis function, K-means clustering algorithm
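A minimal RBF network along these lines is sketched below: K-Means picks the centers, a common heuristic sets the Gaussian width from the inter-center Euclidean distances, and batch least squares stands in for the paper's LMS weight update. The toy plant is invented.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)   # toy nonlinear plant

K = 10
centers = KMeans(n_clusters=K, n_init=10).fit(X).cluster_centers_
d_max = np.max(np.linalg.norm(centers[:, None] - centers[None], axis=-1))
sigma = d_max / np.sqrt(2 * K)            # common width heuristic

def phi(X):
    # Gaussian activations: one column per center
    d = np.linalg.norm(X[:, None, :] - centers[None], axis=-1)
    return np.exp(-(d ** 2) / (2 * sigma ** 2))

w, *_ = np.linalg.lstsq(phi(X), y, rcond=None)   # output-layer weights
print(np.mean((phi(X) @ w - y) ** 2))            # training MSE of the model
```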
Procedia PDF Downloads 470
13001 Fabrication of 2D Nanostructured Hybrid Material-Based Devices for High-Performance Supercapacitor Energy Storage
Authors: Sunil Kumar, Vinay Kumar, Mamta Bulla, Rita Dahiya
Abstract:
Supercapacitors have emerged as a leading energy storage technology, gaining popularity in applications like digital telecommunications, memory backup, and hybrid electric vehicles. Their appeal lies in a long cycle life, high power density, and rapid recharge capability. These exceptional traits attract researchers aiming to develop advanced, cost-effective, high-energy-density electrode materials for next-generation energy storage solutions. Two-dimensional (2D) nanostructures are highly attractive for fabricating nanodevices due to their high surface-to-volume ratio and good compatibility with device design. In the current study, a composite was synthesized by combining MoS2 with reduced graphene oxide (rGO) under optimal conditions and characterized using various techniques, including XRD, FTIR, SEM, and XPS. The electrochemical properties of the composite material were assessed through cyclic voltammetry, galvanostatic charging-discharging, and electrochemical impedance spectroscopy. The supercapacitor device demonstrated a specific capacitance of 153 F g⁻¹ at a current density of 1 A g⁻¹, achieving an excellent energy density of 30.5 Wh kg⁻¹ and a power density of 600 W kg⁻¹. Additionally, it maintained excellent cyclic stability over 5000 cycles, establishing it as a promising candidate for efficient and durable energy storage. These findings highlight the dynamic relationship between the electrode materials and offer valuable insights for the development and enhancement of high-performance symmetric devices.
Keywords: 2D material, energy density, galvanostatic charge-discharge, hydrothermal reactor, specific capacitance
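The reported figures are consistent with the standard galvanostatic relations; as a back-of-the-envelope check (the operating voltage window is not quoted above, so the ≈1.2 V value below is inferred, not stated):

```latex
E = \frac{C_{\mathrm{cell}}\,V^{2}}{2 \times 3.6}
  = \frac{153 \times (1.2)^{2}}{7.2} \approx 30.6\ \mathrm{Wh\,kg^{-1}},
\qquad
\Delta t = \frac{3600\,E}{P} = \frac{3600 \times 30.5}{600} \approx 183\ \mathrm{s}.
```

Here C_cell is the device specific capacitance in F g⁻¹, V the voltage window in volts, and Δt the implied discharge time at the rated power density.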
Procedia PDF Downloads 14
13000 The Effect of Critical Audit Matters on Financial Information Quality: The Role of Audit Committee Expertise
Authors: Khawla Hlel
Abstract:
Purpose: This study aims to examine whether critical audit matters (CAM) affect financial information quality. We also investigate the moderating role of the audit committee in the association between CAM and financial information quality. Design/Methodology/Approach: The analysis is based on GLS and GMM regressions explaining the absolute value of discretionary accruals, using 52 firms listed on the Tunisia Stock Exchange (TSE) over the period 2017-2020. Findings: We find evidence that managers react to CAM by increasing the quality of financial disclosures. This study provides insights into how a change in the auditor’s report model might impact the quality of financial information. It suggests that external auditors and audit committees serve as a beneficial mechanism for enhancing financial information quality by reducing information asymmetry. In addition, our results indicate that CAM is an efficient monitoring mechanism that increases financial reporting quality and supervises managers. Originality: This study is important for potential investors, who should assess CAM when evaluating firms. Furthermore, the authors expect the findings to be of interest to firms, as this study highlights the effectiveness of the auditor in reducing managerial opportunistic behavior and improving information quality. The results could encourage audit regulators to improve the standards, as this research reinforces the role of the auditor in increasing the quality of financial disclosure by offering the information required by shareholders.
Keywords: critical audit matters, audit committee, information quality, Tunisian firms
Procedia PDF Downloads 85
12999 Optical Flow Localisation and Appearance Mapping (OFLAAM) for Long-Term Navigation
Authors: Daniel Pastor, Hyo-Sang Shin
Abstract:
This paper presents a novel method for using optical flow navigation for long-term navigation. Unlike standard SLAM approaches for augmented reality, OFLAAM is designed for Micro Air Vehicles (MAVs). It uses an optical flow camera pointing downwards, an IMU, and a monocular camera pointing forwards. That configuration avoids the expensive mapping and tracking of 3D features: a localization module only maps these features into a vocabulary list, which is used to recover from losses of the navigation estimate. That module, based on the well-established DBoW2 algorithm, is also used to close the loop and allow long-term navigation in confined areas. This combination of high-speed optical flow navigation with a low-rate localization algorithm enables fully autonomous navigation for MAVs while reducing the overall computational load. The framework is implemented in ROS (Robot Operating System) and tested attached to a laptop. A representative scenario is used to analyse the performance of the system.
Keywords: vision, UAV, navigation, SLAM
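A toy sketch of the high-rate/low-rate fusion idea: optical-flow odometry is integrated at high rate and drifts, while a sparse place-recognition fix, of the kind a DBoW2-based localization module provides, periodically corrects the estimate. All rates, noise levels, and the blending gain are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
true_pos, est = 0.0, 0.0
for k in range(1, 501):                       # 500 steps at e.g. 100 Hz
    v = 0.5                                   # true forward velocity, m/s
    true_pos += v * 0.01
    est += (v + rng.normal(0, 0.05)) * 0.01   # noisy optical-flow odometry
    if k % 100 == 0:                          # ~1 Hz localization fix
        fix = true_pos + rng.normal(0, 0.02)  # place-recognition position
        est = 0.2 * est + 0.8 * fix           # snap estimate toward the fix
print(abs(est - true_pos))                    # error stays bounded despite drift
```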
Procedia PDF Downloads 606
12998 A Practical Protection Method for Parallel Transmission-Lines Based on the Fault Travelling-Waves
Authors: Mohammad Reza Ebrahimi
Abstract:
In modern restructured power systems, swift fault detection is very important. Parallel transmission lines are widely used in this kind of power system because of the large amounts of energy they transfer. In this paper, a method based on the comparison of two quantities is proposed: (i) the maximum magnitude of the travelling-wave (TW) energy, and (ii) the instants at which the maximum energy occurs on the circuits of the parallel transmission line. By using the travelling wave launched by the fault for faulted-line identification, this method achieves a notably short operation time. Moreover, the algorithm can classify faults as external or internal, and for an internal fault, the exact location of the fault can be estimated confidently. Extensive simulations have been carried out with PSCAD/EMTDC to verify the performance of the proposed algorithm.
Keywords: travelling-wave, maximum energy, parallel transmission-line, fault location
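The two comparisons the scheme relies on can be sketched as below: a short-window energy of the high-frequency (travelling-wave) component is computed for each parallel circuit; the faulted circuit shows the larger peak, and the peak instant feeds the identification logic. The signals, the crude high-pass, and the window length are synthetic placeholders.

```python
import numpy as np

fs = 1_000_000                                    # 1 MHz sampling
t = np.arange(2000) / fs
sig_a = np.exp(-((t - 4e-4) ** 2) / 1e-9)         # TW burst on circuit A
sig_b = 0.2 * np.exp(-((t - 6e-4) ** 2) / 1e-9)   # coupled residue on circuit B

def tw_energy(x, win=32):
    hp = np.diff(x, prepend=x[0])                 # crude high-pass filter
    return np.convolve(hp ** 2, np.ones(win), mode="same")

ea, eb = tw_energy(sig_a), tw_energy(sig_b)
faulted = "A" if ea.max() > eb.max() else "B"     # comparison (i)
print(faulted, t[np.argmax(ea)])                  # comparison (ii): peak instant
```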
Procedia PDF Downloads 186
12997 Avoiding Packet Drop for Improved Throughput in the Multi-Hop Wireless Network
Authors: Manish Kumar Rajak, Sanjay Gupta
Abstract:
Mobile ad hoc networks (MANETs) are infrastructure-less, and nodes intercommunicate using single-hop and multi-hop paths. Network-based congestion avoidance, which involves managing the queues in the network devices, is an integral part of any network; quality of service (QoS) is a set of service requirements to be met by the network while transferring a packet stream from a source to a destination. In MANETs especially, packet loss results in increased overheads. This paper presents a new algorithm to avoid congestion by using one or more queues on each node, with a corresponding flow rate decided in advance for each node. When any node's queue reaches its threshold value, it sends this status to its downstream nodes, which in turn use the pre-decided flow rate for packet transfer to their upstream nodes. The flow rate at each node is adjusted according to the status received from its upstream nodes. The proposed algorithm uses the existing infrastructure to inform other nodes about a node's current queue status.
Keywords: mesh networks, MANET, packet count, threshold, throughput
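An illustrative sketch of the threshold-based scheme follows: each node watches its queue occupancy and, once the threshold is crossed, signals its neighbours to fall back to the pre-decided reduced sending rate. The topology, thresholds, and rates are invented for the example.

```python
class Node:
    def __init__(self, name, capacity=50, threshold=30, full_rate=10, slow_rate=4):
        self.name, self.queue = name, 0
        self.capacity, self.threshold = capacity, threshold
        self.full_rate, self.slow_rate = full_rate, slow_rate
        self.rate_to = {}                       # per-neighbour sending rate

    def admit(self, n_packets):
        # enqueue arrivals; report whether the threshold has been crossed
        self.queue = min(self.capacity, self.queue + n_packets)
        return self.queue >= self.threshold

    def on_congestion_notice(self, neighbour):
        self.rate_to[neighbour] = self.slow_rate   # pre-decided fallback rate

    def on_relief_notice(self, neighbour):
        self.rate_to[neighbour] = self.full_rate

a, b = Node("A"), Node("B")
a.rate_to["B"] = a.full_rate
if b.admit(35):                        # B's queue crosses its threshold
    a.on_congestion_notice("B")        # B's status message reaches A
print(a.rate_to["B"])                  # A now sends to B at the slow rate
```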
Procedia PDF Downloads 474
12996 Real-Time Data Stream Partitioning over a Sliding Window in Real-Time Spatial Big Data
Authors: Sana Hamdi, Emna Bouazizi, Sami Faiz
Abstract:
In recent years, real-time spatial applications, like location-aware services and traffic monitoring, have become more and more important. Such applications result in dynamic environments where data as well as queries are continuously moving. As a result, a tremendous amount of real-time spatial data is generated every day, and the growth of the data volume seems to outpace the advance of our computing infrastructure. For instance, in real-time spatial Big Data, users expect to receive the results of each query within a short time period, regardless of the load on the system. But with the huge amount of real-time spatial data generated, system performance degrades rapidly, especially in overload situations. To solve this problem, we propose the use of data partitioning as an optimization technique. Traditional horizontal and vertical partitioning can increase the performance of the system and simplify data management, but they remain insufficient for real-time spatial Big Data; they cannot deal with real-time and stream queries efficiently. Thus, in this paper, we propose a novel data partitioning approach for real-time spatial Big Data named VPA-RTSBD (Vertical Partitioning Approach for Real-Time Spatial Big Data). This contribution is an implementation of the Matching algorithm for traditional vertical partitioning: we first find the optimal attribute sequence by use of the Matching algorithm, and we then propose a new cost model for database partitioning that keeps the data amount of each partition within a balanced limit and provides parallel execution guarantees for the most frequent queries. VPA-RTSBD aims to obtain a real-time partitioning scheme that deals with stream data. It improves the performance of query execution by maximizing the degree of parallel execution, which translates into QoS (Quality of Service) improvements in real-time spatial Big Data, especially with huge volumes of stream data. The performance of our contribution is evaluated via simulation experiments. The results show that the proposed algorithm is both efficient and scalable and that it outperforms comparable algorithms.
Keywords: real-time spatial big data, quality of service, vertical partitioning, horizontal partitioning, matching algorithm, hamming distance, stream query
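The Matching-style step can be sketched as follows: each attribute gets a binary usage vector over the query workload, and attributes are ordered greedily so that adjacent attributes have minimal Hamming distance, which groups co-accessed attributes into the same vertical fragment. The usage matrix and the greedy rule are illustrative simplifications of the actual algorithm.

```python
import numpy as np

# rows: attributes, cols: queries (1 = the query accesses the attribute)
usage = np.array([
    [1, 0, 1, 1],   # a0
    [1, 0, 1, 0],   # a1
    [0, 1, 0, 1],   # a2
    [0, 1, 0, 0],   # a3
])

def hamming(u, v):
    return int(np.sum(u != v))

order, rest = [0], set(range(1, len(usage)))
while rest:
    last = order[-1]
    nxt = min(rest, key=lambda j: hamming(usage[last], usage[j]))
    order.append(nxt)
    rest.remove(nxt)

print(order)  # attribute sequence along which vertical partitions are cut
```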
Procedia PDF Downloads 157
12995 An Algorithm of Set-Based Particle Swarm Optimization with Status Memory for Traveling Salesman Problem
Authors: Takahiro Hino, Michiharu Maeda
Abstract:
Particle swarm optimization (PSO) is an optimization approach modeled on the social behaviour of bird flocking and fish schooling. PSO works in continuous space and can solve continuous optimization problems with high quality. Set-based particle swarm optimization (SPSO) works in discrete space by using sets; SPSO can solve combinatorial optimization problems with high quality and has been applied successfully to large-scale problems. In this paper, we present an algorithm of SPSO with status memory (SPSOSM), which decides each position based on the previous position, for solving the traveling salesman problem (TSP). In order to show the effectiveness of our approach, we examine SPSOSM on TSP instances and compare it to existing algorithms.
Keywords: combinatorial optimization problems, particle swarm optimization, set-based particle swarm optimization, traveling salesman problem
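A heavily simplified sketch of the set-based construction: each particle's position is a tour, and a new tour is built edge by edge, preferring edges drawn from the edge sets of the particle's own best and the global best, with a nearest-city fallback. The learning probability and the demo instance are illustrative; the status-memory rule of SPSOSM is not reproduced here.

```python
import random

def tour_edges(t):
    # the edge set of a tour, edges as unordered pairs
    return {frozenset((t[i], t[(i + 1) % len(t)])) for i in range(len(t))}

def build_tour(n, dist, pbest, gbest, p_learn=0.8):
    cur, tour, seen = 0, [0], {0}
    guide = tour_edges(pbest) | tour_edges(gbest)   # the "set" to learn from
    while len(tour) < n:
        cand = [j for j in range(n) if j not in seen
                and frozenset((cur, j)) in guide]
        if cand and random.random() < p_learn:
            nxt = min(cand, key=lambda j: dist[cur][j])    # learned edge
        else:
            nxt = min((j for j in range(n) if j not in seen),
                      key=lambda j: dist[cur][j])          # greedy fallback
        tour.append(nxt)
        seen.add(nxt)
        cur = nxt
    return tour

dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]]
print(build_tour(4, dist, pbest=[0, 1, 2, 3], gbest=[0, 2, 1, 3]))
```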
Procedia PDF Downloads 552
12994 Sustainable Transformative Approaches to Reuse the Built Heritage of Erbil Citadel Houses as Part of Restoration
Authors: Wafaa Anwar Sulaiman Goriel
Abstract:
Heritage revival aims to breathe new life into historical buildings. This paper presents an approach to revitalizing architectural antiquities through methodologies uncommon in the heritage renovation sphere, using the Erbil Citadel houses as an example. The 6000-year-old, continuously occupied site of the Erbil Citadel embodies the challenges, and the mutual opportunities, of ensuring that the historical context is preserved during modern redevelopment. The paper shows how these principles can combine traditional construction systems with modern materials and technologies. It is an approach that champions the age and integrity of restored heritage sites, whose vernacular elements add a sense of relevance when contextually re-set in modern settings. Several of the Citadel’s houses are discussed in the paper, and the restoration method applied to them is described.
Keywords: Erbil Citadel houses, preservation, heritage, historical sites
Procedia PDF Downloads 18
12993 Real Time Detection, Prediction and Reconstitution of Rain Drops
Authors: R. Burahee, B. Chassinat, T. de Laclos, A. Dépée, A. Sastim
Abstract:
The purpose of this paper is to propose a solution to detect, predict, and reconstitute rain drops in real time, during the night, using an embedded system with an infrared camera. To keep the hardware requirements low, simple models are used in a powerful image-processing algorithm that considerably reduces computation time in the OpenCV library. Using a smart model, drops are matched through a process that runs over two consecutive pictures, implementing a sophisticated tracking system; a drop's computed trajectory then provides the information for predicting its future location. Thanks to this technique, the processing load can be reduced. The hardware system, built around a Raspberry Pi, is optimized to host this code efficiently for real-time execution.
Keywords: reconstitution, prediction, detection, rain drop, real time, raspberry, infrared
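The detection stage can be sketched with OpenCV as below: consecutive infrared frames are differenced and thresholded, and drop centroids are extracted for the two-frame matching step. The file names and threshold values are placeholders.

```python
import cv2
import numpy as np

prev = cv2.imread("frame_0.png", cv2.IMREAD_GRAYSCALE)  # placeholder frames
curr = cv2.imread("frame_1.png", cv2.IMREAD_GRAYSCALE)

diff = cv2.absdiff(curr, prev)                 # inter-frame change
_, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)

drops = [c for i, c in enumerate(centroids[1:], 1)
         if stats[i, cv2.CC_STAT_AREA] > 3]    # ignore single-pixel noise
print(drops)  # centroids to feed the two-frame matching / prediction step
```

Matching each centroid to its nearest neighbour in the previous frame then yields a per-drop velocity, which is what the prediction step extrapolates.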
Procedia PDF Downloads 419
12992 Measuring Business Strategy and Information Systems Alignment
Authors: Amit Saraswat, Ruchi Tewari
Abstract:
Purpose: This research paper aims at understanding the alignment of business and IT in the Indian context and the business value attached to such alignment. Methodology: The study was conducted in two stages. Stage one: bibliographic research was conducted to evolve the parameters for defining alignment. Stage two: a model for strategic alignment was evolved in order to conduct an empirical study. The model is defined in terms of four fundamental domains of strategic management choice: business strategy, information strategy, organizational structure, and information technology structure. A questionnaire survey was conducted across organizations from four different industries, and the Structural Equation Modelling (SEM) technique was used to validate the model. Findings: In the Indian scenario, not all the subscales of alignment could be validated. It could be validated that organizational strategy impacts information strategy and information technology structure. Research Limitations: The study is limited to the Indian context. Business-IT alignment may be culture-dependent, so further research is required to validate the model in other cultures. Originality/Value: In the Western world, several models of the alignment between business strategy and information systems are available, but they do not measure the extent of alignment, which the current study does in the Indian context. The findings of the study can be used by managers in strategizing and in understanding their business and information systems needs holistically and cohesively, leading to efficient use of resources and output.
Keywords: business strategy, information technology (IT), business IT alignment, SEM
Procedia PDF Downloads 388
12991 Classic Modelled Hybrid Electric Vehicles Using the Power of Internet of Things
Authors: Venkatesh Krishna Murthy
Abstract:
The era before government-regulated automotive design gave us some astonishing vehicles that are well worth keeping on the road. However, restoring an automobile in 2015 does not mean it will perform like one designed in 2021, and this is one of the reasons that manufacturers continue to turn to vintage hardware for future enhancements of their vehicles. A modern chassis could allow manufacturers to give vintage performance cars a level of braking capability, tire compatibility, chassis rigidity, suspension sophistication, and steering response that, until now, only racers experienced. Half a century of advancements in engineering can have a great impact on design in any field, and the automotive realm holds no exception. Currently, a growing number of companies offer chassis and braking components that let manufacturers retrofit contemporary technology onto their vintage vehicles, modernizing them at the foundation level. A recent question concerns the performance of lithium batteries, as opposed to simply bolting on upgraded components; for example, lithium batteries using graphene as a highly conductive material to enhance performance, an area under deep investigation. Serving as the “bones” of the vehicle, the chassis and frame play a central role in dictating how an automobile will perform. While the desire to maintain originality is alluring for many, the benefits of a modern chassis are vast. In some situations, it also allows builders to put cars back on the road that might otherwise be too far gone. “There’s a couple of different factors at play here – one of them being that these older cars from the ’40s, ’50s, and ’60s have seen a lot of weather and a lot of road miles over the years, more often than not,” says Craig Morrison of Art Morrison Enterprises.
Keywords: hybrid electric vehicles, internet of things, lithium graphene batteries, classic car chassis
Procedia PDF Downloads 171