Search results for: machine intelligence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3837

327 “Everything, Everywhere, All at Once”: Hollywoodization and Lack of Authenticity in Today’s Mainstream Cinema

Authors: Haniyeh Parhizkar

Abstract:

When Sarris proposed the "auteur theory" in 1962, he emphasized that its utmost premise is the inner meaning and concepts of a film, and that a film is purely an art form. Today's mainstream movies are conceptually closer to what the Frankfurt School scholars described years ago as "reproduced" "mass culture." Hollywood continues to be a huge movie-making machine that sets the dominant paradigms of film throughout the world, and cinema is far from art. Although there are still movies, directors, and audiences who favor art cinema over Hollywood and mainstream movies, it is an almost undeniable fact that, for the most part, people's perception of movies is widely shaped by their American depiction and Hollywood's legacy of mass culture. With the rise of Hollywood studios as the forerunners of the movie industry, and with cinema largely dependent on economics rather than artistic values, cinema's distinctive role has diminished and been replaced by a global standard. The blockbuster 2022 film 'Everything, Everywhere, All at Once' is now the most-awarded movie of all time, winning seven Oscars at the 95th Academy Awards. Despite its mainly Asian cast, the movie was produced by an American company and is heavily influenced by Hollywood's dominant themes of superheroes, fantasy, action, and adventure. The New Yorker film critic Richard Brody called the movie "a pitch for a Marvel" and critiqued it for being "universalized" and "empty of history and culture." Critics at Variety likewise pinpointed the movie's similarities to Marvel, particularly its multiverse storyline, which manifests traces of the American legacy. As these critics argue, 'Everything, Everywhere, All at Once' might appear a unique and authentic film at first glance, but it can be read as yet another version of a Marvel movie.
While the movie's universal acclaim was regarded as recognition and acknowledgment of its Asian cast, the issue that arises is this: when the Hollywood influences and American themes in the film are so robust, is the movie industry honoring another culture, or is this yet another celebration of Hollywood's dominant paradigm? This essay employs a critical approach to Hollywood's dominance and mass-produced culture, which has deprived non-American movies of authenticity and constantly reproduces the same formula of success.

Keywords: hollywoodization, universalization, blockbuster, dominant paradigm, Marvel, authenticity, diversity

Procedia PDF Downloads 60
326 A Hybrid Artificial Intelligence and Two Dimensional Depth Averaged Numerical Model for Solving Shallow Water and Exner Equations Simultaneously

Authors: S. Mehrab Amiri, Nasser Talebbeydokhti

Abstract:

Modeling sediment transport processes numerically often poses severe challenges. A number of techniques have been suggested for solving the flow and sediment equations in decoupled, semi-coupled, or fully coupled forms. Furthermore, in order to capture flow discontinuities, techniques such as artificial viscosity and shock fitting have been proposed, most of which require careful calibration. In this research, a numerical scheme for solving the shallow water and Exner equations in fully coupled form is presented. The First-Order Centered scheme is applied to produce the required numerical fluxes, and the reconstruction process is carried out using the Monotonic Upstream Scheme for Conservation Laws (MUSCL) to achieve a high-order scheme. In order to satisfy the C-property of the scheme in the presence of bed topography, the Surface Gradient Method is proposed. Combining the presented scheme with a fourth-order Runge-Kutta algorithm for time integration yields a competent numerical scheme. In addition, the Cartesian Cut Cell Method is employed to handle non-prismatic channel problems. A trained Multi-Layer Perceptron artificial neural network of the Feed Forward Back Propagation (FFBP) type estimates sediment flow discharge in the model in place of the usual empirical formulas. The hydrodynamic part of the model is tested to show its capability in simulating flow discontinuities, transcritical flows, wetting/drying conditions, and non-prismatic channel flows. To this end, dam-break flow onto a locally non-prismatic converging-diverging channel with initially dry bed conditions is modeled. The morphodynamic part of the model is verified by simulating a dam break on a dry movable bed and bed level variations in an alluvial junction.
The results show that the model is capable of capturing flow discontinuities, solving wetting/drying problems even in non-prismatic channels, and producing proper results for movable bed situations. It can also be deduced that applying an artificial neural network, instead of common empirical formulas, for estimating sediment flow discharge leads to more accurate results.
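
As an illustration of the flux computation named above, the following is a minimal sketch of the First-Order Centered (FORCE) flux, the average of the Lax-Friedrichs and Lax-Wendroff fluxes, for the 1D shallow water equations. The paper's actual model is two-dimensional and fully coupled with the Exner equation, so this is an assumed, simplified form rather than the authors' implementation.

```python
import numpy as np

def swe_flux(U, g=9.81):
    """Physical flux F(U) of the 1D shallow water equations, U = [h, hu]."""
    h, hu = U
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

def force_flux(UL, UR, dx, dt, g=9.81):
    """First-Order Centered (FORCE) numerical flux at a cell interface:
    the average of the Lax-Friedrichs and Lax-Wendroff fluxes."""
    FL, FR = swe_flux(UL, g), swe_flux(UR, g)
    # Lax-Friedrichs flux (dissipative)
    F_lf = 0.5 * (FL + FR) - 0.5 * (dx / dt) * (UR - UL)
    # Lax-Wendroff flux: flux of the intermediate state
    U_lw = 0.5 * (UL + UR) - 0.5 * (dt / dx) * (FR - FL)
    F_lw = swe_flux(U_lw, g)
    return 0.5 * (F_lf + F_lw)
```

For a uniform state (UL equal to UR) the numerical flux reduces to the physical flux, which is the basic consistency check for any such scheme; MUSCL reconstruction would supply the left and right states at each interface.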

Keywords: artificial neural network, morphodynamic model, sediment continuity equation, shallow water equations

Procedia PDF Downloads 164
325 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics

Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee

Abstract:

Recent advances in Generative Artificial Intelligence (GenAI), in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably so-called "hallucination", the generation of outputs that are not grounded in the input data, which hinders their adoption in production. A common practice to mitigate the hallucination problem is a Retrieval Augmented Generation (RAG) system, which grounds the LLM's responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts by vector similarity between the user's query and the documents, and then generates a response based not only on the model's pre-trained knowledge but also on the specific information in the retrieved documents. However, the RAG approach is not suitable for tabular data and subsequent data analysis tasks, for reasons including information loss, data format, and the retrieval mechanism. In this study, we explore a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously dissect a complex analytical task into simpler sub-tasks and requirements, and then convert them into executable segments of code. In the final step, it generates the complete response from the output of the executed code. When a beta version was deployed on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results: it met market insight and data visualization needs with high accuracy and extensive coverage, abstracting the complexities away from real-estate agents and developers from non-programming backgrounds. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding market understanding without the need for programming skills.
The implications extend beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
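
The plan-then-generate-then-execute loop described above can be sketched in a few lines. Here `call_llm` is a stub standing in for any real LLM API, and the prompts, plan steps, and generated code are invented illustrations, not PropertyGuru's actual implementation.

```python
# Hypothetical sketch of the planning-and-execution plus code-generation
# pattern: decompose the question into a plan, generate code for the plan,
# execute it against the tabular data, and return the computed result.
from statistics import median

def call_llm(prompt: str) -> str:
    # Placeholder responses; a production system would query an LLM here.
    if prompt.startswith("Plan"):
        return "1. Group listings by district\n2. Compute median price per group"
    return ("result = {d: median(p['price'] for p in rows if p['district'] == d)"
            " for d in {p['district'] for p in rows}}")

def plan_and_execute(question: str, rows: list) -> dict:
    plan = call_llm(f"Plan the analysis steps for: {question}")  # decompose
    code = call_llm(f"Write code implementing:\n{plan}")         # generate
    scope = {"rows": rows, "median": median}
    exec(code, scope)                                            # execute
    return scope["result"]                                       # respond
```

Executing generated code grounds the numerical answer in the data itself rather than in the model's free-form text generation, which is the hallucination mitigation the approach relies on; a real deployment would of course sandbox the `exec` step.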

Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru

Procedia PDF Downloads 47
324 Investigations of Effective Marketing Metric Strategies: The Case of St. George Brewery Factory, Ethiopia

Authors: Mekdes Getu Chekol, Biniam Tedros Kahsay, Rahwa Berihu Haile

Abstract:

The main objective of this study is to investigate marketing strategy practice in the case of the St. George Brewery Factory in Addis Ababa. One of the core requirements for a business to stay in business is a well-developed marketing strategy. The study assessed how marketing strategies were practiced in the company to achieve its goals, aligned with segmentation, target market, positioning, and the marketing mix elements, in order to satisfy customer requirements. Using primary and secondary data, the study was conducted with both qualitative and quantitative approaches. The primary data were collected through open- and closed-ended questionnaires. Given the small size of the population, the respondents were selected by census. The findings show that the company used all 4 Ps of the marketing mix in its marketing strategies and provided quality products at affordable prices, promoting its products through effective advertising mechanisms. Product availability and accessibility are admirable, with both direct and indirect distribution channels in practice. The company has also identified its target customers, and its market segmentation practice is by geographical location. Communication between the marketing department and other departments is very good. The adjusted R² indicates that product, price, promotion, and place explain 61.6% of the variance in marketing strategy practice; the remaining 38.4% of the variation in the dependent variable is explained by factors not included in this study. All four independent variables, product, price, promotion, and place, have positive beta coefficients, indicating that each predictor has a positive effect on the dependent variable, marketing strategy practice.
Even though the company's marketing strategies are effectively practiced, the company faces some problems in implementing them: infrastructure problems, economic problems, intensive competition in the market, shortage of raw materials, seasonality of consumption, socio-cultural problems, and the time and cost of creating customer awareness. Finally, the authors suggest that the company develop a longer-range view and implement a more structured approach to gathering information about potential customers, competitors' actions, and market intelligence within the industry. In addition, we recommend extending the study with a larger sample size and additional marketing factors.
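
The adjusted R² reported above comes from an ordinary-least-squares fit of the practice score on the four marketing-mix variables. A minimal sketch of that computation follows, using synthetic data in place of the study's survey responses; the coefficients and sample size are illustrative assumptions.

```python
import numpy as np

def fit_ols_adjusted_r2(y, X):
    """Fit OLS of y on X (with an intercept prepended) and return the
    coefficient vector and the adjusted R^2 of the fit."""
    n, k = X.shape
    A = np.column_stack([np.ones(n), X])          # add intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares solution
    resid = y - A @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)  # penalize predictor count
    return beta, adj_r2
```

With four predictors (product, price, promotion, place), `beta[1:]` holds the four slope coefficients whose positive signs the study reports, and `adj_r2` is the quantity quoted as 61.6%.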

Keywords: marketing strategy, market segmentation, target marketing, market positioning, marketing mix

Procedia PDF Downloads 25
323 Technology Road Mapping in the Fourth Industrial Revolution: A Comprehensive Analysis and Strategic Framework

Authors: Abdul Rahman Hamdan

Abstract:

The Fourth Industrial Revolution (4IR) has brought unprecedented technological advancements that have disrupted many industries worldwide. To keep up with these advances and the rapid disruption they bring, technology road mapping has emerged as one of the critical tools for organizations to leverage. Technology road mapping can guide companies to become more adaptable, to anticipate future transformation and innovation, and to avoid becoming redundant or irrelevant amid rapid technological change. This research paper provides a comprehensive analysis of technology road mapping within the context of the 4IR. Its objectives are to provide companies with practical insights and a strategic framework for technology road mapping so that they can navigate the fast-changing nature of the 4IR. The study also contributes to the understanding and practice of technology road mapping in the 4IR and, at the same time, provides organizations with the necessary tools and critical insight to navigate the 4IR transformation. Based on a literature review and case studies, the study analyses key principles, methodologies, and best practices in technology road mapping and integrates them with the unique characteristics and challenges of the 4IR. The paper gives the background of the Fourth Industrial Revolution, explores the disruptive potential of 4IR technologies, and establishes the critical need for technology road mapping, encompassing strategic planning and foresight, to remain competitive and relevant in the 4IR era. It also highlights the importance of technology road mapping as a proactive approach for aligning an organisation's objectives and resources with its technology and product development in the fast-evolving 4IR landscape.
The paper covers the theoretical foundations of technology road mapping, examines various methodological approaches, and identifies the stakeholders in the process, such as external experts, collaborative platforms, and cross-functional teams, to ensure an integrated and robust technology roadmap for the organisation. Moreover, the study presents a comprehensive framework for technology road mapping in the 4IR incorporating key elements and processes such as technology assessment, competitive intelligence, risk analysis, and resource allocation. It provides a framework for implementing technology road mapping from strategic planning, goal setting, and technology scanning through to roadmap visualisation, implementation planning, monitoring, and evaluation. In addition, the study addresses the challenges and limitations of technology road mapping in the 4IR, including gap analysis. In conclusion, the study proposes a set of practical recommendations for organizations that intend to leverage technology road mapping as a strategic tool for driving innovation and remaining competitive in current and future ecosystems.

Keywords: technology management, technology road mapping, technology transfer, technology planning

Procedia PDF Downloads 43
322 The Roman Fora in North Africa: Towards a Supportive Protocol to the Decision for the Morphological Restitution

Authors: Dhouha Laribi Galalou, Najla Allani Bouhoula, Atef Hammouda

Abstract:

This research delves into the fundamental question of the morphological restitution of built archaeology, in order to place it in its paradigmatic context and to seek answers to it. Indeed, understanding the object of study, analysing it, and devising a methodology for solving the morphological problem posed are manageable only by means of a thoughtful strategy that draws on well-defined epistemological scaffolding. In this stream, the crisis of natural reasoning in archaeology has generated multiple changes in the field, ranging from the use of new tools to the integration of archaeological information systems in which several disciplines interplay. The built archaeological object is also an architectural and morphological one: a set of articulated elementary data whose understanding is approached here from a logicist point of view. Morphological restitution is no exception to the rule, and the exchange between the different disciplines uses the capacity of each to frame reflection on the incomplete elements of a given architecture, or on its different phases and multiple states of existence. The logicist sequence is furnished by the set of scattered or destroyed elements found, but also by what can be called a rule base, which contains the rules of architectural construction for the object. A knowledge base built from the archaeological literature likewise provides a reference that enters into the search for forms and articulations. The choice of the Roman forum in North Africa is justified by the great urban and architectural characteristics of this entity. Research on the forum involves a fairly large knowledge base and also provides the researcher with material to study, from a morphological and architectural point of view, from the scale of the city down to the architectural detail.
The experimentation with the knowledge deduced at the paradigmatic level, as well as the deduction of an analysis model, is then carried out on the basis of a well-defined context, which frames the experimentation from the elaboration of the morphological information container attached to the rule base and the knowledge base. The use of logicist analysis and artificial intelligence allowed us first to question aspects already known, in order to measure the credibility of our system, which remains above all a decision-support tool for the morphological restitution of the Roman fora in North Africa. This paper presents a first experimentation with the model elaborated during this research, a model framed by a paradigmatic discussion that seeks to position the research in relation to existing paradigmatic and experimental knowledge on the issue.
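
The rule-base idea described above can be illustrated with a toy forward-chaining loop: facts stand for recovered architectural elements, and rules encode construction conventions that derive missing parts of the restitution. The facts and rules below are invented examples, not the authors' actual knowledge base.

```python
def forward_chain(facts: set, rules: list) -> set:
    """Apply rules of the form (premises, conclusion) to a set of facts
    until no new fact can be derived, and return all derived facts."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # Fire the rule only if all premises hold and it adds something new.
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived
```

A decision-support system of the kind the paper describes would layer many such rules, drawn from the archaeological literature, over the elements actually found on site, and present the derivable restitution hypotheses to the researcher rather than asserting them.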

Keywords: classical reasoning, logicist reasoning, archaeology, architecture, Roman forum, morphology, calculation

Procedia PDF Downloads 120
321 Comparison of Existing Predictors and Development of a Computational Method for S-Palmitoylation Site Identification in Arabidopsis thaliana

Authors: Ayesha Sanjana Kawser Parsha

Abstract:

S-acylation is a modification in which cysteine residues are linked to the fatty acids palmitate (74%) or stearate (22%), either at the COOH or NH2 terminal, via a thioester linkage. Several experimental methods can identify S-palmitoylation sites; however, since they require a lot of time, computational methods are becoming increasingly necessary. There are, however, few predictors that can locate S-palmitoylation sites in Arabidopsis thaliana with sufficient accuracy, which motivates building a better prediction tool. To identify the type of machine learning algorithm that predicts these sites most accurately on the experimental dataset, several prediction tools were examined in this research, including GPS-Palm 6.0, pCysMod, GPS-Lipid 1.0, CSS-Palm 4.0, and NBA-Palm. These analyses were conducted by constructing receiver operating characteristic plots and computing the area under the curve. A deep-learning-based prediction tool was then developed using this analysis and three sequence-based input features: amino acid composition, binary encoding profile, and autocorrelation features. The model was built with five layers, two activation functions, and the associated parameters and hyperparameters. It was developed using various combinations of features and, after training and validation, performed best when all features were present, using the experimental dataset for 8- and 10-fold cross-validation. When tested with unseen data, such as the GPS-Palm 6.0 plant and pCysMod mouse datasets, the model performed well, with an area under the curve near 1.
Comparing the 10-fold cross-validation area under the curve of the new model with the established tools' scores on their respective training sets demonstrates that this model outperforms the prior tools in predicting S-palmitoylation sites in the experimental dataset. The objective of this study is thus a prediction tool for Arabidopsis thaliana that is more accurate than current tools, as measured by the area under the curve. Forecasting S-palmitoylation sites in this way can support both plant food production and the management of immunological treatment targets.
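
Of the three sequence-based features named above, the amino acid composition is the simplest: the fraction of each of the 20 standard residues in a window around a candidate cysteine. A minimal sketch follows; the window size of 10 residues on each side is an illustrative assumption, not the study's stated value.

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

def aac_features(sequence: str, site: int, window: int = 10) -> list:
    """Amino acid composition of the window centred on `site`:
    a 20-dimensional vector of per-residue fractions."""
    start, end = max(0, site - window), site + window + 1
    segment = sequence[start:end]
    return [segment.count(aa) / len(segment) for aa in AMINO_ACIDS]
```

Vectors like this, concatenated with the binary encoding and autocorrelation features, would form the input layer of the five-layer network the abstract describes.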

Keywords: S-palmitoylation, ROC plot, area under the curve, cross-validation score

Procedia PDF Downloads 49
320 Critical Analysis of International Protections for Children from Sexual Abuse and Examination of Indian Legal Approach

Authors: Ankita Singh

Abstract:

Sex trafficking and child pornography are borderless crimes that cannot be effectively prevented through the laws and efforts of one country alone, because they require proper and smooth collaboration among countries. Eradicating international human trafficking syndicates, criminalising international cyber offenders, and effectively banning child pornography are not possible without effective universal laws; hence, continuous collaboration among all countries is needed to adopt and routinely update such laws. Congregating countries on an international platform from time to time is necessary so that they can adopt international agendas and create powerful universal laws to prevent sex trafficking and child pornography in the modern digital era. Some international steps have been taken through the Convention on the Rights of the Child (CRC) and the Optional Protocol to the Convention on the Rights of the Child on the Sale of Children, Child Prostitution, and Child Pornography, but in reality these measures are quite weak and not capable of effectively protecting children from sexual abuse in today's highly advanced digital era. The uncontrolled growth of artificial intelligence (AI) and its misuse, the lack of proper legal jurisdiction over foreign child abusers and the difficulties of their extradition, and improper control over the international trade in digital child pornographic content are prominent issues that can only be controlled through new, effective, and powerful universal laws. Due to the lack of effective international standards and of proper collaboration among countries, Indian laws are also not capable of taking effective action against child abusers. This research will be conducted through both doctrinal and empirical methods.
Various literary sources will be examined, and a questionnaire survey of Indian university students will be conducted to analyse the effectiveness of international standards and Indian laws against child pornography. The existing international norms for protecting children from sexual abuse will be critically analysed, and the work will explore why effective and strong collaboration between countries is required in modern times. It will assess whether the existing international steps are enough to protect children from being trafficked or subjected to pornography and, where these steps are found insufficient, will suggest how international standards and protections can be made more effective and powerful in the digital era. India's approach to the existing international standards, the Indian laws protecting children from being subjected to pornography, and India's contributions and capabilities in strengthening the international standards will also be analysed.

Keywords: child pornography, prevention of children from sexual offences act, the optional protocol to the convention on the rights of the child on the sale of children, child prostitution and child pornography, the convention on the rights of the child

Procedia PDF Downloads 11
319 Measuring Elemental Sulfur in Late Manually-Treated Grape Juice in Relation to Polyfunctional Mercaptan Formation in Sauvignon Blanc Wines

Authors: Bahareh Sarmadi, Paul A. Kilmartin, Leandro D. Araújo, Brandt P. Bastow

Abstract:

Aim: Sauvignon blanc is New Zealand's most significant variety, cultivated in almost 62% of all producing vineyards. The popularity of New Zealand Sauvignon blanc is due to its unique taste; it is best known for an aroma profile derived from mercaptans, of which 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) are two of the most important volatile examples found in Sauvignon blanc wines. Viticultural and enological factors, such as machine harvesting, the most common harvesting practice in New Zealand, can be among the reasons for this distinct flavor. Elemental sulfur is commonly sprayed in vineyards to protect berries against powdery mildew. Although it is not the only source of sulfur, this practice creates a reservoir of elemental sulfur that can be transferred into the must and eventually into the wines. Despite the clear effects of residual elemental sulfur in the must on the quality and aroma of the final wines, measuring it before harvest or fermentation is not regular practice in wineries, likely owing to the lack of methods accessible and applicable with the equipment at most commercial wineries. This study aims to establish a relationship between the number and frequency of elemental sulfur applications and the concentration of polyfunctional mercaptans in the final wines. Methods: An apparatus was designed to reduce elemental sulfur to sulfide, and an ion-selective electrode was then used to measure the sulfide concentration. During the 2022 harvest, we explored a wider range of residual elemental sulfur levels than is typically applied in the vineyards, achieved through late manual elemental sulfur applications. Additional sulfur applications were made 20, 10, and 5 days prior to harvesting the treated grapes, covering long and short pre-harvest intervals (PHI).
The grapes were processed into juice and fermented into wine, and the samples were analyzed to find the correlation between polyfunctional mercaptan concentrations in the wines and residual elemental sulfur in the juice samples. Results: The research showed that more 3MH/3MHA was formed when elemental sulfur was applied more frequently in the vineyard, supporting the proposed pathway in which elemental sulfur is a source of 3MH formation in wines.

Keywords: sauvignon blanc, elemental sulfur, polyfunctional mercaptans, varietal thiols

Procedia PDF Downloads 73
318 Potential of High Performance Ring Spinning Based on Superconducting Magnetic Bearing

Authors: M. Hossain, A. Abdkader, C. Cherif, A. Berger, M. Sparing, R. Hühne, L. Schultz, K. Nielsch

Abstract:

Due to the superior yarn quality and the flexibility of the machine, the ring spinning process is the most widely used spinning method for short-staple yarn production. However, the productivity of these machines is still much lower than that of other spinning systems, such as the rotor or air-jet spinning process. The main reason for this limitation lies in the twisting mechanism of the ring spinning process. In the ring/traveler twisting system, each rotation of the traveler along the ring inserts twist into the yarn. Rotating the traveler at higher speed involves strong frictional forces, which in turn generate heat. Different ring/traveler systems, with varying geometries, material combinations, and coatings, have already been implemented to address the frictional problem. However, such developments can neither completely solve the frictional problem nor increase productivity. A friction-free superconducting magnetic bearing (SMB) system can be a suitable alternative to the existing ring/traveler system. The unique feature of SMB bearings is their self-stabilizing behavior: they remain fully passive, with no need for expensive position sensing and control. Within the framework of a research project funded by the German Research Foundation (DFG), suitable concepts of the SMB system have been designed, developed, and integrated as a twisting device for ring spinning, replacing the existing ring/traveler system. With the help of the developed mathematical model and experimental investigation, the physical limitations of this innovative twisting device in the spinning process have been determined. The interaction between the parameters of the spinning process and the superconducting twisting element has been further evaluated, yielding concrete information about the new spinning process.
Moreover, the influence of the implemented SMB twisting system on yarn quality has been analyzed with respect to different process parameters. The presented work reveals the enormous potential of the innovative twisting mechanism: the productivity of the ring spinning process, especially for thermoplastic materials, can be at least doubled for the first time in a hundred years. The SMB ring spinning tester was also presented at the international trade fair International Textile Machinery Association (ITMA) 2015.

Keywords: ring spinning, superconducting magnetic bearing, yarn properties, productivity

Procedia PDF Downloads 211
317 DEEPMOTILE: Motility Analysis of Human Spermatozoa Using Deep Learning in Sri Lankan Population

Authors: Chamika Chiran Perera, Dananjaya Perera, Chirath Dasanayake, Banuka Athuraliya

Abstract:

Male infertility is a major problem worldwide and a neglected, sensitive health issue in Sri Lanka. It can be assessed by analyzing human semen samples, and sperm motility is one of several factors that indicate a male's fertility potential. In Sri Lanka, this analysis is performed manually. Manual methods are time-consuming and operator-dependent, although they are reliable in expert hands. Machine learning and deep learning technologies are being investigated to automate spermatozoa motility analysis, but current automatic methods remain unreliable: they tend to produce false positives and missed detections. They also rely on varying techniques, some of which are very expensive, and due to the geographical variance in spermatozoa characteristics, they are not reliable for motility analysis in Sri Lanka. The suggested system, DeepMotile, explores a method to analyze the motility of human spermatozoa automatically and to present it to andrology laboratories to overcome the current issues. DeepMotile is a novel deep learning method for analyzing spermatozoa motility parameters in the Sri Lankan population. To implement the approach, Sri Lankan patient data were collected anonymously as a dataset, and glass slides were used as a low-cost technique for analyzing semen samples. The problem was framed as microscopic object detection and tracking. YOLOv5 was customized and used as the object detector, achieving 94% mAP (mean average precision), 86% precision, and 90% recall on the gathered dataset. StrongSORT was used as the object tracker and was validated with andrology experts, owing to the unavailability of annotated ground-truth data. Furthermore, this research has identified many avenues for further investigation, and andrology experts can use this system to analyze motility parameters with realistic accuracy.
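
Once a detector/tracker pipeline of the kind described above yields a per-frame (x, y) track for each spermatozoon, standard CASA-style motility parameters can be computed from it. The sketch below shows three common ones; the metric names follow the usual conventions, and the frame rate and pixel scale are assumptions, not values from this study.

```python
import math

def motility_metrics(track, fps=30, um_per_px=1.0):
    """Compute curvilinear velocity (VCL), straight-line velocity (VSL),
    and linearity (LIN = VSL/VCL) for one per-frame (x, y) track."""
    # Total path length along the track vs. straight head-to-tail distance.
    path = sum(math.dist(track[i], track[i + 1]) for i in range(len(track) - 1))
    straight = math.dist(track[0], track[-1])
    duration = (len(track) - 1) / fps
    vcl = path * um_per_px / duration
    vsl = straight * um_per_px / duration
    return {"VCL": vcl, "VSL": vsl, "LIN": vsl / vcl if vcl else 0.0}
```

A perfectly straight track gives LIN = 1, while the zigzag motion of a hyperactivated cell drives LIN well below 1, which is how such per-track numbers feed an overall motility grade.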

Keywords: computer vision, deep learning, convolutional neural networks, multi-target tracking, microscopic object detection and tracking, male infertility detection, motility analysis of human spermatozoa

Procedia PDF Downloads 78
316 Insight into Enhancement of CO2 Capture by Clay Minerals

Authors: Mardin Abdalqadir, Paul Adzakro, Tannaz Pak, Sina Rezaei Gomari

Abstract:

Climate change and global warming have recently become significant concerns due to the massive emissions of greenhouse gases into the atmosphere, predominantly CO2. It is therefore necessary to find sustainable and inexpensive methods to capture these gases and protect the environment for living species. The application of naturally available, cheap carbon adsorbents such as clay minerals has attracted great interest. However, these minerals are prone to low storage capacity despite their high affinity for adsorbing carbon. This paper aims to explore ways to improve the pore volume and surface area of two selected clay minerals, montmorillonite and kaolinite, by acid treatment in order to overcome their low storage capacity. To achieve this aim, montmorillonite and kaolinite samples were treated with different sulfuric acid concentrations (0.5, 1.2, and 2.5 M) at 40 °C for 8 hours. The grain size distribution and morphology of the clay minerals before and after acid treatment were examined with a Scanning Electron Microscope to evaluate the improvement in surface area. The ImageJ software was used to determine the porosity and pore volume of treated and untreated clay samples. The structure of the clay minerals was also analyzed using an X-ray Diffraction (XRD) machine. The results showed that pore volume and surface area increased substantially through acid treatment, which sped up the rate of carbon dioxide adsorption. The XRD pattern of kaolinite did not change after sulfuric acid treatment, indicating that acid treatment does not affect the structure of kaolinite. Kaolinite also had a higher pore volume and porosity than montmorillonite both before and after acid treatment. For example, the pore volume of untreated kaolinite was 30.498 µm³ with a porosity of 23.49%. Raising the acid concentration from 0.5 M to 2.5 M over an 8-hour reaction increased the pore volume from 30.498 µm³ to 34.73 µm³.
The pore volume of raw montmorillonite was 15.610 µm³ with a porosity of 12.7%. When the acid concentration was raised from 0.5 M to 2.5 M for the same reaction time, the pore volume increased from 15.610 µm³ to 20.538 µm³. However, montmorillonite had a higher specific surface area than kaolinite. This study concludes that clay minerals are inexpensive and readily available materials for modeling realistic conditions and applying carbon capture results to mitigate global warming, one of the most critical and urgent problems in the world.
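For orientation, the reported pore-volume figures can be turned into relative improvements with a quick calculation (a sketch using only the values quoted in the abstract; the helper function is our own, not part of the study):

```python
def percent_increase(before, after):
    """Relative increase in percent, e.g. of pore volume after acid treatment."""
    return (after - before) / before * 100

# Pore volumes (in um^3) reported for the 0.5 M -> 2.5 M treatment
kaolinite = percent_increase(30.498, 34.73)
montmorillonite = percent_increase(15.610, 20.538)

print(f"kaolinite: +{kaolinite:.1f}%")          # ~ +13.9%
print(f"montmorillonite: +{montmorillonite:.1f}%")  # ~ +31.6%
```

On these figures, montmorillonite gains proportionally more pore volume from the treatment even though kaolinite remains the more porous mineral in absolute terms.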

Keywords: acid treatment, kaolinite, montmorillonite, pore volume, porosity, surface area

Procedia PDF Downloads 143
315 Systematic Review of Digital Interventions to Reduce the Carbon Footprint of Primary Care

Authors: Anastasia Constantinou, Panayiotis Laouris, Stephen Morris

Abstract:

Background: Climate change has been reported as one of the worst threats to healthcare. The healthcare sector is a significant contributor to greenhouse gas emissions, with primary care responsible for 23% of the NHS' total carbon footprint. Digital interventions, primarily focusing on telemedicine, offer a route to change. This systematic review aims to quantify and characterize the carbon footprint savings associated with the implementation of digital interventions in primary care. Methods: A systematic review of published literature was conducted according to PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. The MEDLINE, PubMed, and Scopus databases, as well as Google Scholar, were searched using key terms relating to “carbon footprint,” “environmental impact,” “sustainability,” “green care,” “primary care,” and “general practice,” with citation tracking used to identify additional articles. Data were extracted and analyzed in Microsoft Excel. Results: Eight studies, conducted in four different countries between 2010 and 2023, were identified. Four studies used interventions to address primary care services, three focused on the interface between primary and specialist care, and one addressed both. Digital interventions included the use of mobile applications, online portals, access to electronic medical records, electronic referrals, electronic prescribing, video consultations, and autonomous artificial intelligence. Only one study carried out a complete life cycle assessment to determine the carbon footprint of the intervention. It estimated that digital interventions reduced the carbon footprint at the primary care level by 5.1 kg CO₂/visit, and at the interface with specialist care by 13.4 kg CO₂/visit.
When assessing the relationship between travel distance saved and savings in emissions, we identified a strong correlation, suggesting that most of the carbon footprint reduction is attributable to reduced travel. However, two studies also commented on environmental savings associated with reduced use of paper. Patient savings in the form of reduced fuel costs and reduced travel time were also identified. Conclusion: All studies identified significant reductions in carbon footprint following the implementation of digital interventions. In the future, controlled, prospective studies incorporating complete life cycle assessments and accounting for double-consulting effects, use of additional resources, technical failures, quality of care, and cost-effectiveness are needed to fully appreciate the sustainability benefit of these interventions.
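The per-visit figures above scale straightforwardly to an annual estimate (a sketch; the 5.1 and 13.4 kg CO₂ values come from the abstract, but the visit counts below are hypothetical illustrative inputs, not data from the review):

```python
# Per-visit savings reported by the one study with a full life cycle assessment
PRIMARY_SAVING_KG = 5.1      # kg CO2 saved per remote primary care visit
INTERFACE_SAVING_KG = 13.4   # kg CO2 saved per remote specialist-interface visit

def annual_savings_tonnes(primary_visits, interface_visits):
    """Estimated annual CO2 savings, in tonnes, for given remote-visit volumes."""
    kg = primary_visits * PRIMARY_SAVING_KG + interface_visits * INTERFACE_SAVING_KG
    return kg / 1000

# e.g. a practice shifting 8,000 primary and 1,000 interface visits to remote
print(annual_savings_tonnes(8_000, 1_000))  # ~ 54.2 tonnes CO2 per year
```

Since most of the saving is attributed to avoided travel, such an estimate holds only to the extent that remote visits genuinely replace journeys rather than add consultations.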

Keywords: carbon footprint, environmental impact, primary care, sustainable healthcare

Procedia PDF Downloads 37
314 Counter-Terrorism and De-Radicalization as Soft Strategies in Combating Terrorism in Indonesia: A Critical Review

Authors: Tjipta Lesmana

Abstract:

Terrorist attacks quickly penetrated Indonesia following the downfall of the Soeharto regime in May 1998, when the reform era was officially proclaimed. Indonesia turned from an 'authoritarian state' into a 'heaven state': for the first time since 1966, the country experienced full-scale freedom of expression, including freedom of the press, and a broad acknowledgement of human rights practice. Some religious extremists who had previously fled to neighboring countries to escape the security apparatus secretly returned home. They quickly consolidated power to pursue their long-held aspiration of establishing a 'Shariah Indonesia', an Indonesia based on the Khilafah ideology. The first Bali bombings, which shocked the world community, occurred on 12 October 2002 in the famous tourist district of Kuta on the Indonesian island of Bali, killing 202 people (including 88 Australians, 38 Indonesians, and people of more than 20 other nationalities). In the capital, Jakarta, successive bombings struck the Marriott hotel, the Australian Embassy, the residence of the Philippine Ambassador, and the stock exchange office. A 'drunken' Indonesia was far from ready to combat sudden, massive, nationwide terrorist attacks. Police Detachment 88 (Densus 88), the Indonesian counter-terrorism squad, was quickly formed following the 2002 Bali bombing, and a provisional anti-terrorism act was immediately enacted in response to the urgent need to fight terrorism. Some of the Bali bombing perpetrators were executed after being sentenced by the courts. But a series of terrorist suicide attacks and a second Bali bombing again shocked the world community. The terrorism network was undoubtedly spreading nationwide, and suspicion was high that it had close connections with Al Qaeda's groups. Even 'Afghanistan alumni' and 'Syria alumni' returned to Indonesia to back up the local mujahidin in their fight to topple Indonesia's constitutional government and set up an Islamic state (Khilafah).
Supported by massive aid from friendly nations, especially Australia and the United States, Indonesia launched large-scale operations to crush terrorism, targeting various radical groups such as JAD, JAS, and JAADI. Huge amounts of energy, money, and lives were expended. Terrorism is, however, persistently entrenched. High-ranking officials from the Detachment 88 squad and military intelligence believe that terrorism remains one of the deadliest enemies of Indonesia.

Keywords: counter-radicalization, de-radicalization, Khilafah, Islamic state, Al Qaeda, ISIS

Procedia PDF Downloads 152
313 Thermoelectric Blanket for Aiding the Treatment of Cerebral Hypoxia and Other Related Conditions

Authors: Sarayu Vanga, Jorge Galeano-Cabral, Kaya Wei

Abstract:

Cerebral hypoxia refers to a condition in which there is a decrease in the oxygen supply to the brain. Patients suffering from this condition experience a decrease in body temperature. While there is currently no cure for cerebral hypoxia, certain procedures are used to aid in its treatment, and regulating body temperature is one of them. Hypoxia is well known to reduce the body temperature of mammals, although the neural origins of this response remain uncertain. To speed recovery from this condition, it is necessary to maintain a stable body temperature. In this study, we present an approach to regulating body temperature for patients who suffer from cerebral hypoxia or similar conditions. After a thorough literature study, we propose the use of thermoelectric blankets: temperature-controlled thermal blankets based on thermoelectric devices. These blankets are capable of both heating and cooling the patient to stabilize body temperature. This is possible through the reversible effect of thermoelectric devices, which can also behave as thermal sensors, making them an effective way to stabilize temperature. Thermoelectricity is the direct conversion of thermal energy to electrical energy and vice versa; this effect is known as the Seebeck effect and is characterized by the Seebeck coefficient. In such a configuration, the device has a cooling side and a heating side whose temperatures can be interchanged by simply switching the direction of the current input to the system. The design integrates various aspects, including a humidifier, a ventilation machine, IV-administered medication, air conditioning, a circulation device, and a body temperature regulation system. The proposed design includes thermocouples that trigger the blanket to raise or lower the temperature toward a set point measured by a medical temperature sensor.
Additionally, the proposed design offers an efficient way to control fluctuations in body temperature while remaining cost-friendly, with an expected cost of 150 dollars. We are currently developing a prototype of the design to collect thermal and electrical data under different conditions, and we intend to perform an optimization analysis to improve the design further. While this proposal was developed for treating cerebral hypoxia, it can also aid in the treatment of other related conditions, as fluctuations in body temperature are a common symptom of many illnesses.
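The switching behaviour described above can be sketched as a simple control rule (a hypothetical illustration; the set point, deadband, and function name are our own choices and are not specified in the study):

```python
def blanket_mode(body_temp_c, setpoint_c=37.0, deadband_c=0.3):
    """Choose the thermoelectric current direction from a sensed body temperature.

    Reversing the current direction swaps the device's heating and cooling
    sides (the reversible thermoelectric effect); a small deadband around
    the set point avoids rapid on/off switching.
    """
    if body_temp_c < setpoint_c - deadband_c:
        return "heat"   # current direction 1: warm side toward patient
    if body_temp_c > setpoint_c + deadband_c:
        return "cool"   # reversed current: cold side toward patient
    return "off"        # within the deadband: hold

print(blanket_mode(35.8))  # prints heat
print(blanket_mode(38.2))  # prints cool
```

A real controller would add sensor validation and rate limits, but the essential point is the same: one device serves both directions by reversing the current.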

Keywords: body temperature regulation, cerebral hypoxia, thermoelectric, blanket design

Procedia PDF Downloads 125
312 Chatbots and the Future of Globalization: Implications of Businesses and Consumers

Authors: Shoury Gupta

Abstract:

Chatbots are a rapidly growing technological trend that has revolutionized the way businesses interact with their customers. With the advancements in artificial intelligence, chatbots can now mimic human-like conversations and provide instant and efficient responses to customer inquiries. In this research paper, we aim to explore the implications of chatbots on the future of globalization for both businesses and consumers. The paper begins by providing an overview of the current state of chatbots in the global market and their growth potential in the future. The focus is on how chatbots have become a valuable tool for businesses looking to expand their global reach, especially in areas with high population density and language barriers. With chatbots, businesses can engage with customers in different languages and provide 24/7 customer service support, creating a more accessible and convenient customer experience. The paper then examines the impact of chatbots on cross-cultural communication and how they can help bridge communication gaps between businesses and consumers from different cultural backgrounds. Chatbots can potentially facilitate cross-cultural communication by offering real-time translations, voice recognition, and other innovative features that can help users communicate effectively across different languages and cultures. By providing more accessible and inclusive communication channels, chatbots can help businesses reach new markets and expand their customer base, making them more competitive in the global market. However, the paper also acknowledges that there are potential drawbacks associated with chatbots. For instance, chatbots may not be able to address complex customer inquiries that require human input. Additionally, chatbots may perpetuate biases if they are programmed with certain stereotypes or assumptions about different cultures. These drawbacks may have significant implications for businesses and consumers alike. 
To explore the implications of chatbots on the future of globalization in greater detail, the paper provides a thorough review of existing literature and case studies. The review covers topics such as the benefits of chatbots for businesses and consumers, the potential drawbacks of chatbots, and how businesses can mitigate any risks associated with chatbot use. The paper also discusses the ethical considerations associated with chatbot use, such as privacy concerns and the need to ensure that chatbots do not discriminate against certain groups of people. The ethical implications of chatbots are particularly important given the potential for chatbots to be used in sensitive areas such as healthcare and financial services. Overall, this research paper provides a comprehensive analysis of chatbots and their implications for the future of globalization. By exploring both the potential benefits and drawbacks of chatbot use, the paper aims to provide insights into how businesses and consumers can leverage this technology to achieve greater global reach and improve cross-cultural communication. Ultimately, the paper concludes that chatbots have the potential to be a powerful tool for businesses looking to expand their global footprint and improve their customer experience, but that care must be taken to mitigate any risks associated with their use.

Keywords: chatbots, conversational AI, globalization, businesses

Procedia PDF Downloads 69
311 Human Interaction Skills and Employability in Courses with Internships: Report of a Decade of Success in Information Technology

Authors: Filomena Lopes, Miguel Magalhaes, Carla Santos Pereira, Natercia Durao, Cristina Costa-Lobo

Abstract:

The option to implement curricular internships with undergraduate students is a pedagogical choice with good results, as perceived by academic staff, employers, and graduates in general, and in IT (Information Technology) in particular. This type of exercise has never been so relevant, as one tries to give meaning to the future in a landscape of rapid and deep change. One example is the potentially disruptive impact on jobs of advances in robotics, artificial intelligence, and 3-D printing, which is the focus of fierce debate. It is in this context that more and more students and employers pursue career-promoting responses and business development when making their training and hiring investment decisions. Three decades of experience and research in the computer science degree and the information systems technologies degree at Portucalense University, a Portuguese private university, have provided strong evidence of these advantages. The development of human interaction skills, as well as the attractiveness of such experiences for students, are core topics in the conception and management of the activities implemented in these study cycles. The objective of this paper is to gather evidence of the human interaction skills valued within the curricular internship experiences and their role in the employability of IT students. Data collection was based on a questionnaire administered to internship counselors and to students who completed internships in these undergraduate courses over the last decade.
The trainee supervisor, responsible for monitoring the performance of IT students throughout the traineeship activities, evaluates the following human interaction skills: motivation and interest in the activities developed, interpersonal relationships, cooperation in company activities, assiduity, ease of knowledge apprehension, compliance with norms, insertion in the work environment, productivity, initiative, ability to take responsibility, creativity in proposing solutions, and self-confidence. The results show that these undergraduate courses promote the development of human interaction skills and that these students, once they finish their degree, are able to take up remunerated positions, mainly by invitation of the institutions in which they performed their curricular internships. The findings of the present study help widen the analysis of the effectiveness of such internships, informing future research and actions regarding the transition from Higher Education pathways to the labour market.

Keywords: human interaction skills, employability, internships, information technology, higher education

Procedia PDF Downloads 265
310 Conflict Resolution in Fuzzy Rule Base Systems Using Temporal Modalities Inference

Authors: Nasser S. Shebka

Abstract:

Fuzzy logic is used in complex adaptive systems where classical tools of knowledge representation are unproductive. Nevertheless, the incorporation of fuzzy logic, as with all artificial intelligence tools, raises inconsistencies and limitations when dealing with increasingly complex systems and rules that apply to real-life situations; this hinders the inference process of such systems and creates inconsistencies between the inferences generated by the fuzzy rules of complex or imprecise knowledge-based systems. Fuzzy logic enhances the capability of knowledge representation in applications that require fuzzy truth values or similar multi-valued constant parameters derived from multi-valued logic. These set the basis for the three basic t-norms and their derived connectives, which are continuous functions; any other continuous t-norm can be described as an ordinal sum of these three basic ones. Some attempts to solve this dilemma alter fuzzy logic by means of non-monotonic logic, which is used for the defeasible inference of expert-system reasoning, for example, to allow inference retraction upon additional data. However, even the introduction of non-monotonic fuzzy reasoning faces the major issue of conflict resolution, for which many principles have been introduced, such as the specificity principle and the weakest-link principle. The aim of our work is to improve the logical representation and functional modelling of AI systems by presenting a method for resolving existing and potential rule conflicts through the representation of temporal modalities within defeasible inference rule-based systems.
Our paper investigates the possibility of resolving fuzzy rule conflicts in a non-monotonic fuzzy reasoning system by introducing temporal modalities and Kripke's general weak modal logic operators, expanding its knowledge representation capabilities through flexibility in classifying newly generated rules and, hence, resolving potential conflicts between these fuzzy rules. We address this problem by restructuring the inference process of the fuzzy rule-based system. This is achieved by using time-branching temporal logic in combination with restricted first-order logic quantifiers, as well as propositional logic, to represent classical temporal modality operators. The resulting findings not only enhance the flexibility of the inference process in complex rule-based systems but also contribute to the fundamental methods of building rule bases in a manner that allows for a wider range of applicable real-life situations from both a quantitative and a qualitative knowledge representation perspective.
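One way to picture temporal conflict resolution between defeasible fuzzy rules is to attach a time index to each generated rule and let the more recent rule defeat the older one (a minimal toy sketch under our own assumptions; the paper's actual mechanism uses branching-time modal operators, which this simple linear ordering does not capture):

```python
from dataclasses import dataclass

@dataclass
class FuzzyRule:
    conclusion: str   # e.g. "valve=open" -- variable and asserted value
    truth: float      # fuzzy truth degree in [0, 1]
    time_index: int   # inference step at which the rule was generated

def resolve(rules):
    """Among rules with conflicting conclusions on the same variable,
    prefer the most recently generated one; break ties by truth degree."""
    by_var = {}
    for r in rules:
        var = r.conclusion.split("=")[0]
        cur = by_var.get(var)
        if cur is None or (r.time_index, r.truth) > (cur.time_index, cur.truth):
            by_var[var] = r
    return {v: r.conclusion for v, r in by_var.items()}

old = FuzzyRule("valve=open", 0.8, time_index=1)
new = FuzzyRule("valve=closed", 0.6, time_index=2)
print(resolve([old, new]))  # the newer rule wins despite a lower truth degree
```

This is essentially inference retraction upon additional data: the later conclusion supersedes the earlier one rather than both being asserted inconsistently.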

Keywords: fuzzy rule-based systems, fuzzy tense inference, intelligent systems, temporal modalities

Procedia PDF Downloads 65
309 Forensic Investigation: The Impact of Biometric-Based Solution in Combatting Mobile Fraud

Authors: Mokopane Charles Marakalala

Abstract:

Research shows that mobile fraud grew exponentially in South Africa during the lockdown caused by the COVID-19 pandemic. According to the South African Banking Risk Information Centre (SABRIC), fraudulent online banking and transactions produced a sharp increase in cybercrime since the beginning of the lockdown, resulting in huge losses to the banking industry in South Africa. While the Financial Intelligence Centre Act, 38 of 2001, regulates financial transactions, it is evident that criminals are using technology to their advantage. Money laundering ranks among the major crimes, not only in South Africa but worldwide. This paper focuses on the impact of biometric-based solutions in combatting mobile fraud reported to SABRIC. Successful mobile fraud poses a serious challenge: cybercriminals can hijack a mobile device and use it to gain access to sensitive personal data and accounts. Cybercriminals constantly trawl the depths of cyberspace in search of victims to attack. Millions of people worldwide use online banking to carry out their regular bank-related transactions quickly and conveniently. SABRIC has regularly highlighted incidents of mobile fraud, corruption, and maladministration, and customers who fail to secure their online banking are vulnerable to falling prey to scams such as mobile fraud. Criminals have made use of digital platforms ever since the development of the technology. In 2017, 13,438 incidents involving banking apps, internet banking, and mobile banking caused the sector gross losses of more than R250,000,000. The affected parties are left pointing fingers at one another while the fraudster makes off with the money. Non-probability sampling (purposive sampling) was used to select the participants, who took part in telephone calls and virtual interviews.
The results indicate that there is a relationship between remote online banking and the increase in money laundering, as the system allows transactions to take place with limited verification processes. This paper highlights the significance of developing prevention mechanisms, capacity development, and strategies for both financial institutions and law enforcement agencies in South Africa to reduce crimes such as money laundering. The researcher recommends that awareness strategies for bank staff be strengthened through the provision of requisite and adequate training.

Keywords: biometric-based solution, investigation, cybercrime, forensic investigation, fraud, combatting

Procedia PDF Downloads 71
308 The Impact of Formulate and Implementation Strategy for an Organization to Better Financial Consequences in Malaysian Private Hospital

Authors: Naser Zouri

Abstract:

Purpose: Measures of strategy formulation and implementation reflect market-based strategic management categories such as courtesy, competence, and compliance, which build high loyalty within the financial ecosystem and address marketplace errors in line with fair trade principles. Findings: The findings show the ability of executive-level management to motivate and to make better decisions when addressing problems in a business organization, establishing an ideal level for each intervention policy for a hypothetical household. Methodology/design: A questionnaire was used for data collection in both the pilot test and the main study, and was distributed among finance employees. Respondents were selected through non-probability sampling, specifically convenience sampling. The cost of administering the questionnaire among respondents was manageable, but collecting data from the hospital proved very difficult. The survey items covered strategy implementation, environment, supply chain, and employees (capturing the impact of implementation strategy on financial consequences), as well as strategy formulation, comprehensiveness of strategic design, and organizational performance (capturing the impact of strategy formulation on financial consequences). Practical implications: The dynamic capability approach to strategy formulation and implementation focuses on the firm-specific processes through which firms integrate, build, or reconfigure valuable resources, making a theoretical contribution. Originality/value: Going beyond the current discussion, we show that case studies have the potential to extend and refine theory.
We shed new light on how dynamic capability research can benefit from case studies by discovering the qualifications that shape the development of capabilities and determining the boundary conditions of the dynamic capabilities approach. Limitations of the study: The present study relies on survey methodology for data collection, and financial employees may have found it difficult to respond to the questions because of workplace constraints.

Keywords: financial ecosystem, loyalty, Malaysian market error, dynamic capability approach, rate-market, optimization intelligence strategy, courtesy, competence, compliance

Procedia PDF Downloads 274
307 Comparative Analysis of Fused Deposition Modeling and Binding-Jet 3D Printing Technologies

Authors: Mohd Javaid, Shahbaz Khan, Abid Haleem

Abstract:

Purpose: A large number of 3D printing technologies are now available for sophisticated applications in different fields, and additive manufacturing has established its dominance in the design, development, and customisation of products. In this era of developing technologies, there is a need to identify the appropriate technology for each application. To fulfil this need, two widely used printing technologies, Fused Deposition Modeling (FDM) and Binding-Jet 3D printing, are compared for effective utilisation in different current applications. Methodology: A systematic literature review was conducted for both technologies, covering their applications and the associated enabling factors. An appropriate multi-criteria decision-making (MCDM) tool is used to compare critical factors for both technologies. Findings: Both technologies have the potential and capabilities to provide better direction to the industry. Additionally, this paper helps in developing a decision support system for the proper selection of a technology according to its continuum of applications and associated research and development capability. The vital issue is raw materials: research-based material development is key to the sustainability of the developed technologies. FDM is a low-cost technology that provides higher-strength products compared with Binding-Jet technology. Researchers and companies can use this study to achieve the required applications with fewer resources. Limitations: The comparison relies on the opinions of experts, which may not always be free from bias, and each technology has its own limitations. Originality: Comparing these technologies helps identify the best-suited technology for the customer's requirements. It also supports development in the different fields where these technologies, given their extensive capabilities, can be successfully adopted.
Conclusion: FDM and Binding-Jet technology play an active role in industrial development, assisting in the customisation and cost-effective production of personalised parts. There is therefore a need to understand how these technologies can deliver such developments rapidly. They enable easy changes and revised versions of a product, which are not easily possible in a conventional manufacturing system. High machine cost, the requirement for skilled human resources, low surface finish, limited mechanical strength of the product, and restricted material-changing options are the main limitations, although these vary from technology to technology. In the future, these technologies are expected to become commercially viable for efficient use in the direct manufacturing of varied parts.
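The abstract does not name the MCDM tool used, but the simplest member of that family, a weighted-sum model, illustrates how such a comparison can be scored (the criteria weights and 0-to-1 scores below are hypothetical placeholders, not the study's data):

```python
# Hypothetical weighted-sum MCDM scoring of the two printing technologies.
WEIGHTS = {"cost": 0.40, "strength": 0.35, "surface_finish": 0.25}

SCORES = {  # 0 (worst) .. 1 (best) per criterion -- illustrative only
    "FDM":         {"cost": 0.9, "strength": 0.8, "surface_finish": 0.5},
    "Binding-Jet": {"cost": 0.5, "strength": 0.5, "surface_finish": 0.8},
}

def weighted_score(tech):
    """Aggregate a technology's criterion scores with the chosen weights."""
    return sum(WEIGHTS[c] * SCORES[tech][c] for c in WEIGHTS)

for tech in SCORES:
    print(tech, round(weighted_score(tech), 3))
```

With these placeholder inputs, FDM ranks higher because cost and strength carry most of the weight; shifting weight toward surface finish would favour Binding-Jet, which is exactly the application-dependence the paper argues for.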

Keywords: 3D printing, comparison, fused deposition modeling, FDM, binding jet technology

Procedia PDF Downloads 86
306 Utility of Thromboelastography to Reduce Coagulation-Related Mortality and Blood Component Rate in Neurosurgery ICU

Authors: Renu Saini, Deepak Agrawal

Abstract:

Background: Patients with head and spinal cord injuries frequently have deranged coagulation profiles and require blood product transfusions perioperatively. Thromboelastography (TEG) is a ‘bedside’ global test of coagulation which may have a role in deciding the need for transfusion in such patients. Aim: To assess the usefulness of TEG in a department of neurosurgery in decreasing transfusion rates and coagulation-related mortality in traumatic head and spinal cord injury. Methods: A retrospective comparative study was carried out in the department of neurosurgery over a period of one year, with two groups. The ‘control’ group comprises patients whose data were collected over the 6 months (1/6/2009-31/12/2009) prior to the installation of the TEG machine; the ‘test’ group comprises patients whose data were collected over the 6 months (1/1/2013-30/6/2013) after TEG installation. The total numbers of platelet, FFP, and cryoprecipitate transfusions were noted in both groups, along with in-hospital mortality and length of stay. Results: The groups were matched in patient age and sex, number of head and spinal cord injury cases, number of patients with thrombocytopenia, and number of patients who underwent operation. A total of 178 patients (135 head injury and 43 spinal cord injury patients) were admitted to the neurosurgery department between June 2009 and December 2009, i.e., prior to TEG installation; after TEG installation, a total of 243 patients (197 head injury and 46 spinal cord injury patients) were admitted. After the introduction of TEG, platelet transfusion was significantly reduced compared to the control group (from 67 to 34 units, p = 0.000). The mortality rate was also significantly reduced after installation (from 77 to 57 patients, p = 0.000), as was length of stay (range 1-211 days before installation versus 1-115 days after, p = 0.02).
Conclusion: Bedside TEG can dramatically reduce platelet transfusion requirements in a department of neurosurgery. TEG also led to a marked decrease in mortality rate and length of stay in patients with traumatic head and spinal cord injuries. We recommend its use as a standard of care in patients with traumatic head and spinal cord injuries.
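The reported counts can be expressed as crude rates for easier comparison (a back-of-the-envelope sketch using only the numbers in the abstract; it is not part of the study's statistical analysis):

```python
def pct(part, whole):
    """Crude percentage, rounded to one decimal place."""
    return round(part / whole * 100, 1)

# Platelet transfusions fell from 67 to 34 units
platelet_reduction = pct(67 - 34, 67)   # ~49.3% relative reduction
# In-hospital deaths: 77 of 178 patients before TEG, 57 of 243 after
mortality_before = pct(77, 178)         # ~43.3%
mortality_after = pct(57, 243)          # ~23.5%
print(platelet_reduction, mortality_before, mortality_after)
```

Note that the denominators differ between periods (178 versus 243 admissions), which is why the absolute death counts alone understate the change in the crude mortality rate.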

Keywords: blood component transfusion, mortality, neurosurgery ICU, thromboelastography

Procedia PDF Downloads 305
305 Digital Immunity System for Healthcare Data Security

Authors: Nihar Bheda

Abstract:

Protecting digital assets such as networks, systems, and data from advanced cyber threats is the aim of Digital Immunity Systems (DIS), which are a subset of cybersecurity. With features like continuous monitoring, coordinated reactions, and long-term adaptation, DIS seeks to mimic biological immunity. This minimizes downtime by automatically identifying and eliminating threats. Traditional security measures, such as firewalls and antivirus software, are insufficient for enterprises, such as healthcare providers, given the rapid evolution of cyber threats. The number of medical record breaches that have occurred in recent years is proof that attackers are finding healthcare data to be an increasingly valuable target. However, obstacles to enhancing security include outdated systems, financial limitations, and a lack of knowledge. DIS is an advancement in cyber defenses designed specifically for healthcare settings. Protection akin to an "immune system" is produced by core capabilities such as anomaly detection, access controls, and policy enforcement. Coordination of responses across IT infrastructure to contain attacks is made possible by automation and orchestration. Massive amounts of data are analyzed by AI and machine learning to find new threats. After an incident, self-healing enables services to resume quickly. The implementation of DIS is consistent with the healthcare industry's urgent requirement for resilient data security in light of evolving risks and strict guidelines. With resilient systems, it can help organizations lower business risk, minimize the effects of breaches, and preserve patient care continuity. DIS will be essential for protecting a variety of environments, including cloud computing and the Internet of medical devices, as healthcare providers quickly adopt new technologies. DIS lowers traditional security overhead for IT departments and offers automated protection, even though it requires an initial investment. 
In the near future, DIS may prove to be essential for small clinics, blood banks, imaging centers, large hospitals, and other healthcare organizations. Cyber resilience can become attainable for the whole healthcare ecosystem with customized DIS implementations.
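Anomaly detection, one of the core DIS capabilities mentioned above, can be illustrated with a minimal z-score detector over access counts (a toy sketch with invented data and thresholds; a production DIS would apply far richer machine learning models to far more signals):

```python
import statistics

def is_anomalous(value, baseline, threshold=3.0):
    """Flag a new observation whose z-score against a known-normal baseline
    exceeds the threshold -- a toy stand-in for the ML-driven anomaly
    detection a digital immunity system would perform continuously."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return abs(value - mean) / stdev > threshold

# Hypothetical hourly record-access counts for one clinician account
normal_hours = [5, 6, 5, 7, 6, 5, 6]
print(is_anomalous(6, normal_hours))    # False: typical activity
print(is_anomalous(50, normal_hours))   # True: possible bulk-access attempt
```

In a DIS, a positive detection would feed the orchestration layer, which could automatically revoke the session or quarantine the account, mirroring the "immune response" metaphor.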

Keywords: digital immunity system, cybersecurity, healthcare data, emerging technology

Procedia PDF Downloads 38
304 Internet-Of-Things and Ergonomics, Increasing Productivity and Reducing Waste: A Case Study

Authors: V. Jaime Contreras, S. Iliana Nunez, S. Mario Sanchez

Abstract:

Inside a manufacturing facility, we can find innumerable automatic and manual operations, all of which are relevant to the production process. Some of these processes add more value to the products than others. Manual operations tend to add value to the product since they are found in the final assembly area or the final operations of the process; in these areas, a mistake or accident can increase the cost of waste exponentially. To reduce or mitigate these costly mistakes, one approach is to rely on automation to eliminate the operator from the production line, which requires a hefty investment and the development of specialized machinery. In our approach, the center of the solution is the operator, supported through sufficient and adequate instrumentation, real-time reporting, and ergonomics. Efficiency and reduced cycle time can be achieved through the integration of Internet-of-Things (IoT) ready technologies into assembly operations to enhance the ergonomics of the workstations. Augmented reality visual aids, RFID-triggered personalized workstation dimensions, and real-time data transfer and reporting can help achieve these goals. In this case study, a standard work cell will be used for real-life data acquisition, and simulation software will extend the data points beyond the test cycle. Three comparison scenarios will run in the work cell. Each scenario will introduce one dimension of the ergonomics to measure its impact independently. Furthermore, the separate tests will determine the limitations of the technology and provide a reference for operating costs and the investment required. With the ability to monitor costs, productivity, cycle time, and scrap/waste in real time, the ROI (return on investment) can be determined at the different levels of integration. This case study will help to show that ergonomics in the assembly lines can make a significant impact when IoT technologies are introduced.
Ergonomics can effectively reduce waste and increase productivity with minimal investment compared with setting up a custom machine.
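The real-time ROI determination described in the abstract can be sketched as a simple calculation over the monitored quantities. All dollar figures below are hypothetical placeholders, not values from the case study.

```python
def roi_percent(annual_gain, investment):
    """Return on investment as a percentage: (gain - cost) / cost * 100."""
    return (annual_gain - investment) / investment * 100.0

# Hypothetical annual figures for one IoT-upgraded workstation (USD):
savings_from_waste = 12_000  # scrap/waste reduction monitored in real time
savings_from_cycle = 18_000  # productivity gain from shorter cycle time
iot_investment     = 20_000  # RFID tags, AR visual aids, sensors, integration

annual_gain = savings_from_waste + savings_from_cycle
print(roi_percent(annual_gain, iot_investment))  # 50.0
```

Because the IoT instrumentation streams cost, cycle-time, and scrap data continuously, the same calculation can be rerun at each level of integration, as the abstract proposes.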

Keywords: augmented reality visual aids, ergonomics, real-time data acquisition and reporting, RFID triggered workstation dimensions

Procedia PDF Downloads 190
303 Case-Based Reasoning Application to Predict Geological Features at Site C Dam Construction Project

Authors: Shahnam Behnam Malekzadeh, Ian Kerr, Tyson Kaempffer, Teague Harper, Andrew Watson

Abstract:

The Site C Hydroelectric dam is currently being constructed in north-eastern British Columbia on sub-horizontal sedimentary strata that dip approximately 15 meters from one bank of the Peace River to the other. More than 615 pressure sensors (Vibrating Wire Piezometers) have been installed on bedding planes (BPs) since construction began, with over 80 more planned before project completion. These pressure measurements are essential to monitor the stability of the rock foundation during and after construction and for dam safety purposes. BPs are identified by their clay gouge infilling, which varies in thickness from less than 1 to 20 mm and can be challenging to identify, as the core drilling process often disturbs or washes away the gouge material. Without the use of depth predictions from nearby boreholes, stratigraphic markers, and downhole geophysical data, it is difficult to confidently identify BP targets for the sensors. In this paper, a Case-Based Reasoning (CBR) method was used to develop an empirical model called the Bedding Plane Elevation Prediction (BPEP) to help geologists and geotechnical engineers predict geological features and bedding planes at new locations in a fast and accurate manner. To develop the CBR model, a database was built from 64 pressure sensors already installed on key bedding planes BP25, BP28, and BP31 on the Right Bank, including bedding plane elevations and coordinates. Thirteen (20%) of the most recent cases were selected to validate and evaluate the accuracy of the developed model, while similarity was defined as the distance between previous cases and recent cases to predict the depth of significant BPs. The average difference between actual BP elevations and predicted elevations for the above BPs was ±55 cm; 69% of predicted elevations were within ±79 cm of actual BP elevations, and 100% of predicted elevations for new cases were within ±99 cm.
Eventually, the actual results will be used to expand the database and improve BPEP so that it performs as a learning machine, predicting more accurate BP elevations for future sensor installations.
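The distance-based similarity at the heart of the BPEP model can be sketched as a nearest-case prediction. The inverse-distance weighting, the choice of k, and the coordinates below are illustrative assumptions, not the project's actual formulation or data.

```python
import math

def predict_elevation(cases, x, y, k=3):
    """Case-based prediction: rank past cases by horizontal distance to the
    new location (x, y), keep the k most similar, and return their
    inverse-distance-weighted mean bedding-plane elevation."""
    ranked = sorted(cases, key=lambda c: math.hypot(c[0] - x, c[1] - y))
    nearest = ranked[:k]
    weights = [1.0 / (math.hypot(cx - x, cy - y) + 1e-9) for cx, cy, _ in nearest]
    return sum(w * z for w, (_, _, z) in zip(weights, nearest)) / sum(weights)

# Hypothetical cases: (easting, northing, BP elevation in metres)
cases = [(0.0, 0.0, 402.0), (10.0, 0.0, 401.5),
         (0.0, 10.0, 402.5), (50.0, 50.0, 398.0)]
print(round(predict_elevation(cases, 2.0, 2.0), 2))  # ≈ 402.0
```

As new sensors are installed, their measured elevations would simply be appended to `cases`, which is the "learning machine" behaviour the abstract anticipates.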

Keywords: case-based reasoning, geological feature, geology, piezometer, pressure sensor, core logging, dam construction

Procedia PDF Downloads 57
302 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on encoding schemes (e.g., Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g., SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNN) carry richer information but lack geometric invariance. For scene classification, a scene contains scattered objects of different sizes, categories, layouts, and numbers. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained on the ImageNet and Places datasets, respectively, are used as pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in different scale ranges. A scale-wise CNN adaptation is reasonable since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, which is then merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences. Hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted at each scale. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories.
Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which shows that the representation can be applied to other visual recognition tasks.
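The scale-wise normalization step can be sketched as follows, assuming one Fisher vector per scale has already been computed. The signed-square-root (power) normalization before L2 normalization is a common Fisher Vector convention and is an assumption here; the vectors themselves are hypothetical.

```python
import math

def scale_wise_merge(fisher_vectors):
    """Normalize each scale's Fisher vector independently (signed square
    root + L2 normalization), then average-pool across scales so that no
    scale dominates merely because more local features were extracted at it."""
    normalized = []
    for fv in fisher_vectors:
        fv = [math.copysign(math.sqrt(abs(v)), v) for v in fv]   # power norm
        norm = math.sqrt(sum(v * v for v in fv)) or 1.0
        normalized.append([v / norm for v in fv])                # L2 norm
    dim = len(normalized[0])
    return [sum(fv[i] for fv in normalized) / len(normalized) for i in range(dim)]

# Two hypothetical per-scale Fisher vectors of dimension 4:
fvs = [[4.0, -1.0, 0.0, 9.0], [1.0, 1.0, 1.0, 1.0]]
merged = scale_wise_merge(fvs)
print([round(v, 3) for v in merged])
```

Because each per-scale vector is unit-normalized before pooling, the merged representation weights every scale equally, which is the balancing effect the abstract describes.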

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 305
301 Interdisciplinary Evaluations of Children with Autism Spectrum Disorder in a Telehealth Arena

Authors: Janice Keener, Christine Houlihan

Abstract:

Over the last several years, there has been an increase in children identified as having Autism Spectrum Disorder (ASD). Specialists across several disciplines, both mental health and medical professionals, have been tasked with ensuring accurate and timely evaluations for children with suspected ASD. Due to the nature of the ASD symptom presentation, an interdisciplinary assessment and treatment approach best addresses the needs of the whole child. During the unprecedented COVID-19 pandemic, clinicians were faced with how to continue interdisciplinary assessments in a telehealth arena. Instruments that were previously used to assess ASD in person were no longer appropriate measures to use due to the safety restrictions. For example, the Autism Diagnostic Observation Schedule requires examiners and children to be in very close proximity to each other, and if masks or face shields are worn, they render the evaluation invalid. Similar issues arose with the various cognitive measures that are used to assess children, such as the Wechsler intelligence tests and the Differential Ability Scales. Thus, the need arose to identify measures that can be safely and accurately administered under safety guidelines. The incidence of ASD continues to rise over time. Currently, the Centers for Disease Control and Prevention estimates that 1 in 59 children meets the criteria for a diagnosis of ASD. The reasons for this increase are likely manifold, including changes in diagnostic criteria, public awareness of the condition, and other environmental and genetic factors. The rise in the incidence of ASD has led to a greater need for diagnostic and treatment services across the United States. The uncertainty of the diagnostic process can lead to an increased level of stress for families of children with suspected ASD. Along with this increase, there is a need for diagnostic clarity to avoid both under- and over-identification of this condition.
Interdisciplinary assessment is ideal for children with suspected ASD, as it allows for an assessment of the whole child over the course of time and across multiple settings. Clinicians such as psychologists and developmental pediatricians play important roles in the initial evaluation of autism spectrum disorder. An ASD assessment may consist of several types of measures; standardized checklists, structured interviews, and direct assessments such as the ADOS-2 are just a few examples. With the advent of telehealth, clinicians were asked to continue to provide meaningful interdisciplinary assessments via an electronic platform and, in a sense, to go into the family home, evaluate the clinical symptom presentation remotely, and confidently make an accurate diagnosis. This poster presentation will review the benefits, limitations, and interpretation of these various instruments. The role of other medical professionals will also be addressed, including medical providers, speech pathology, and occupational therapy.

Keywords: autism spectrum disorder assessments, interdisciplinary evaluations, tele-assessment with autism spectrum disorder, diagnosis of autism spectrum disorder

Procedia PDF Downloads 184
300 Effect of Non-metallic Inclusion from the Continuous Casting Process on the Multi-Stage Forging Process and the Tensile Strength of the Bolt: Case Study

Authors: Tomasz Dubiel, Tadeusz Balawender, Miroslaw Osetek

Abstract:

The paper presents the influence of non-metallic inclusions on the multi-stage forging process and the mechanical properties of the dodecagon socket bolt used in the automotive industry. The detected metallurgical defect was so large that it directly influenced the mechanical properties of the bolt and resulted in failure to meet the requirements of the mechanical property class. In order to assess the defect, an X-ray examination and a metallographic examination of the defective bolt were performed, revealing an exogenous non-metallic inclusion. The size of the defect on the cross-section was 0.531 mm in width and 1.523 mm in length; the defect was continuous along the entire axis of the bolt. In the analysis, an FEM simulation of the multi-stage forging process was designed, taking into account a non-metallic inclusion parallel to the sample axis, reflecting the studied case. The process of defect propagation due to material upset in the head area was analyzed. The final forging stage, which shapes the dodecagonal socket and fills the flange area, was studied in particular. The defect was observed to significantly reduce the effective cross-section as it expanded perpendicular to the axis of the bolt. The mechanical properties of products with and without the defect were analyzed. In the first step, the hardness test confirmed that the required value for mechanical property class 8.8 was obtained for both bolt types. In the second step, the bolts were subjected to a static tensile test. The bolts without the defect gave a positive result, while all 10 bolts with the defect gave a negative result, achieving a tensile strength below the requirements. The tensile strength tests were confirmed by metallographic tests and an FEM simulation with perpendicular inclusion spread in the area of the head. The bolts failed directly under the bolt head, which is inconsistent with the requirements of ISO 898-1.
It has been shown that non-metallic inclusions oriented along the axis of the bolt can directly cause a loss of functionality, and such defects should be detected even before the bolt is assembled into a machine element.

Keywords: continuous casting, multi-stage forging, non-metallic inclusion, upset bolt head

Procedia PDF Downloads 136
299 Development of a Real-Time Simulink Based Robotic System to Study Force Feedback Mechanism during Instrument-Object Interaction

Authors: Jaydip M. Desai, Antonio Valdevit, Arthur Ritter

Abstract:

Robotic surgery is used to enhance minimally invasive surgical procedures. It provides a greater degree of freedom for surgical tools but lacks a haptic feedback system to provide a sense of touch to the surgeon. Surgical robots work on master-slave operation, where the user is the master and the robotic arms are the slaves. Current surgical robots provide precise control of the surgical tools but rely heavily on visual feedback, which sometimes causes damage to the inner organs. The goal of this research was to design and develop a real-time Simulink-based robotic system to study the force feedback mechanism during instrument-object interaction. The setup includes three Velmex XSlide assemblies (XYZ stage) for three-dimensional movement, an end-effector assembly for forceps, an electronic circuit for four strain gages, two Novint Falcon 3D gaming controllers, a microcontroller board with linear actuators, and MATLAB and Simulink toolboxes. The strain gages were calibrated using an Imada digital force gauge and tested with a hard-core wire to measure instrument-object interaction in the range of 0-35 N. The designed Simulink model successfully acquires 3D coordinates from the two Novint Falcon controllers and transfers the coordinates to the XYZ stage and forceps. The Simulink model also reads the strain gage signals in real time through the 10-bit analog-to-digital converter of the microcontroller assembly, converts voltage into force, and feeds the output signals back to the Novint Falcon controllers for the force feedback mechanism. The experimental setup allows the user to change forward kinematics algorithms to achieve the best-desired movement of the XYZ stage and forceps. This project combines haptic technology with a surgical robot to provide a sense of touch to the user controlling the forceps through a machine-computer interface.
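The voltage-to-force conversion stage in the Simulink model can be sketched as below. The 10-bit ADC resolution and the 0-35 N calibrated range come from the abstract; the reference voltage and the linear calibration gain are illustrative assumptions, not the study's actual coefficients.

```python
ADC_BITS = 10   # microcontroller ADC resolution cited in the setup
V_REF = 5.0     # assumed ADC reference voltage (hypothetical)

def adc_to_force(adc_counts, gain=7.0, offset=0.0):
    """Convert a raw 10-bit ADC reading to force in newtons, assuming a
    linear strain-gage calibration: force = gain * voltage + offset.
    With these assumed constants, full scale (1023 counts) maps to 35 N."""
    voltage = adc_counts / (2**ADC_BITS - 1) * V_REF
    return gain * voltage + offset

print(adc_to_force(0))     # 0.0  -> no load on the forceps
print(adc_to_force(1023))  # 35.0 -> full-scale reading of the 0-35 N range
```

In the real system this value would be written back to the Novint Falcon controllers each cycle to close the force feedback loop; the sketch shows only the conversion arithmetic.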

Keywords: surgical robot, haptic feedback, MATLAB, strain gage, simulink

Procedia PDF Downloads 512
298 New Gas Geothermometers for the Prediction of Subsurface Geothermal Temperatures: An Optimized Application of Artificial Neural Networks and Geochemometric Analysis

Authors: Edgar Santoyo, Daniel Perez-Zarate, Agustin Acevedo, Lorena Diaz-Gonzalez, Mirna Guevara

Abstract:

Four new gas geothermometers have been derived from a multivariate geochemometric analysis of a geothermal fluid chemistry database; two use the natural logarithm of the CO₂ and H₂S concentrations (mmol/mol), respectively, and the other two use the natural logarithm of the H₂S/H₂ and CO₂/H₂ ratios. As a strict compilation criterion, the database was created with the gas-phase composition of fluids and bottomhole temperatures (BHTM) measured in producing wells. The calibration of the geothermometers was based on the geochemical relationship existing between the gas-phase composition of well discharges and the equilibrium temperatures measured at bottomhole conditions. Multivariate statistical analysis together with the use of artificial neural networks (ANN) was successfully applied to correlate the gas-phase compositions and the BHTM. The predicted or simulated bottomhole temperatures (BHTANN), defined as output neurons or simulation targets, were statistically compared with the measured temperatures (BHTM). The coefficients of the new geothermometers were obtained from an optimized self-adjusting training algorithm applied to approximately 2,080 ANN architectures with 15,000 simulation iterations each. The self-adjusting training algorithm used the well-known Levenberg-Marquardt model to determine: (i) the number of neurons in the hidden layer; (ii) the training factor and the training patterns of the ANN; (iii) the linear correlation coefficient, R; (iv) the synaptic weighting coefficients; and (v) the statistical parameter, root mean squared error (RMSE), used to evaluate the prediction performance between the BHTM and the simulated BHTANN. The prediction performance of the new gas geothermometers, together with the predictions inferred from sixteen well-known, previously developed gas geothermometers, was statistically evaluated by using an external database to avoid a bias problem.
Statistical evaluation was performed through the analysis of the lowest RMSE values computed among the predictions of all the gas geothermometers. The new gas geothermometers developed in this work have been successfully used for predicting subsurface temperatures in high-temperature geothermal systems of Mexico (e.g., Los Azufres, Mich.; Los Humeros, Pue.; and Cerro Prieto, B.C.) as well as in a blind geothermal system (known as Acoculco, Puebla). The latest results of the gas geothermometers (inferred from gas-phase compositions of soil-gas bubble emissions) compare well with the temperatures measured in two wells of the blind geothermal system of Acoculco, Puebla (México). Details of this new development are outlined in the present research work. Acknowledgements: The authors acknowledge the funding received from the CeMIE-Geo P09 project (SENER-CONACyT).
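The RMSE criterion used above to rank the geothermometers can be sketched as follows; the bottomhole temperature values are illustrative placeholders, not data from the study.

```python
import math

def rmse(measured, predicted):
    """Root mean squared error between measured bottomhole temperatures
    (BHTM) and ANN-simulated temperatures (BHTANN): the geothermometer
    with the lowest RMSE on an external database performs best."""
    return math.sqrt(
        sum((m - p) ** 2 for m, p in zip(measured, predicted)) / len(measured)
    )

# Hypothetical bottomhole temperatures in degrees Celsius:
bht_measured  = [250.0, 280.0, 300.0, 310.0]
bht_simulated = [248.0, 284.0, 297.0, 312.0]
print(round(rmse(bht_measured, bht_simulated), 2))  # 2.87
```

Evaluating each candidate geothermometer with this function on a held-out (external) database, as the abstract describes, avoids the bias of scoring a model on its own calibration data.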

Keywords: artificial intelligence, gas geochemistry, geochemometrics, geothermal energy

Procedia PDF Downloads 314