Search results for: dynamic algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6993

1083 Psychopathy Evaluation for People with Intellectual Disability Living in Institute Using Chinese Version of the Psychopathology Inventory

Authors: Lin Fu-Gong

Abstract:

Background: As the WHO has announced, people with intellectual disability (ID) are vulnerable to mental health problems, yet few custom-made mental health scales exist for monitoring their mental health. People with ID who develop mental problems often have a worse prognosis and usually become a heavier burden on their caregivers. Purpose: In this study, we intended to develop a psychopathy scale as a practical tool for monitoring the mental health of people with ID living in institutions. Methods: With Dr. Matson's prior agreement, we adopted the Psychopathology Inventory for Mentally Retarded Adults, developed by Professor Matson and with certified reliability and validity in Western countries. Over the past year, we first translated the inventory into a validated Chinese version that takes the domestic cultural background into account. We then evaluated the validity and reliability of the inventory for assessing mental health status among people with ID living in an institution. Results: The inventory includes eight psychiatric disorder scales: schizophrenic, affective, psychosexual, adjustment, anxiety, somatoform, and personality disorders, and inappropriate mental adjustment. Around 83% of the 40 people investigated, randomly selected from the institution, were found by two evaluators to have at least one disorder and were recommended for medical help. Among the residents examined, somatoform disorder and inappropriate mental adjustment were the most common, affecting 60% and 78% of the people, respectively. Conclusion: The results showed that the prevalence of psychiatric disorders is relatively high among people with ID in institutions and that their mental problems need further care and follow-up. The psychopathology inventory proved a useful tool for institution caregivers and managers and for the government's long-term care policy. In the coming stage, we plan to extend the use of the validated Chinese version of the inventory to more types of institutions for people with ID, in order to establish their dynamic mental health status, including medical needs, relapse, and rehabilitation, and thereby promote their mental health.

Keywords: intellectual disability, psychiatric disorder, psychopathology inventory, mental health, the institute

Procedia PDF Downloads 236
1082 Battle of Narratives: Georgia between Dialogue and Confrontation

Authors: Ketevan Epadze

Abstract:

The paper examines the conflicting historical narratives proposed by Georgian and Abkhazian scholars in the 1950s on the territorial affiliation of Abkhazia, explains how these narratives were connected to Soviet nationalities policy after WWII, and demonstrates the dynamics of the battle of narratives in the last years of the Soviet system, which was followed by military conflict in the post-Soviet era. Abkhazia, a breakaway region of Georgia, declared its independence in 1992. The historical dispute over territorial rights to Abkhazia emerged long before the military conflict began and was connected to the theory of Abkhazian ethnogenesis put forward by the Georgian literary scholar Pavle Ingorokva. He argued that medieval Abkhazians were Georgians, while modern Abkhazians are newcomers to Abkhazia. After de-Stalinization, Abkhazian historians developed a historical narrative opposed to Ingorokva's theory. In the 1980s, Georgian dissidents who strove for Georgia's independence used Ingorokva's thesis to oppose the Abkhazians' desire for self-determination and sovereignty. Abkhazian political actors, in their turn, employed the opposite historical arguments to legitimate their rights to autonomy. Ingorokva's theory is one of the principal issues discussed during the Georgian-Abkhazian dialogue; it often confuses Georgians and gives Abkhazians reasons to complain about Georgian discrimination in the Soviet past. The study is based on several kinds of sources: archival materials of the 1950s (the Communist Party Archive of Georgia and the Soviet journal 'Mnatobi'), Pavle Ingorokva's book 'Giorgi Merchule' (1947-1954) and Zurab Anchabadze's response to it, 'From the Medieval History of Abkhazia' (1956-1959), political speeches of Georgian and Abkhazian political actors in the 1980s, and secondary sources on Soviet nationalities policy from the 1950s to the 1990s.

Keywords: Soviet, history, ethnicity, nationalism, politics, post-Soviet, conflict

Procedia PDF Downloads 136
1081 Drug-Based Nanoparticles: Comparative Study of the Effect Drug Type on Release Kinetics and Cell Viability

Authors: Chukwudalu C. Nwazojie, Wole W. Soboyejo, John Obayemi, Ali Salifu Azeko, Sandra M. Jusu, Chinyerem M. Onyekanne

Abstract:

The conventional methods for the diagnosis and treatment of breast cancer include systematic mammography, ultrasound, dynamic contrast-enhanced fast 3D gradient-echo (GRE) magnetic resonance imaging (MRI), surgery, chemotherapy, and radiotherapy. However, nanoparticles and drug-loaded polymer microspheres for disease (cancer) targeting and treatment have enormous potential to enhance the approaches used today. The goal is to produce an implantable biomedical device for localized breast cancer drug delivery within Africa and the world. The main advantage of localized delivery is that it reduces the amount of drug needed to achieve a therapeutic effect. Biodegradable polymer blends of poly(D,L-lactide-co-glycolide) (PLGA) and polycaprolactone (PCL) are used as the drug excipient. This work focuses on the development of injectable PLGA-PCL drug microspheres loaded with anticancer drugs (prodigiosin (PG), with paclitaxel (PTX) as a control) and with forms of the drugs conjugated with luteinizing hormone-releasing hormone (PG-LHRH, with PTX-LHRH as a control), using a single-emulsion solvent evaporation technique. The encapsulation was done in the presence of PLGA-PCL (as the polymer matrix) and poly(vinyl alcohol) (PVA) (as the emulsifier). A comparative study of the release kinetics of the various drugs and of the degradation mechanisms of drug-loaded PLGA-PCL is achieved. The implication of this study is the potential application of prodigiosin-loaded PLGA-PCL microparticles for controlled delivery of a cancer drug to prevent regrowth or locoregional recurrence following surgical resection of a triple-negative breast tumor.
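Release kinetics comparisons of this kind are typically made by fitting standard empirical models to cumulative-release data. As a minimal sketch, a first-order release model is shown below; the rate constants are hypothetical illustrations, not values measured in this work.

```python
import math

def first_order_release(t, k):
    """Cumulative fraction released at time t under a first-order model,
    M_t / M_inf = 1 - exp(-k * t). The rate constant k (1/day) is a
    hypothetical fitting parameter, not a value from this study."""
    return 1.0 - math.exp(-k * t)

# Two hypothetical drug formulations with different release rate constants,
# sampled at days 0, 7, and 14:
fast = [round(first_order_release(t, k=0.30) , 3) for t in (0, 7, 14)]
slow = [round(first_order_release(t, k=0.08), 3) for t in (0, 7, 14)]
```

Fitting such curves to the measured release profiles of PG, PTX, and their LHRH-conjugated forms is one conventional way to quantify the differences the comparative study describes.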

Keywords: cancer, polymers, drug kinetics, nanoparticles

Procedia PDF Downloads 70
1080 Connecting Teachers in a Web-Based Professional Development Community in Crisis Time: A Knowledge Building Approach

Authors: Wei Zhao

Abstract:

The pandemic crisis disrupted normal classroom practices, making the constraints of traditional practice apparent. This, in turn, created new opportunities for technology-based learning and teaching. However, how technology supported preschool teachers through this sudden crisis, and how preschool teachers conceived of the use of technology and appropriated and designed technological artifacts as mediators of knowledge construction suited to young children's literacy level, have rarely been explored. This study addresses these issues by looking at the influence of a web-supported teacher community on shifts in preschool teachers' epistemological beliefs and practices. This teachers' professional development community was formed before the pandemic and developed virtually throughout the home-based learning period caused by Covid-19. It served as a virtual, asynchronous community in which the teachers collaboratively planned for and conducted online lessons using the knowledge-building approach, with the aim of sustaining children's learning curiosity and opening up new learning opportunities during the lockdown. The knowledge-building approach helps increase teachers' collective responsibility for shared educational goals in the teacher community and their awareness of new ideas and innovations in the classroom. Based on data collected across five months during and after the lockdown and on activity theory, the results show a dynamic interplay between the evolution of the community culture, the growth of the teacher community, and the teachers' identity transformation and professional development. Technology is useful in this regard not only because it turned geographical distance and the new gathering guidelines after the outbreak of the pandemic into new ways of communal communication and collaboration. More importantly, as teachers selected, monitored, and adapted the technology, it acted as a catalyst for change in teachers' old teaching practices and epistemological dispositions.

Keywords: activity theory, changes in epistemology and practice, knowledge building, web-based teachers’ professional development community

Procedia PDF Downloads 153
1079 Monitoring of Cannabis Cultivation with High-Resolution Images

Authors: Levent Basayigit, Sinan Demir, Burhan Kara, Yusuf Ucar

Abstract:

Cannabis is mostly used for drug production. In some countries, a large amount of illegal cannabis is cultivated and sold. Most illegal cannabis cultivation occurs on land far from settlements. On farmland, it is cultivated among other crops: cannabis is either surrounded by tall plants such as corn and sunflower or grown with tall crops as a mixed culture. The common method for locating illegal cultivation areas is to act on information obtained from people, which is not sufficient for detecting illegal cultivation in remote areas. More effective methods are therefore needed. Remote sensing is one of the most important technologies for monitoring plant growth on land. The aim of this study was to monitor cannabis cultivation areas using satellite imagery and to develop an applicable method for doing so. For this purpose, cannabis was grown in plots, either alone or surrounded by corn and sunflower. The morphological characteristics of the cannabis were recorded twice a month during the vegetation period, and a spectral signature library was created with a spectroradiometer. The plots were monitored with high-resolution satellite imagery, and by processing the imagery, the cannabis cultivation areas were classified. To separate the cannabis plots from the other plants, the multiresolution segmentation algorithm proved the most successful for classification, and the WorldView Improved Vegetative Index (WV-VI) was the most accurate method for monitoring plant density. As a result, an object-based classification method and vegetation indices were sufficient for monitoring cannabis cultivation in multi-temporal WorldView images.

Keywords: Cannabis, drug, remote sensing, object-based classification

Procedia PDF Downloads 242
1078 Design and Development of On-Line, On-Site, In-Situ Induction Motor Performance Analyser

Authors: G. S. Ayyappan, Srinivas Kota, Jaffer R. C. Sheriff, C. Prakash Chandra Joshua

Abstract:

In the present scenario of energy crises, energy conservation in electrical machines is very important in industry. To conserve energy, one needs to monitor the performance of an induction motor on-site and in-situ, but the instruments available for this purpose are scarce and very expensive. This paper deals with the design and development of an on-line, on-site, in-situ induction motor performance analyser. The system measures only a few electrical input parameters: input voltage, line current, power factor, frequency, powers, and motor shaft speed. These measurements are combined with nameplate details to compute the operating efficiency of the induction motor. The system computes motor losses with the help of equivalent circuit parameters, which are estimated for the motor concerned by the developed algorithm at any load condition and stored in the system memory. The developed instrument is reliable, accurate, compact, rugged, and cost-effective. This portable instrument can be used as a handy tool to study the performance of both slip-ring and cage induction motors. During analysis, the data can be stored on an SD memory card, and one can perform various analyses such as load vs. efficiency and torque vs. speed characteristics. With the help of the developed instrument, one can operate the motor around its Best Operating Point (BOP). Continuous monitoring of motor efficiency can lead to Life Cycle Assessment (LCA) of motors, which helps in deciding whether to replace, retain, or refurbish a motor.
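A loss-based efficiency computation of this general kind can be sketched in a few lines. The measurement values and equivalent-circuit quantities below are hypothetical, and the core and friction-and-windage losses are simplified into fixed terms rather than derived from the full equivalent circuit the paper uses.

```python
import math

def motor_efficiency(v_line, i_line, pf, stator_r, p_core, p_fw, slip):
    """Rough 3-phase induction-motor efficiency from a few measured inputs,
    using equivalent-circuit-style loss terms (all values hypothetical)."""
    p_in = math.sqrt(3) * v_line * i_line * pf       # three-phase input power [W]
    p_scl = 3 * i_line ** 2 * stator_r               # stator copper loss
    p_airgap = p_in - p_scl - p_core                 # power crossing the air gap
    p_rcl = slip * p_airgap                          # rotor copper loss
    p_out = p_airgap - p_rcl - p_fw                  # shaft output power
    return p_out / p_in

# Hypothetical operating point: 415 V line voltage, 10 A, pf 0.85, 3% slip
eta = motor_efficiency(v_line=415.0, i_line=10.0, pf=0.85,
                       stator_r=0.5, p_core=150.0, p_fw=100.0, slip=0.03)
```

In the instrument described, the equivalent-circuit parameters behind these loss terms are identified from measurements rather than assumed, which is what allows the efficiency to be tracked at any load condition.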

Keywords: energy conservation, equivalent circuit parameters, induction motor efficiency, life cycle assessment, motor performance analysis

Procedia PDF Downloads 348
1077 Multi-Objective Optimal Design of a Cascade Control System for a Class of Underactuated Mechanical Systems

Authors: Yuekun Chen, Yousef Sardahi, Salam Hajjar, Christopher Greer

Abstract:

This paper presents a multi-objective optimal design of a cascade control system for an underactuated mechanical system. Cascade control structures usually include two control algorithms (inner and outer). To design such a control system properly, the following conflicting objectives should be considered at the same time: 1) the inner closed loop must be faster than the outer one; 2) the inner loop should quickly reject any disturbance and prevent it from propagating to the outer loop; 3) the controlled system should be insensitive to measurement noise; and 4) the controlled system should be driven with optimal energy. Such a control problem can be formulated as a multi-objective optimization problem in which the optimal trade-offs among these design goals are found. To the authors' best knowledge, this problem has not been studied in a multi-objective setting so far. In this work, an underactuated mechanical system consisting of a rotary servo motor and a ball and beam is used for the computer simulations; the parameters of the inner and outer control systems are tuned by NSGA-II (Non-dominated Sorting Genetic Algorithm II), and the dominance concept is used to find the optimal design points. The solution of this problem is not a single optimal cascade controller but rather a set of optimal cascade controllers (called the Pareto set), which represents the optimal trade-offs among the selected design criteria; the function evaluation of the Pareto set is called the Pareto front. The solution set is presented to the decision-maker, who can choose any point to implement. The simulation results, in terms of the Pareto front and time responses to external signals, show the competing nature of the design objectives. The presented study may become the basis for the multi-objective optimal design of multi-loop control systems.
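The dominance concept at the heart of NSGA-II can be sketched in a few lines. The objective vectors below are hypothetical stand-ins for the four design goals (recast so that every objective is minimized), not results from the paper.

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (all objectives minimized):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical objective vectors:
# (outer settling time, disturbance IAE, noise sensitivity, control energy)
designs = [
    (0.8, 1.2, 0.3, 5.0),
    (0.5, 1.5, 0.4, 6.0),
    (0.9, 1.3, 0.3, 5.5),   # dominated by the first design
    (0.6, 1.0, 0.5, 4.5),
]
front = pareto_front(designs)
```

NSGA-II applies this comparison repeatedly (with fast non-dominated sorting and crowding distance) over a population of controller gain sets; the surviving non-dominated designs form the Pareto set handed to the decision-maker.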

Keywords: cascade control, multi-loop control systems, multi-objective optimization, optimal control

Procedia PDF Downloads 120
1076 Modeling and Temperature Control of Water-cooled PEMFC System Using Intelligent Algorithm

Authors: Chen Jun-Hong, He Pu, Tao Wen-Quan

Abstract:

The proton exchange membrane fuel cell (PEMFC) is among the most promising future energy sources owing to its low operating temperature, high energy efficiency, high power density, and environmental friendliness. In this paper, a comprehensive control-oriented model of a PEMFC system is developed in the Matlab/Simulink environment, including the hydrogen supply, air supply, and thermal management subsystems. In addition, an Improved Artificial Bee Colony (IABC) algorithm is used to identify the parameters of the PEMFC semi-empirical equations, keeping the maximum relative error between simulation and experimental data below 0.4%. Operating temperature is essential for a PEMFC; both high and low temperatures are disadvantageous. In the thermal management subsystem, the water pump and fan are both controlled with PID controllers to maintain the appropriate operating temperature of the PEMFC for safe and efficient operation. To improve the control effect further, fuzzy control is introduced to optimize the PID controller of the pump, and a Radial Basis Function (RBF) neural network is introduced to optimize the PID controller of the fan. The results demonstrate that Fuzzy-PID and RBF-PID achieve a better control effect, with a 22.66% decrease in the Integral Absolute Error (IAE) of T_st (the PEMFC temperature) and a 77.56% decrease in the IAE of T_in (the inlet cooling water temperature) compared with traditional PID. Finally, a novel thermal management structure is proposed in which the cooling air passing through the main radiator goes on to cool the secondary radiator. With this structure, the parasitic power dissipation can be reduced by 69.94%, and the control effect improves, with a 52.88% decrease in the IAE of T_in under the same controller.
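A minimal discrete PID loop of the kind tuned here, judged by the same IAE criterion, can be sketched as follows. The first-order thermal plant and all gains are illustrative stand-ins, not the paper's PEMFC model.

```python
def simulate_pid(kp, ki, kd, setpoint=343.0, steps=600, dt=0.1):
    """Discrete PID driving a toy first-order thermal plant toward a
    temperature setpoint [K]; returns the Integral Absolute Error (IAE).
    Plant dynamics and gains are hypothetical illustrations."""
    temp = 300.0                      # initial temperature
    integral = 0.0
    prev_err = setpoint - temp        # avoids a derivative kick at start
    iae = 0.0
    for _ in range(steps):
        err = setpoint - temp
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv        # actuator command
        # toy plant: heating from u, losses proportional to temperature rise
        temp += dt * (0.5 * u - 0.1 * (temp - 300.0))
        iae += abs(err) * dt
        prev_err = err
    return iae

iae_weak = simulate_pid(kp=0.1, ki=0.01, kd=0.0)
iae_tuned = simulate_pid(kp=2.0, ki=0.5, kd=0.1)
```

The Fuzzy-PID and RBF-PID schemes in the paper go one step further: instead of fixing kp, ki, and kd, they adjust the gains online, which is what yields the reported IAE reductions over a fixed-gain PID.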

Keywords: PEMFC system, parameter identification, temperature control, Fuzzy-PID, RBF-PID, parasitic power

Procedia PDF Downloads 29
1075 A Perspective on Teaching Mathematical Concepts to Freshman Economics Students Using 3D-Visualisations

Authors: Muhammad Saqib Manzoor, Camille Dickson-Deane, Prashan Karunaratne

Abstract:

The Cobb-Douglas production (utility) function is a fundamental function widely used in economics teaching and research, chiefly because its characteristics describe actual production using inputs such as labour and capital. Characteristics of the function, such as returns to scale and marginal and diminishing marginal productivities, are covered in the introductory units of both microeconomics and macroeconomics with a 2-dimensional static visualisation of the function. However, less insight is provided into the three-dimensional surface, the changes in curvature due to returns to scale, the linkage of the short-run production function with its long-run counterpart and with marginal productivities, the level curves, and constrained optimisation. Since freshman learners have diverse prior knowledge and cognitive skills, the existing 'one size fits all' approach is not very helpful. The aim of this study is to bridge this gap by introducing a technological intervention: interactive animations of the three-dimensional surface that sequentially unveil the characteristics mentioned above, built with Python software. A small classroom intervention has helped students enhance their analytical and visualisation skills towards active and authentic learning of this topic. To validate the strength of our approach, a quasi-Delphi study will be conducted, asking domain-specific experts: 'What value to the learning process in economics is there in using a 2-dimensional static visualisation compared to a 3-dimensional dynamic visualisation?' Three perspectives on the intervention were reviewed by a panel comprising novice students, experienced students, novice instructors, and experienced instructors, in an effort to determine the learnings from each type of visualisation within a specific domain of knowledge. The value of this approach lies in suggesting different pedagogical methods that can enhance learning outcomes.
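Two of the properties the 3D animations unveil, returns to scale and diminishing marginal productivity, can be checked numerically in a few lines of Python; the parameter values below are hypothetical, not those used in the classroom intervention.

```python
def cobb_douglas(L, K, A=1.0, alpha=0.3, beta=0.7):
    """Cobb-Douglas output Q = A * L**alpha * K**beta (parameters hypothetical)."""
    return A * (L ** alpha) * (K ** beta)

# Constant returns to scale when alpha + beta = 1:
# doubling both inputs doubles output.
q1 = cobb_douglas(10, 20)
q2 = cobb_douglas(20, 40)

# Diminishing marginal product of labour: the numerical MPL falls as L
# grows while K is held fixed (the short-run view).
h = 1e-6
mpl_at = lambda L: (cobb_douglas(L + h, 20) - cobb_douglas(L, 20)) / h
```

Sweeping L and K over a grid with the same function and plotting Q as a surface is exactly what produces the three-dimensional visualisation the study animates.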

Keywords: Cobb-Douglas production function, quasi-Delphi method, effective teaching and learning, 3D-visualisations

Procedia PDF Downloads 113
1074 Machine Learning Techniques in Bank Credit Analysis

Authors: Fernanda M. Assef, Maria Teresinha A. Steiner

Abstract:

The aim of this paper is to compare and discuss classifier algorithm options for credit risk assessment by applying different machine learning techniques. Using records from a Brazilian financial institution, this study draws on a database of 5,432 companies that are clients of the bank: 2,600 clients classified as non-defaulters, 1,551 classified as defaulters, and 1,281 classified as temporary defaulters, meaning clients who are overdue on their payments for up to 180 days. For each case, a total of 15 attributes was considered in a one-against-all assessment using four different techniques: Artificial Neural Network Multilayer Perceptron (ANN-MLP), Artificial Neural Network Radial Basis Function (ANN-RBF), Logistic Regression (LR), and Support Vector Machines (SVM). Initially, the data were encoded in thermometer code (numerical attributes) or dummy coding (nominal attributes). Each method was then evaluated over different parameter settings, and the best result of each technique was compared with the others in terms of accuracy, false positives, false negatives, true positives, and true negatives. This comparison showed that the best method in terms of accuracy was ANN-RBF (79.20% for the non-defaulter classification, 97.74% for defaulters, and 75.37% for the temporary defaulter classification). However, the best accuracy does not always indicate the best technique. For instance, in the classification of temporary defaulters, ANN-RBF was surpassed in terms of false positives by SVM, which had the lowest false-positive rate (0.07%). All these details are discussed in light of the results found, and an overview is given in the conclusion of this study.
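The one-against-all counts used to compare the techniques (accuracy, true/false positives, true/false negatives) can be sketched as follows; the labels below are invented toy data, not the bank's records.

```python
def one_vs_all_metrics(y_true, y_pred, positive):
    """Confusion-matrix counts and accuracy when one class is treated as
    positive against all others, as in a one-against-all assessment."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    return {"TP": tp, "TN": tn, "FP": fp, "FN": fn,
            "accuracy": (tp + tn) / len(y_true)}

# Hypothetical labels: N = non-defaulter, D = defaulter, T = temporary defaulter
y_true = ["N", "N", "D", "T", "D", "N", "T", "D"]
y_pred = ["N", "D", "D", "T", "N", "N", "N", "D"]
m = one_vs_all_metrics(y_true, y_pred, positive="D")
```

Running this once per class, with each class in turn as the positive one, reproduces the per-class comparison the paper reports, and makes visible why a model with the best accuracy can still lose on false positives.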

Keywords: artificial neural networks (ANNs), classifier algorithms, credit risk assessment, logistic regression, machine learning, support vector machines

Procedia PDF Downloads 75
1073 Discourse Analysis and Semiotic Researches: Using Michael Halliday's Sociosemiotic Theory

Authors: Deyu Yuan

Abstract:

Discourse analysis as an interdisciplinary approach has a history of more than 60 years, since it was first named by Zellig Harris in his 1952 article 'Discourse Analysis' in Language. Ferdinand de Saussure differentiated 'parole' from 'langue', establishing the principle of focusing on language rather than speech, so the rise of discourse analysis can be seen as a discursive turn for language research as a whole, closely related to speech act theory. Critical discourse analysis became the mainstream of contemporary language research by drawing on M. A. K. Halliday's socio-semiotic theory and on Foucault's, Barthes's, and Bourdieu's views on the sign, discourse, and ideology. In contrast to general semiotics, social semiotics mainly focuses on parole and on the application of semiotic theories to applicable fields. The article discusses this applicable sociosemiotics and shows the features that distinguish it from Saussurean and Peircean semiotics in four respects: 1) the sign system is a meaning-generation resource in the social context; 2) the sign system conforms to social and cultural changes in the form of metaphor and connotation; 3) sociosemiotics is concerned with five applicable principles that deepen specific communication, namely the personal authority principle, the non-personal authority principle, the consistency principle, the model demonstration principle, and the expertise principle; 4) the study of symbolic functions targets the ideational, interpersonal, and interactional functions in the social communication process. The paper then describes six features that characterize this sociosemiotics as applicable semiotics: it is social, systematic, usable, interdisciplinary, dynamic, and multi-modal. Thirdly, the paper explores the multi-modal choices of sociosemiotics with respect to genre, discourse, and style. Finally, the paper discusses the relationship between theory and practice in social semiotics and proposes a relatively comprehensive theoretical framework for social semiotics as applicable semiotics.

Keywords: discourse analysis, sociosemiotics, pragmatics, ideology

Procedia PDF Downloads 295
1072 Pelvic Floor Electrophysiology Patterns Associated with Obstructed Defecation

Authors: Emmanuel Kamal Aziz Saba, Gihan Abd El-Lateif Younis El-Tantawi, Mohammed Hamdy Zahran, Ibrahim Khalil Ibrahim, Mohammed Abd El-Salam Shehata, Hussein Al-Moghazy Sultan, Medhat

Abstract:

Pelvic floor electrophysiological tests are essential for the assessment of patients with obstructed defecation. The present cross-sectional study was conducted to determine the different patterns of pelvic floor electrophysiology associated with obstructed defecation. It included 25 patients with obstructed defecation and a control group of 20 apparently healthy subjects. All patients underwent history taking, clinical examination, proctosigmoidoscopy, lateral proctography (evacuation proctography), dynamic pelvic magnetic resonance imaging, anal manometry, and electrophysiological studies. The electrophysiological studies comprised pudendal nerve motor conduction study, pudendo-anal reflex, needle electromyography of the external anal sphincter and puborectalis muscles, pudendal somatosensory evoked potential, and tibial somatosensory evoked potential. The control group underwent the same electrophysiological studies except needle electromyography. The pelvic floor electrodiagnostic pattern most characteristic of obstructed defecation was pudendal neuropathy with denervation and anismus of the external anal sphincter and puborectalis, a complete interference pattern of the external anal sphincter and puborectalis on squeezing and coughing, and no localized defect in the external anal sphincter. In conclusion, there are characteristic pelvic floor electrodiagnostic patterns associated with obstructed defecation.

Keywords: obstructed defecation, pudendal nerve terminal motor latency, pudendoanal reflex, sphincter electromyography

Procedia PDF Downloads 403
1071 Rating Agreement: Machine Learning for Environmental, Social, and Governance Disclosure

Authors: Nico Rosamilia

Abstract:

The study evaluates the importance of non-financial disclosure practices for regulators, investors, businesses, and markets. It aims to create a sector-specific set of indicators of environmental, social, and governance (ESG) performance as an alternative to agency ratings. The existing literature extensively studies the implementation of ESG rating systems. This study, by contrast, has a twofold outcome. First, it should help generalize incentive systems and governance policies for ESG and sustainability principles, and thereby contribute to the EU Sustainable Finance Disclosure Regulation. Second, it concerns the market and investors by highlighting successful sustainable investing. Indeed, the study examines the effect of ESG adoption practices on corporate value, exploring the asset-pricing angle in order to shed light on the fragmented debate on the finance of ESG: investors may be misguided about the positive or negative effects of ESG on performance. The paper proposes a different method of evaluating ESG performance. By comparing the results of a traditional econometric approach (Lasso) with a machine learning algorithm (Random Forest), the study establishes a set of indicators of ESG performance; the research thereby also contributes empirically to the theoretical literature on model selection and variable importance in a finance framework. The algorithms produce sector-specific indicators. This set of indicators defines an alternative to the compounded scores of ESG rating agencies and avoids the possible offsetting effect of such scores. With this approach, the paper defines a sector-specific set of indicators to standardize ESG disclosure and tries to shed light on the lack of a clear understanding of the direction of the ESG effect on corporate value (the problem of endogeneity).
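One model-agnostic importance score that allows a Lasso and a Random Forest to be compared on equal footing is permutation importance: permute one feature, measure how much the model's error grows. The sketch below uses a toy linear scorer and a deterministic permutation (a column reversal) purely for illustration; practical implementations repeat random shuffles, and nothing here reflects the study's data or models.

```python
def permutation_importance(model, X, y, col):
    """Increase in mean squared error after permuting one feature column
    (here a simple reversal as the permutation). A large increase means
    the model relies on that feature; near zero means it does not."""
    def mse(rows):
        return sum((model(r) - t) ** 2 for r, t in zip(rows, y)) / len(y)
    permuted = [row[:] for row in X]
    values = [row[col] for row in permuted][::-1]
    for row, v in zip(permuted, values):
        row[col] = v
    return mse(permuted) - mse(X)

# Toy data: feature 0 (a hypothetical emissions score) drives the target;
# feature 1 is noise. The "model" is a fixed linear scorer for illustration.
X = [[1.0, 5.0], [2.0, 1.0], [3.0, 4.0], [4.0, 2.0], [5.0, 3.0]]
y = [2.0, 4.0, 6.0, 8.0, 10.0]            # y = 2 * feature0
model = lambda row: 2.0 * row[0]          # ignores feature 1 entirely
imp0 = permutation_importance(model, X, y, col=0)
imp1 = permutation_importance(model, X, y, col=1)
```

Ranking features by such a score, per sector, is one way to arrive at the kind of sector-specific indicator sets the paper proposes instead of compounded agency scores.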

Keywords: ESG ratings, non-financial information, value of firms, sustainable finance

Procedia PDF Downloads 50
1070 Using Time Series NDVI to Model Land Cover Change: A Case Study in the Berg River Catchment Area, Western Cape, South Africa

Authors: Adesuyi Ayodeji Steve, Zahn Munch

Abstract:

This study investigates the use of MODIS NDVI to identify areas of agricultural land cover change on an annual time step (2007-2012) and to characterize the trend in the study area. An ISODATA classification was performed on the MODIS imagery to isolate the agricultural class, producing three class groups: agriculture, agriculture/semi-natural, and semi-natural. NDVI signatures were created for the time series to identify areas dominated by cereals and vineyards, with the aid of ancillary, Pictometry, and field sample data. The NDVI signature curves and training samples were used to create a decision tree model in WEKA 3.6.9. From the training samples, two classification models were built in WEKA using the decision tree classifier (J48) algorithm: Model 1 included the ISODATA classification and Model 2 did not, with accuracies of 90.7% and 88.3%, respectively. The two models were used to classify the whole study area, producing two land cover maps with classification accuracies of 77% for Model 1 and 80% for Model 2. Model 2 was then used to create change detection maps for all the other years. Subtle changes and areas of consistency (unchanged areas) were observed in the agricultural classes and crop practices over the years, as predicted by the land cover classification. Cereals comprise 41% of the catchment, with 35% possibly following a crop rotation system. Vineyard largely remained constant over the years, with some conversion to vineyard (1%) from other land cover classes. Some of the changes might be a result of misclassification and the crop rotation system.
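The NDVI values behind such signature curves come from the standard band ratio of near-infrared and red reflectance. A minimal sketch, with hypothetical monthly reflectance values rather than the study's MODIS data:

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index from near-infrared and red
    reflectance; ranges from -1 to 1, higher for dense green vegetation."""
    return (nir - red) / (nir + red)

# Hypothetical (NIR, red) reflectance pairs for one pixel across a growing
# season -- the kind of series an NDVI signature curve is built from.
season = [(0.30, 0.20), (0.45, 0.12), (0.60, 0.08), (0.50, 0.10), (0.35, 0.18)]
signature = [round(ndvi(n, r), 3) for n, r in season]
```

Crops with different phenology (e.g. cereals vs. vineyards) trace differently shaped curves through the season, which is what lets a decision tree trained on the time series separate them.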

Keywords: change detection, land cover, MODIS, NDVI

Procedia PDF Downloads 367
1069 Comparison of Steel and Composite Analysis of a Multi-Storey Building

Authors: Çiğdem Avcı Karataş

Abstract:

Mitigation of structural damage caused by earthquakes and reduction of fatalities are among the main concerns of engineers in seismically active zones of the world. To achieve this aim, many technologies have been developed in recent decades and applied in the construction and retrofit of structures. On the one hand, Turkey is well known as a country with a high level of seismicity; on the other hand, steel-composite structures are competitive there today by comparison with other structural types, for example steel-only or concrete structures. Composite construction is the dominant form of construction for the multi-storey building sector. The reason composite construction works so well can be expressed simply: concrete is good in compression and steel is good in tension. By joining the two materials structurally, these strengths can be exploited to produce a highly efficient design. The reduced self-weight of composite elements has a knock-on effect, reducing the forces in the elements supporting them, including the foundations. The floor-depth reductions achievable with composite construction can also provide significant benefits in terms of the costs of services and the building envelope. The scope of this paper covers the analysis, materials take-off, cost analysis and economic comparison of a multi-storey building with composite and steel frames. The aim of this work is to show that designing the load-carrying system as composite is more economical than designing it as steel. The nine-storey building under consideration is designed according to the 2007 Turkish Earthquake Code using static and dynamic analysis methods. Plastic analysis methods were used for both the steel and the composite systems; the steel system was checked for compliance with EC3 and the composite system with EC4.
At the end of the comparisons, it is shown that the composite load-carrying system is more economical than the steel one, considering both the materials used in the load-carrying system and the workmanship spent on the job.
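The economic comparison described above reduces to simple take-off arithmetic. The sketch below illustrates the idea only: the tonnages, volumes, unit rates and labour figures are hypothetical placeholders, not values from the study, and `system_cost` is an invented helper.

```python
# Illustrative cost comparison of steel vs. composite load-carrying systems.
# All quantities and unit rates are hypothetical, not figures from the paper.

def system_cost(steel_tonnage, concrete_volume, steel_rate, concrete_rate, labour):
    """Total cost = material costs + workmanship."""
    return steel_tonnage * steel_rate + concrete_volume * concrete_rate + labour

# Hypothetical take-off for a nine-storey frame
steel_only = system_cost(steel_tonnage=420.0, concrete_volume=0.0,
                         steel_rate=1800.0, concrete_rate=90.0, labour=150_000.0)
composite = system_cost(steel_tonnage=310.0, concrete_volume=520.0,
                        steel_rate=1800.0, concrete_rate=90.0, labour=130_000.0)

saving = (steel_only - composite) / steel_only * 100.0
print(f"steel-only: {steel_only:.0f}, composite: {composite:.0f}, saving: {saving:.1f}%")
```

With these invented rates the composite frame comes out cheaper because less structural steel and less workmanship offset the added concrete, mirroring the paper's qualitative conclusion.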

Keywords: composite analysis, earthquake, steel, multi-storey building

Procedia PDF Downloads 534
1068 Multidisciplinary Approach for a Tsunami Reconstruction Plan in Coquimbo, Chile

Authors: Ileen Van den Berg, Reinier J. Daals, Chris E. M. Heuberger, Sven P. Hildering, Bob E. Van Maris, Carla M. Smulders, Rafael Aránguiz

Abstract:

Chile is located along the subduction zone of the Nazca plate beneath the South American plate, where large earthquakes and tsunamis have taken place throughout history. The last significant earthquake (Mw 8.2) occurred in September 2015 and generated a destructive tsunami, which mainly affected the city of Coquimbo (71.33°W, 29.96°S). The inundation area comprised a beach, a damaged seawall, a damaged railway, a wetland and an old neighborhood; local authorities therefore started a reconstruction process immediately after the event. Moreover, a seismic gap has been identified in the same area, and another large event could take place in the near future. The present work proposes an integrated tsunami reconstruction plan for the city of Coquimbo that considers several variables such as safety, nature & recreation, neighborhood welfare, visual obstruction, infrastructure, construction process, and durability & maintenance. Possible future tsunami scenarios are simulated by means of the Non-hydrostatic Evolution of Ocean WAVEs (NEOWAVE) model with 5 nested grids and a finest grid resolution of ~10 m. Based on the scores from a multi-criteria analysis, the costs of the alternatives and a preference for a multifunctional solution, the alternative that includes an elevated coastal road with floodgates to reduce tsunami overtopping and control the return flow was selected as the best solution. Under this alternative the wetlands are significantly restored to their former configuration, and their dynamic behavior is stimulated. The numerical simulation showed that the new coastal protection decreases damage and the probability of loss of life by delaying the tsunami arrival time. In addition, new evacuation routes and a smaller inundation zone in the city increase safety for the area.
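The selection step can be sketched as a weighted-sum multi-criteria analysis. The criteria names follow the abstract, but the weights, the candidate alternatives and all scores below are hypothetical assumptions for illustration only.

```python
# Minimal weighted-sum multi-criteria analysis; weights and scores are invented.
criteria_weights = {
    "safety": 0.25, "nature_recreation": 0.15, "neighborhood_welfare": 0.15,
    "visual_obstruction": 0.10, "infrastructure": 0.15,
    "construction_process": 0.10, "durability_maintenance": 0.10,
}

# Hypothetical 0-10 scores for three candidate alternatives
alternatives = {
    "seawall_only": {
        "safety": 7, "nature_recreation": 3, "neighborhood_welfare": 5,
        "visual_obstruction": 4, "infrastructure": 6,
        "construction_process": 6, "durability_maintenance": 5,
    },
    "elevated_road_with_floodgates": {
        "safety": 8, "nature_recreation": 7, "neighborhood_welfare": 7,
        "visual_obstruction": 6, "infrastructure": 8,
        "construction_process": 5, "durability_maintenance": 7,
    },
    "dune_restoration": {
        "safety": 5, "nature_recreation": 9, "neighborhood_welfare": 6,
        "visual_obstruction": 8, "infrastructure": 4,
        "construction_process": 7, "durability_maintenance": 6,
    },
}

def total_score(scores):
    """Weighted sum of an alternative's criterion scores."""
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

ranking = sorted(alternatives, key=lambda a: total_score(alternatives[a]), reverse=True)
print(ranking[0], f"{total_score(alternatives[ranking[0]]):.2f}")
```

In practice the cost of each alternative would be weighed against its score, as the authors did before choosing the multifunctional coastal-road solution.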

Keywords: tsunami, Coquimbo, Chile, reconstruction, numerical simulation

Procedia PDF Downloads 214
1067 Expert System: Debugging Using MD5 Process Firewall

Authors: C. U. Om Kumar, S. Kishore, A. Geetha

Abstract:

An operating system (OS) is software that manages computer hardware and software resources by providing services to computer programs. One important user expectation of the operating system is that it defend information against unauthorized access, disclosure, modification, inspection, recording or destruction. An operating system is always vulnerable to attacks by malware such as computer viruses, worms, Trojan horses, backdoors, ransomware, spyware, adware, scareware and more. Anti-virus software was therefore created to provide security against known computer viruses by applying a dictionary-based approach. However, anti-virus programs cannot guarantee security against the new viruses proliferating every day. To address this issue and secure the computer system, our proposed expert system lets the administrator authorize processes as wanted or unwanted for execution. The expert system maintains a database containing the hash codes of the processes to be allowed. These hash codes are generated with the MD5 message-digest algorithm, a widely used cryptographic hash function. The administrator approves the processes that may be executed on clients in a local area network, implemented as a client-server architecture; only processes whose hashes match entries in the database table are executed, so many malicious processes are prevented from infecting the operating system. An added advantage of the proposed expert system is that it limits CPU usage and minimizes resource utilization. Thus data and information security is ensured by our system along with improved operating-system performance.
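The hash-whitelist idea above can be sketched in a few lines: compute the MD5 digest of a binary and allow execution only if the digest appears in the administrator's approved set. This is a minimal local sketch; the paper's database table and client-server transport are replaced here by an in-memory set and a temporary file standing in for a process binary.

```python
import hashlib
import os
import tempfile

def md5_of_file(path, chunk_size=65536):
    """MD5 digest of an executable, computed in chunks to bound memory use."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Administrator-approved process hashes (a set stands in for the DB table)
allowed_hashes = set()

def authorize(path):
    """Allow execution only if the binary's hash matches an approved entry."""
    return md5_of_file(path) in allowed_hashes

# Demo: a temporary file stands in for a process binary
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"pretend this is an executable")
    binary = f.name

allowed_hashes.add(md5_of_file(binary))  # administrator approves it
ok = authorize(binary)
os.remove(binary)
print(ok)
```

Any modification to the binary, even a single byte, changes the digest and causes `authorize` to reject it, which is the mechanism the expert system relies on.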

Keywords: virus, worm, Trojan horse, backdoors, ransomware, spyware, adware, scareware, sticky software, process table, MD5, CPU usage, resource utilization

Procedia PDF Downloads 383
1066 Triangular Hesitant Fuzzy TOPSIS Approach in Investment Projects Management

Authors: Irina Khutsishvili

Abstract:

The presented study develops a decision-support methodology for a multi-criteria group decision-making problem. The proposed methodology is based on the TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) approach in a hesitant fuzzy environment. The core of a decision-making problem is the selection of the single best alternative, or the ranking of several alternatives, from a set of feasible alternatives. Typically, the decision-making process is based on an evaluation against certain criteria. In many MCDM problems (such as medical diagnosis, project management, business and financial management, etc.), the decision-making process involves experts' assessments. These assessments are frequently expressed as fuzzy numbers, confidence intervals, intuitionistic fuzzy values, hesitant fuzzy elements and so on. However, a more realistic approach uses linguistic expert assessments (linguistic variables). In the proposed methodology, both the values and the weights of the criteria take the form of linguistic variables, given by all decision makers. These assessments are then expressed as triangular fuzzy numbers. Consequently, the proposed approach is based on a triangular hesitant fuzzy TOPSIS decision-making model. Following the TOPSIS algorithm, first the fuzzy positive ideal solution (FPIS) and the fuzzy negative ideal solution (FNIS) are defined. Then the alternatives are ranked according to the proximity of their distances to both the FPIS and the FNIS. Based on the proposed approach, a software package has been developed and used to rank investment projects in a real investment decision-making problem. The application and testing of the software were carried out using data provided by the 'Bank of Georgia'.
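The TOPSIS steps above (normalize, locate the ideal and anti-ideal solutions, rank by closeness) can be sketched in crisp form; the triangular hesitant fuzzy extension of the paper replaces the scalar entries with fuzzy numbers but keeps this skeleton. The three "projects" and their criterion values below are invented for illustration.

```python
import math

def topsis(matrix, weights, benefit):
    """Crisp TOPSIS: rank alternatives by closeness to the ideal solution.
    matrix[i][j] = score of alternative i on criterion j;
    benefit[j] = True if larger is better for criterion j."""
    m, n = len(matrix), len(matrix[0])
    # 1. Vector-normalize each column and apply the criterion weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # 2. Positive and negative ideal solutions (FPIS/FNIS in the fuzzy variant)
    pis = [max(v[i][j] for i in range(m)) if benefit[j]
           else min(v[i][j] for i in range(m)) for j in range(n)]
    nis = [min(v[i][j] for i in range(m)) if benefit[j]
           else max(v[i][j] for i in range(m)) for j in range(n)]
    # 3. Closeness coefficient: distance to NIS over total distance
    scores = []
    for i in range(m):
        d_pos = math.dist(v[i], pis)
        d_neg = math.dist(v[i], nis)
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three hypothetical investment projects scored on expected return (benefit),
# risk (cost) and payback period in years (cost)
projects = [[0.12, 0.30, 5.0], [0.09, 0.15, 4.0], [0.15, 0.45, 7.0]]
scores = topsis(projects, weights=[0.5, 0.3, 0.2], benefit=[True, False, False])
print(scores)
```

Here the low-risk, fast-payback project wins despite its modest return, showing how the closeness coefficient trades criteria off against each other.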

Keywords: fuzzy TOPSIS approach, investment project, linguistic variable, multi-criteria decision making, triangular hesitant fuzzy set

Procedia PDF Downloads 384
1065 Electrical Machine Winding Temperature Estimation Using Stateful Long Short-Term Memory Networks (LSTM) and Truncated Backpropagation Through Time (TBPTT)

Authors: Yujiang Wu

Abstract:

As electrical machine (e-machine) power density requirements become more stringent in vehicle electrification, mounting a temperature sensor on e-machine stator windings becomes increasingly difficult. This can lead to higher manufacturing costs, complicated harnesses, and reduced reliability. In this paper, we propose a deep-learning method for predicting electric machine winding temperature, which can either replace the sensor entirely or serve as a backup to an existing sensor. We compare the performance of our method, stateful long short-term memory networks (LSTM) with truncated backpropagation through time (TBPTT), with that of linear regression and of stateless LSTMs with and without residual connections. Our results demonstrate the strength of combining stateful LSTM and TBPTT in tackling nonlinear time-series prediction problems with long sequence lengths. Additionally, in industrial applications, prediction accuracy in the high-temperature region matters most, because winding temperature sensing is typically used to derate machine power when the temperature is high. To evaluate the performance of our algorithm accordingly, we developed a temperature-stratified MSE. We also propose a simple but effective data-preprocessing trick to improve high-temperature-region prediction accuracy. Our experimental results demonstrate the effectiveness of the proposed method in accurately predicting winding temperature, particularly in high-temperature regions, while also reducing manufacturing costs and improving reliability.
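A temperature-stratified MSE of the kind described above can be sketched simply: average the squared errors separately within temperature bands, so accuracy in the high-temperature region (where derating decisions are made) is reported on its own. The band edges and the measured/predicted values below are illustrative assumptions, not the paper's data.

```python
# Sketch of a temperature-stratified MSE; bands and data are illustrative.

def stratified_mse(y_true, y_pred, edges):
    """Return {(lo, hi): MSE} where bands are [edges[k], edges[k+1]) on y_true."""
    bands = {}
    for lo, hi in zip(edges, edges[1:]):
        errs = [(t - p) ** 2 for t, p in zip(y_true, y_pred) if lo <= t < hi]
        bands[(lo, hi)] = sum(errs) / len(errs) if errs else None
    return bands

winding_temp = [45, 60, 85, 110, 130, 150]   # measured winding temperature, degC
predicted    = [44, 63, 83, 114, 126, 148]   # model predictions, degC

result = stratified_mse(winding_temp, predicted, edges=[0, 100, 200])
print(result)
```

A single overall MSE would hide the fact that the high-temperature band here is predicted worse than the low one; the stratified metric exposes exactly that.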

Keywords: deep learning, electrical machine, functional safety, long short-term memory networks (LSTM), thermal management, time series prediction

Procedia PDF Downloads 56
1064 Urban Block Design's Impact on the Indoor Daylight Quality, Heating and Cooling Loads of Buildings in the Semi-Arid Regions: Duhok City in Kurdistan Region-Iraq as a Case Study

Authors: Kawar Salih

Abstract:

It has been established that designing sustainable buildings starts in the early stages of urban design. The design of urban blocks specifically is considered one of the pragmatic strategies of sustainable urbanism. Previous studies have focused on the impact of urban block design and regulation on outdoor thermal comfort in semi-arid regions. However, no studies were found that concentrated on that impact on the internal behavior of buildings in those regions, specifically daylight quality and energy performance. Further, most studies on semi-arid regions focus only on cooling-load reduction, neglecting the heating load. This study focuses on two parameters of urban block distribution, block orientation and surface-to-volume ratio, while considering both the heating and cooling loads of buildings. In Duhok (a semi-arid city in the Kurdistan region of Iraq), the energy consumption and daylight quality of different types of residential blocks were examined using dynamic simulation. The findings suggest a considerably higher energy load for heating than for cooling, contradicting many previous studies on these regions. The results also highlight that the orientation of urban blocks can vary energy consumption by up to 8%. Regarding the surface-to-volume ratio (S/V), doubling the S/V increased energy consumption by 15%. However, the study also demonstrates that there are opportunities to reduce energy consumption while increasing the S/V, which contradicts much previous research on the impact of S/V on energy consumption. These results can help in designing urban blocks with a larger S/V than the city's existing blocks, providing better indoor daylight at roughly similar energy consumption.
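The surface-to-volume ratio the study varies can be sketched for a simple rectangular block. The geometry convention (exposed walls plus roof, ground face excluded) and the dimensions are illustrative assumptions, not taken from the Duhok case study.

```python
# Surface-to-volume ratio of a rectangular urban block; dimensions illustrative.

def surface_to_volume(width, depth, height):
    """Exposed envelope (four walls + roof, ground face excluded) over volume."""
    walls = 2 * (width + depth) * height
    roof = width * depth
    return (walls + roof) / (width * depth * height)

compact = surface_to_volume(30, 30, 12)   # deep, squat block (m)
slender = surface_to_volume(60, 15, 12)   # same footprint area, elongated (m)
print(compact, slender)
```

For the same floor area and height, the elongated block exposes more envelope per unit volume, which is why the S/V ratio drives both heat exchange and daylight access in the study.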

Keywords: block orientation, building energy consumption, urban block design, semi-arid regions, surface-to-volume ratio

Procedia PDF Downloads 308
1063 Exploring the Intersection of Accounting, Business, and Economics: Bridging Theory and Practice for Sustainable Growth

Authors: Stephen Acheampong Amoafoh

Abstract:

In today's dynamic economic landscape, businesses face multifaceted challenges that demand strategic foresight and informed decision-making. This abstract explores the pivotal role of financial analytics in driving business performance amidst evolving market conditions. By integrating accounting principles with economic insights, organizations can harness the power of data-driven strategies to optimize resource allocation, mitigate risks, and capitalize on emerging opportunities. This presentation will delve into the practical applications of financial analytics across various sectors, highlighting case studies and empirical evidence to underscore its efficacy in enhancing operational efficiency and fostering sustainable growth. From predictive modeling to performance benchmarking, attendees will gain invaluable insights into leveraging advanced analytics tools to drive profitability, streamline processes, and adapt to changing market dynamics. Moreover, this abstract will address the ethical considerations inherent in financial analytics, emphasizing the importance of transparency, integrity, and accountability in data-driven decision-making. By fostering a culture of ethical conduct and responsible stewardship, organizations can build trust with stakeholders and safeguard their long-term viability in an increasingly interconnected global economy. Ultimately, this abstract aims to stimulate dialogue and collaboration among scholars, practitioners, and policymakers, fostering knowledge exchange and innovation in the realms of accounting, business, and economics. Through interdisciplinary insights and actionable recommendations, participants will be equipped to navigate the complexities of today's business environment and seize opportunities for sustainable success.

Keywords: financial analytics, business performance, data-driven strategies, sustainable growth

Procedia PDF Downloads 14
1062 Optimizing Organizational Performance: The Critical Role of Headcount Budgeting in Strategic Alignment and Financial Stability

Authors: Shobhit Mittal

Abstract:

Headcount budgeting stands as a pivotal element in organizational financial management, extending beyond traditional budgeting to encompass strategic resource allocation for workforce-related expenses. This process is integral to maintaining financial stability and fostering a productive workforce, requiring a comprehensive analysis of factors such as market trends, business growth projections, and evolving workforce skill requirements. It demands a collaborative approach, primarily involving Human Resources (HR) and finance departments, to align workforce planning with an organization's financial capabilities and strategic objectives. The dynamic nature of headcount budgeting necessitates continuous monitoring and adjustment in response to economic fluctuations, business strategy shifts, technological advancements, and market dynamics. Its significance in talent management is also highlighted, aligning financial planning with talent acquisition and retention strategies to ensure a competitive edge in the market. The consequences of incorrect headcount budgeting are explored, showing how it can lead to financial strain, operational inefficiencies, and hindered strategic objectives. Examining case studies like IBM's strategic workforce rebalancing and Microsoft's shift for long-term success, the importance of aligning headcount budgeting with organizational goals is underscored. These examples illustrate that effective headcount budgeting transcends its role as a financial tool, emerging as a strategic element crucial for an organization's success. This necessitates continuous refinement and adaptation to align with evolving business goals and market conditions, highlighting its role as a key driver in organizational success and sustainability.

Keywords: strategic planning, fiscal budget, headcount planning, resource allocation, financial management, decision-making, operational efficiency, risk management, headcount budget

Procedia PDF Downloads 20
1061 Tourism and Marketing: An Exploration Study to the Strategic Market Analysis of Moses Mabhida Stadium as a Major Tourism Destination in Kwazulu-Natal

Authors: Nduduzo Andrias Ngxongo, Nsizwazikhona Simon Chili

Abstract:

This analytical exploration illustrates how the absence of a proper marketing strategy for a tourism destination may result in a radical decline in both financial outputs and visitor arrivals. The marketing strategy is the foundation for any tourism destination's marketing tactics. Tourism destinations ought to have dynamic and adaptive marketing strategies that develop a promotional approach to help the destination gain market share, identify its target markets, stay relevant to its existing clients, attract new visitors, and increase profits earned. The Moses Mabhida Stadium (MMS), one of the prominent tourist attractions in KwaZulu-Natal, boasting a world-class architectural design, several prestigious international awards, and vibrant, adventurous activities, has nevertheless suffered a gradual slump in both visitors and profits in recent years. The basis of this paper was therefore to establish precisely how the existing MMS marketing strategy may underlie the decline in visitor numbers and profits earned in recent years. The study adopted a mixed-methods research strategy with 380 participants. The outcome of the study suggests some costly disparities in the marketing strategy of MMS, which have led to poor performance and a loss of tourism market share. The outcome further suggests that the absence of market-research analysis and destination-marketing tools contributed vastly to the ongoing dilemma. This fact-finding exploration provides a bird's-eye view of the MMS marketing strategy and, based on the results, recommends the introduction of a more far-reaching and revitalising marketing strategy through constant and persistent market-research initiatives, minimal political interference in the administration of state-funded organisations, reassessment of the feasibility study, and the vigorous sourcing of proficient personnel.

Keywords: tourism, destination, marketing, marketing strategy

Procedia PDF Downloads 228
1060 The Inherent Flaw in the NBA Playoff Structure

Authors: Larry Turkish

Abstract:

Introduction: The NBA rewards mediocrity, as this paper will make evident. The study examines and evaluates the characteristics of NBA champions. As divisions and playoff berths increase, so does the probability that the champion comes from the mediocre category. The league has admitted mediocre teams since its inception in 1947 and continues to do so today. Why does a professional league allow any team with a winning percentage below 50% into the playoffs? As long as the finances flow into the league, owners will not change the current algorithm. The objective of this paper is to determine whether the regular season has meaning in finding an NBA champion. Statistical Analysis: The data originate from the NBA website. The following variables are part of the statistical analysis: Rank, the rank of a team relative to other teams in the league based on the regular-season win-loss record; Winning Percentage of a team in the regular season; Divisions, the number of divisions within the league; and Playoff Teams, the number of playoff teams in a particular season. The following statistical methods are applied to the data: Pearson product-moment correlation, analysis of variance, and factor and regression analysis. Conclusion: The results indicate that the divisional structure and the number of playoff teams have a negative effect on the winning percentage of playoff teams. The structure also prevents teams with higher winning percentages from accessing the playoffs. Recommendations: 1. Only teams whose regular-season winning percentage is more than 1 standard deviation above the mean gain access to the playoffs (eliminates mediocre teams). 2. Eliminate divisions (removes weaker teams from playoff access). 3. Eliminate conferences (removes weaker teams from playoff access). 4.
Adopt a balanced regular-season schedule (reduces the number of regular-season games, creates equilibrium, and reduces bias), which would also reduce the need for load management.
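The first recommendation above, admit only teams whose regular-season winning percentage sits more than one standard deviation above the league mean, can be sketched directly as a z-score filter. The ten team records below are invented for illustration.

```python
# z-score playoff filter; the win percentages are invented, not NBA data.
import statistics

win_pct = {
    "A": 0.720, "B": 0.650, "C": 0.610, "D": 0.550, "E": 0.500,
    "F": 0.470, "G": 0.440, "H": 0.390, "I": 0.350, "J": 0.320,
}

mean = statistics.mean(win_pct.values())
sd = statistics.stdev(win_pct.values())

# Keep only teams with z-score above 1: strictly better than league mediocrity
playoff_teams = [t for t, w in win_pct.items() if (w - mean) / sd > 1.0]
print(playoff_teams)
```

With this invented league, only the top two records clear the one-standard-deviation bar; every sub-.500 team, and even several winning teams, would be excluded, which is exactly the mediocrity cut the paper argues for.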

Keywords: alignment, mediocrity, regression, z-score

Procedia PDF Downloads 102
1059 Valorization of Sargassum: Use of Twin-Screw Extrusion to Produce Biomolecules and Biomaterials

Authors: Bauta J., Raynaud C., Vaca-Medina G., Simon V., Roully A., Vandenbossche V.

Abstract:

Sargassum is a brown alga originally found in the Sargasso Sea, in the Caribbean region and the Gulf of Mexico. The influx of Sargassum is becoming a critical environmental problem, particularly across the Caribbean islands. In Guadeloupe alone, around 80,000 tons of seaweed are stranded each season. Since the appearance of the first waves of Sargassum, several measures have been taken to collect the algae and keep the beaches clean. Nevertheless, 90% of the collected algae are currently stored without any recovery. The lack of research initiatives demands a more in-depth exploration of Sargassum chemistry, targeted towards added-value applications and their development. In this context, the aim of the study was to develop a biorefinery process to valorize Sargassum simultaneously as a source of bioactive natural substances and as a raw material for producing biomaterials. The technology used was twin-screw extrusion, which allows different unit fractionation operations to be carried out continuously in the same machine. After identification of the molecules of interest in Sargassum, different thermo-mechanical operating conditions were applied in a twin-screw extruder. The nature of the solvent, the configuration of the extruder, the screw profile and the temperature profile were studied in order to fractionate the algal biomass and to allow the recovery of a bioactive liquid fraction of interest and a solid residue suitable for the production of biomaterials. Each bioactive liquid fraction was characterized, and strategic routes for adding value were proposed. In parallel, the possibility of using the solid residue to produce biomaterials was studied through Dynamic Vapour Sorption (DVS) and basic pressure-volume-temperature (PVT) analyses. The solid residue was moulded by compression cooking, and the materials obtained were finally characterized mechanically.
The results were very encouraging and open perspectives for an interesting valorization of Sargassum algae.

Keywords: seaweeds, twin-screw extrusion, fractionation, bioactive compounds, biomaterials, biomass

Procedia PDF Downloads 97
1058 Single-Molecule Optical Study of Cholesterol-Mediated Dimerization Process of EGFRs in Different Cell Lines

Authors: Chien Y. Lin, Jung Y. Huang, Leu-Wei Lo

Abstract:

A growing body of data reveals that membrane cholesterol molecules can alter the signaling pathways of living cells. However, the understanding of how membrane cholesterol modulates receptor proteins is still lacking. Single-molecule tracking can effectively probe the microscopic environments and thermal fluctuations of receptor proteins in a living cell. In this study, we apply single-molecule optical tracking to the ligand-induced dimerization process of EGFRs in the plasma membranes of two cancer cell lines (HeLa and A431) and one normal epithelial cell line (MCF12A). We tracked individual EGFRs and dual receptors diffusing in a correlated manner in the plasma membranes of live cells. We developed an energetic model, integrating a generalized Langevin equation with the Cahn-Hilliard equation, to help extract important information from single-molecule trajectories. From the study, we discovered that ligand-bound EGFRs move from non-raft areas into lipid-raft domains. This ligand-induced motion is common to both cancer and normal cells. By manipulating the total amount of membrane cholesterol with methyl-β-cyclodextrin and the local concentration of membrane cholesterol with nystatin, we further found that the amount of cholesterol can affect the stability of EGFR dimers. EGFR dimers in the plasma membrane of normal cells are more sensitive to local concentration changes of cholesterol than EGFR dimers in cancer cells. Our method successfully captures dynamic interactions of receptors at the single-molecule level and provides insight into the functional architecture of both the diffusing EGFR molecules and their local cellular environment.
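The kind of trajectory the energetic model is fitted to can be sketched with a minimal overdamped Langevin simulation of free 2D membrane diffusion. The diffusion coefficient and time step below are illustrative, and the memory kernel of the generalized Langevin equation and the Cahn-Hilliard coupling of the paper are omitted.

```python
# Minimal 2D overdamped Langevin sketch of receptor diffusion in a membrane.
# Parameters are illustrative; the paper's full energetic model is omitted.
import math
import random

def langevin_trajectory(steps, diff_coeff, dt, seed=0):
    """Free 2D diffusion: each axis steps by sqrt(2*D*dt) * N(0, 1)."""
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * diff_coeff * dt)
    x = y = 0.0
    path = [(x, y)]
    for _ in range(steps):
        x += rng.gauss(0.0, sigma)
        y += rng.gauss(0.0, sigma)
        path.append((x, y))
    return path

path = langevin_trajectory(steps=1000, diff_coeff=0.1, dt=0.01)  # um^2/s, s
sq_disp = path[-1][0] ** 2 + path[-1][1] ** 2  # grows roughly as 4*D*t on average
print(f"final squared displacement: {sq_disp:.3f} um^2")
```

Deviations of real single-molecule trajectories from this free-diffusion baseline (confinement in rafts, correlated motion of dimer partners) are precisely what the authors' model extracts.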

Keywords: membrane proteins, single-molecule tracking, Cahn-Hilliard equation, EGFR dimers

Procedia PDF Downloads 384
1057 Quality Analysis of Vegetables Through Image Processing

Authors: Abdul Khalique Baloch, Ali Okatan

Abstract:

The quality analysis of food and vegetables from images is a hot topic nowadays, with researchers improving on previous findings through different techniques and methods. In this research, we reviewed the literature, identified gaps in it, and proposed a better approach: we designed the algorithm and developed software to measure quality from images, and compared our results with previous work, obtaining better accuracy. The application uses an open-source dataset and the Python language with the TensorFlow Lite framework. This research focuses on sorting food and vegetables from images: the application sorts and grades produce after processing the images, producing fewer errors than manual, human-based grading. Digital picture datasets were created and the collected images were arranged by class. The classification accuracy of the system was about 94%. As fruits and vegetables play a central role in day-to-day life, their quality is essential in evaluating agricultural produce, and customers always want to buy good-quality fruits and vegetables. Many customers suffer from unhealthy fruits and vegetables supplied without a proper quality-measurement process being followed. The software developed here measures the quality of fruits and vegetables from images and reports whether they are fresh or rotten. The algorithms reviewed in this thesis include digital image processing, ResNet, VGG16, CNNs, and transfer learning for grading feature extraction.
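The ~94% accuracy figure reported above is the standard classification-accuracy metric: the fraction of images whose predicted class matches the true class. A hedged sketch of that computation follows; the labels are invented, not the paper's dataset, and the model producing `y_pred` is elided.

```python
# Classification accuracy from true vs. predicted labels; labels are invented.

def accuracy(y_true, y_pred):
    """Fraction of samples whose predicted class matches the true class."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

y_true = ["fresh", "fresh", "rotten", "rotten", "fresh", "rotten", "fresh", "fresh"]
y_pred = ["fresh", "fresh", "rotten", "fresh",  "fresh", "rotten", "fresh", "fresh"]

acc = accuracy(y_true, y_pred)
print(f"accuracy: {acc:.2%}")
```

On a real fresh/rotten grading task one would also examine the confusion between classes, since misgrading rotten produce as fresh is costlier than the reverse.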

Keywords: deep learning, computer vision, image processing, rotten fruit detection, fruits quality criteria, vegetables quality criteria

Procedia PDF Downloads 43
1056 Development of a Regression Based Model to Predict Subjective Perception of Squeak and Rattle Noise

Authors: Ramkumar R., Gaurav Shinde, Pratik Shroff, Sachin Kumar Jain, Nagesh Walke

Abstract:

Advancements in electric vehicles have significantly reduced powertrain noise and the number of moving components in vehicles. As a result, in-cab noises have become more noticeable to passengers inside the car. To ensure a comfortable ride for drivers and other passengers, it has become crucial to eliminate undesirable component noises during the development phase. Standard practices identify the severity of noises based on subjective ratings, but rating every development sample and making changes to reduce severity can be a tedious process. Additionally, the severity rating can vary from jury to jury, making it challenging to arrive at a definitive conclusion. To address this, an automotive component was selected to evaluate a squeak and rattle noise issue. Physical tests were carried out for random and sine excitation profiles. The aim was to assess the noise subjectively using a jury-rating method and to evaluate it objectively by measuring the noise. A suitable jury evaluation method was selected for this activity, and the recorded sounds were replayed for jury rating. The objective sound-quality metrics, viz. loudness, sharpness, roughness, fluctuation strength and overall sound pressure level (SPL), were measured. From these, correlation coefficients were established to identify the sound-quality metrics contributing most to the identified noise issue. Regression analysis was then performed to establish the correlation between the subjective and objective data. A mathematical model was prepared using an artificial-intelligence/machine-learning algorithm. The developed model was able to predict the subjective rating with good accuracy.
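The correlation-then-regression step above can be sketched as follows: Pearson coefficients screen which sound-quality metric tracks the jury rating best, and a one-variable least-squares fit then predicts the subjective rating from it. The loudness and rating values below are invented, and the authors' multi-metric AI/ML model is reduced here to a single-predictor line.

```python
# Pearson screening + least-squares fit; the data pairs are invented.
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def fit_line(x, y):
    """Least-squares slope and intercept for y ~ slope*x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Invented data: loudness (sone) of recorded squeaks vs. jury rating
# (1-10, lower = more annoying); other metrics would be screened the same way.
loudness = [2.1, 3.4, 4.0, 5.2, 6.8, 7.5]
rating   = [8.9, 7.6, 7.1, 5.8, 4.0, 3.2]

r = pearson(loudness, rating)
slope, intercept = fit_line(loudness, rating)
predicted = slope * 5.0 + intercept  # predicted jury rating at 5.0 sone
print(f"r = {r:.3f}, rating(5.0 sone) = {predicted:.2f}")
```

A strong negative `r` here would mark loudness as a dominant metric for this noise issue; metrics with weak correlations would be dropped before building the final model.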

Keywords: BSR, noise, correlation, regression

Procedia PDF Downloads 45
1055 Modeling and System Identification of a Variable Excited Linear Direct Drive

Authors: Heiko Weiß, Andreas Meister, Christoph Ament, Nils Dreifke

Abstract:

Linear actuators are deployed in a wide range of applications. This paper presents the modeling and system identification of a variable excited linear direct drive (LDD). The LDD is designed based on linear hybrid stepper technology, exhibiting the characteristic tooth structure of mover and stator. A three-phase topology provides the thrust force, caused by the alternating strengthening and weakening of the flux in the legs. To achieve the best possible synchronous operation, the phases are commutated sinusoidally. Although these LDDs provide high dynamics and drive forces, noise emission limits their operation in quiet workspaces. To overcome this drawback, an additional excitation of the magnetic circuit is introduced to the LDD using enabling coils instead of permanent magnets. This new degree of freedom can be used to reduce force variations and the related noise by varying the excitation flux that is usually generated by permanent magnets. Hence, an identified simulation model is necessary to analyze the effects of this modification. In particular, the force variations must be modeled well in order to reduce them sufficiently. The model can be divided into three parts: the current dynamics, the mechanics and the force functions. These subsystems are described by differential equations or nonlinear analytic functions, respectively. Ordinary nonlinear differential equations are derived and transformed into state-space representation. Experiments were carried out on a test rig to identify the system parameters of the complete model. Static and dynamic simulation-based optimizations are utilized for identification. The results are verified in the time and frequency domains. Finally, the identified model provides a basis for the later design of control strategies to reduce the existing force variations.
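The model structure described above (mechanics driven by a force function) can be sketched as a toy state-space simulation: a mover mass driven by a commanded thrust plus a position-periodic force ripple standing in for the tooth-structure force variations, integrated with explicit Euler. All parameter values are illustrative, and the current dynamics subsystem is collapsed into the commanded thrust.

```python
# Toy state-space sketch of an LDD mover; all parameters are illustrative.
import math

MASS = 2.0           # kg, mover mass
DAMPING = 5.0        # N*s/m, viscous friction
TOOTH_PITCH = 0.004  # m, spatial period of the force ripple
RIPPLE_AMP = 1.5     # N, amplitude of the force variation

def force(x, thrust_cmd):
    """Commanded thrust plus position-dependent tooth-structure ripple."""
    return thrust_cmd + RIPPLE_AMP * math.sin(2 * math.pi * x / TOOTH_PITCH)

def simulate(thrust_cmd, dt=1e-4, steps=5000):
    """Explicit-Euler integration of the state [position, velocity]."""
    x, v = 0.0, 0.0
    for _ in range(steps):
        a = (force(x, thrust_cmd) - DAMPING * v) / MASS
        x += v * dt
        v += a * dt
    return x, v

x_end, v_end = simulate(thrust_cmd=10.0)
print(f"x = {x_end * 1000:.2f} mm, v = {v_end:.3f} m/s")
```

In the paper's setting, the identified excitation-flux degree of freedom would modulate `RIPPLE_AMP`, which is exactly the lever used to reduce force variations and noise.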

Keywords: force variations, linear direct drive, modeling and system identification, variable excitation flux

Procedia PDF Downloads 339
1054 Direct Approach in Modeling Particle Breakage Using Discrete Element Method

Authors: Ebrahim Ghasemi Ardi, Ai Bing Yu, Run Yu Yang

Abstract:

The current study aims to extend an in-house discrete element method (DEM) code by linking it with a direct breakage event, making it possible to determine particle breakage, and the resulting fragment size distribution, simultaneously with the DEM simulation. The method applies particle breakage directly inside the DEM computation algorithm: if any breakage happens, the original particle is replaced with its daughters. In this way, the calculation proceeds on a continuously updated particle list, closely resembling a real grinding environment. To validate the developed model, a grinding ball impacting an unconfined particle bed was simulated. Since considering an entire ball mill would be too computationally demanding, this method provided a simplified environment to test the model. Accordingly, a representative volume of the ball mill was simulated inside a box, which could emulate media (ball)-powder bed impacts in a ball mill and during particle-bed impact tests. Mono, binary and ternary particle beds were simulated to determine the effects of granular composition on breakage kinetics. The results obtained from the DEM simulations showed a reduction in the specific breakage rate for coarse particles in binary mixtures. The origin of this phenomenon, commonly known as cushioning or decelerated breakage in dry milling processes, was explained by the DEM simulations: fine particles in a particle bed increase mechanical energy loss, and reduce and distribute interparticle forces, thereby inhibiting the breakage of the coarse component. On the other hand, the specific breakage rate of fine particles increased due to contacts associated with coarse particles. This phenomenon, known as acceleration, was shown to be less significant, but should be considered in future attempts to accurately quantify nonlinear breakage kinetics in the modeling of dry milling processes.
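The replacement step described above, swap a broken parent for daughter fragments on the live particle list, can be sketched as follows. The energy threshold, the larger-particles-absorb-more-energy rule, and the two-equal-daughters split are illustrative assumptions, not the authors' breakage model.

```python
# Sketch of in-simulation particle replacement; thresholds are illustrative.
import math

DENSITY = 2700.0     # kg/m^3, particle material density
BREAK_ENERGY = 0.05  # J, illustrative breakage threshold

def mass(radius):
    """Mass of a spherical particle of the given radius (m)."""
    return DENSITY * (4.0 / 3.0) * math.pi * radius ** 3

def update_particles(particles, impact_energy):
    """particles: list of radii (m). Replace each broken parent with two equal
    daughters of half the parent volume each, so total mass is conserved."""
    out = []
    for r in particles:
        if impact_energy(r) > BREAK_ENERGY:
            r_daughter = r / 2.0 ** (1.0 / 3.0)  # half volume -> r / cbrt(2)
            out.extend([r_daughter, r_daughter])
        else:
            out.append(r)
    return out

bed = [0.010, 0.004, 0.002]  # coarse, medium and fine particle radii (m)
# Illustrative rule: larger particles absorb more of the contact energy
updated = update_particles(bed, impact_energy=lambda r: 12.0 * r)

total_before = sum(mass(r) for r in bed)
total_after = sum(mass(r) for r in updated)
print(len(bed), "->", len(updated), "mass conserved:",
      math.isclose(total_before, total_after))
```

In a full DEM loop this update runs every time step, so the fragment size distribution emerges from the simulation itself rather than being imposed afterwards, which is the point of the direct approach.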

Keywords: particle bed, breakage models, breakage kinetic, discrete element method

Procedia PDF Downloads 162