AI-on-the-edge-device/code/components/jomjol_helper/psram.cpp
CaCO3 b24b7d5ce2 v15.2.1 (#2361)
* Testcase for #2145 and debug-log (#2151)

* new models ana-cont-11.0.5, ana-class100-1.5.7, dig-class100-1.6.0

* Testcase for #2145
Added debug log if allowNegativeRates is handled

* Fix timezone config parser (#2169)

* make sure to parse the whole config line

* fix crash on empty timezone parameter

---------

Co-authored-by: CaCO3 <caco@ruinelli.ch>

* Enhance ROI pages (#2161)

* Check if the ROIs are equidistant. Only if not, untick the checkbox

* renaming

* Check if the ROIs have same y, dy and dx. If so, tick the sync checkbox

* only allow editing space when box is checked

* fix sync check

* show inner frame on all ROIs

* cleanup

* Check if the ROIs have same dy and dx. If so, tick the sync checkbox

* checkbox position

* renaming

* renaming

* show inner frame and cross hairs on all ROIs

* update ROIs on ticking checkboxes

* show timezone hint

* fix deleting last ROI

* cleanup

---------

Co-authored-by: CaCO3 <caco@ruinelli.ch>

* restart timeout on progress, catch error (#2170)

* restart timeout on progress, catch error

* .

---------

Co-authored-by: CaCO3 <caco@ruinelli.ch>

* BugFix #2167

* Release 15.1 preparations (#2171)

* Update Changelog.md

* Update Changelog.md

* Update Changelog.md

* Update changelog

* Fix links to PR

* Formatting

* Update Changelog.md

* Update Changelog.md

* Update Changelog.md

* Update Changelog.md

* Update Changelog.md

* Update Changelog.md

* Update Changelog.md

* Update Changelog.md

* Update Changelog.md

* Update Changelog.md

* Update Changelog.md

* Update Changelog.md

---------

Co-authored-by: Slider0007 <jobbelle@gmx.net>
Co-authored-by: Slider0007 <115730895+Slider0007@users.noreply.github.com>

* fix typo

* Replace relative documentation links with absolute ones pointing to the external documentation (#2180)

Co-authored-by: CaCO3 <caco@ruinelli.ch>

* Sort model files in configuration combobox (#2189)

* new models ana-cont-11.0.5, ana-class100-1.5.7, dig-class100-1.6.0

* Testcase for #2145
Added debug log if allowNegativeRates is handled

* Sort model files in combobox

* reboot task - increase stack size (#2201)

Avoid stack overflow

* Update interface_influxdb.cpp

* Update Changelog.md

* Show PSRAM usage (#2206)

* centralize PSRAM usage (application code only)

* update logging

* update logging

* fix use after free

* initialize buffer

* free rgb_image before using it for new allocation

* use wrapper function

* switch log level to debug

* .

* undo adding free() calls

* .

* add names to all CImage instances

* .

* .

* .

* revert the replacement of stbi_image_free() with free_psram_heap() in the places where it is not in PSRAM

* .

* typos

* typo

* Added MQTT Outbox explanation/warning

* added CONFIG_SPIRAM_USE_MEMMAP explanation

---------

Co-authored-by: CaCO3 <caco@ruinelli.ch>

* Disable custom MQTT Outbox. This also moves the MQTT Publishing memory usage back to the internal RAM!

* log MQTT connection refused reasons (#2216)

* Revert PSRAM usage as it led to memory fragmentation.
See https://github.com/jomjol/AI-on-the-edge-device/issues/2200 for details

* fix missing value data

* Revert PSRAM usage as it led to memory fragmentation. (#2224)

See https://github.com/jomjol/AI-on-the-edge-device/issues/2200 for details

Co-authored-by: CaCO3 <caco@ruinelli.ch>

* Fix missing value data in graph (#2230)

* fix missing value data

---------

Co-authored-by: CaCO3 <caco@ruinelli.ch>

* Update Changelog.md (#2231)

* Update interface_influxdb.cpp (#2233)

* update copyright year

* Cleanup

* Set prevalue using MQTT + set prevalue to RAW value (REST+MQTT) (#2252)

* Use double instead of float

* Error handling + set to RAW if newvalue < 0

* REST SetPrevalue: Set to RAW if newvalue < 0

* set prevalue with MQTT

* removed the stb_image files and re-added them as a submodule. (#2223)

- stb_image.h: Version update 2.25 -> 2.28
- stb_resize.h: Version update 0.96 -> 0.97
- stb_write.h: Version update 1.14 -> 1.16

Co-authored-by: CaCO3 <caco@ruinelli.ch>

* Remove obsolete ClassFlowWriteList (#2264)

* Renaming & cleanup of some modules / functions in source code (#2265)

* Rename module tag name

* Rename server_tflite.cpp -> MainFlowControl.cpp

* Remove redundant MQTTMainTopic function

* Update

* Remove obsolete GetMQTTMainTopic

* Fix last element missing in digit model drop down (#2282)

* Debug influxdb (#2283)

* Fix time offset issues in InfluxDB component. (#2278)

Closes #2273
Closes #2150

* Update interface_influxdb.cpp

* Update interface_influxdb.cpp

* Improve Logging

* Implement TimeSync at beginning

* Update time_sntp.cpp

* Update time_sntp.cpp

* Set Time After WLAN Init

---------

Co-authored-by: Antonin Delpeuch <antonin@delpeuch.eu>

* Implement a camera livestream handler (#2286)

* fix leading NaN (#2310)

* analogROI: Activate save button after ROI creation (#2326)

* Migration of PlatformIO 5.2.0 to 6.1.0 (resp. ESP IDF from 4.4.2 to 5.0.1) (#2305)

* Migration to PlatformIO 6.1.0

* Disable RMTMEM usage as it is no longer allowed -> Smart LEDs not functional!

* moved miniz into a subfolder of jomjol_fileserver_ota, otherwise it no longer builds.

* cleanup

* fix leading NaN (#2310)

* Migration to PlatformIO 6.1.0

* Disable RMTMEM usage as it is no longer allowed -> Smart LEDs not functional!

* moved miniz into a subfolder of jomjol_fileserver_ota, otherwise it no longer builds.

* cleanup

* Task watchdog has new config name

* Fix return value check. It must be something other than ESP_FAIL, but it does not need to be ESP_OK!

* add missing structures to work around new RMTMEM restriction (untested)

---------

Co-authored-by: CaCO3 <caco@ruinelli.ch>

* Keep MainFlowTask alive to handle reboot (#2325)

* Shared PSRAM memory (#2285)

* enable PSRAM logging

* add extra functions for psram shared memory handling

* CImageBasis objects should still use dynamic memory (e.g. rawImage), however tmpImage must be placed inside the shared memory

* Place all STBI allocs inside the shared memory

* The models are placed in the shared PSRAM region and must be allocated through the dedicated functions

* .

* renaming

* fix cast warning

* add flag to switch STBI PSRAM usage

* improve PSRAM shared handling

* reserve shared PSRAM as early as possible

* init logging earlier so we can use it in PSRAM shared alloc

* move Wifi_LWIP, BSS_SEG and MQTT Outbox into PSRAM to free internal memory

* Check if model fits into reserved shared memory

* Update code/components/jomjol_tfliteclass/CTfLiteClass.cpp

* Update code/components/jomjol_helper/psram.cpp

* Update code/components/jomjol_helper/psram.cpp

* Update code/components/jomjol_flowcontroll/ClassFlowControll.cpp

* Update code/components/jomjol_helper/psram.cpp

* Update code/components/jomjol_helper/psram.cpp

* Update code/components/jomjol_helper/psram.cpp

* Update code/components/jomjol_image_proc/CImageBasis.cpp

* Update code/components/jomjol_helper/psram.cpp

* Update code/components/jomjol_helper/psram.cpp

* Update code/components/jomjol_helper/psram.cpp

* Update code/components/jomjol_helper/psram.cpp

* Update code/components/jomjol_helper/psram.cpp

* Update code/components/jomjol_helper/psram.cpp

* Update code/components/jomjol_helper/psram.cpp

* .

* .

* .

* .

* Fix merge conflict in main.cpp

---------

Co-authored-by: CaCO3 <caco@ruinelli.ch>
Co-authored-by: jomjol <30766535+jomjol@users.noreply.github.com>

* fix PSRAM init return value check

* Extend InfluxDBv1 with individual topic names (#2319)

* Implement individual influx topic

* Update interface_influxdb.cpp

* Update interface_influxdb.cpp

* Update FieldName

* Extend incl. indiv. Measurement

* Implement UX

* Update ClassFlowInfluxDBv2.cpp

* Update main.cpp

---------

Co-authored-by: Slider0007 <115730895+Slider0007@users.noreply.github.com>
Co-authored-by: CaCO3 <caco3@ruinelli.ch>
Co-authored-by: CaCO3 <caco@ruinelli.ch>

* Update Changelog.md (#2344)

* Update Changelog.md

* Update Changelog.md

* merge conflicts

* merge conflicts

* update changelog

* Update Changelog.md

* Fix alignment mark PSRAM issue (#2357)

* Add lock for shared PSRAM memory, use it for the relevant round steps and the Alignment Mark Task

* only allow taking a new image for the Alignment Mark once the round has completed

* show success/denial of Alignment Mark Update Request

* .

---------

Co-authored-by: CaCO3 <caco@ruinelli.ch>

* Update Changelog.md (#2360)

---------

Co-authored-by: Frank Haverland <fspapaping@googlemail.com>
Co-authored-by: CaCO3 <caco@ruinelli.ch>
Co-authored-by: jomjol <30766535+jomjol@users.noreply.github.com>
Co-authored-by: Slider0007 <jobbelle@gmx.net>
Co-authored-by: Slider0007 <115730895+Slider0007@users.noreply.github.com>
Co-authored-by: Antonin Delpeuch <antonin@delpeuch.eu>
2023-04-27 22:41:08 +02:00

212 lines
7.9 KiB
C++

#include "ClassLogFile.h"
#include "../../include/defines.h"
#include "psram.h"
static const char* TAG = "PSRAM";
using namespace std;
void *shared_region = NULL;
uint32_t allocatedBytesForSTBI = 0;
std::string sharedMemoryInUseFor = "";
/** Reserve a large block in the PSRAM which will be shared between the different steps.
 * Each step uses it differently, but only within itself. */
bool reserve_psram_shared_region(void) {
    LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Allocating shared PSRAM region (" +
            std::to_string(TENSOR_ARENA_SIZE + MAX_MODEL_SIZE) + " bytes)...");

    shared_region = malloc_psram_heap("Shared PSRAM region", TENSOR_ARENA_SIZE + MAX_MODEL_SIZE,
            MALLOC_CAP_SPIRAM | MALLOC_CAP_8BIT);

    if (shared_region == NULL) {
        LogFile.WriteToFile(ESP_LOG_ERROR, TAG, "Failed to allocate shared PSRAM region!");
        return false;
    }
    else {
        return true;
    }
}

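/* A minimal start-up sketch (illustrative only; the real call site is in the main
 * start-up code, not in this file, and app_setup() is a hypothetical name): the shared
 * region is reserved once, as early as possible after logging is available, and every
 * later flow step then borrows slices from it through the functions below.
 *
 *   void app_setup(void) {
 *       // ... init NVS, SD card and logging ...
 *       if (!reserve_psram_shared_region()) {
 *           // Without the shared region neither the image alignment nor the
 *           // TFLite models can run, so treat this as fatal.
 *           esp_restart();
 *       }
 *       // ... continue with WLAN, camera and flow initialization ...
 *   }
 */
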
/*******************************************************************
* Memory used in Take Image (STBI)
*******************************************************************/
bool psram_init_shared_memory_for_take_image_step(void) {
    if (sharedMemoryInUseFor != "") {
        LogFile.WriteToFile(ESP_LOG_ERROR, TAG, "Shared memory in PSRAM already in use for " + sharedMemoryInUseFor + "!");
        return false;
    }

    LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Init shared memory for step 'Take Image' (STBI buffers)");
    allocatedBytesForSTBI = 0;
    sharedMemoryInUseFor = "TakeImage";
    return true;
}

void psram_deinit_shared_memory_for_take_image_step(void) {
    LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Deinit shared memory for step 'Take Image' (STBI buffers)");
    allocatedBytesForSTBI = 0;
    sharedMemoryInUseFor = "";
}

void *psram_reserve_shared_stbi_memory(size_t size) {
    /* Only large buffers should be placed in the shared PSRAM.
     * If we also place all smaller STBI buffers here, we get artifacts for some reason. */
    if (size >= 100000) {
        if ((allocatedBytesForSTBI + size) > TENSOR_ARENA_SIZE + MAX_MODEL_SIZE) { // Check if it still fits in the shared region
            LogFile.WriteToFile(ESP_LOG_ERROR, TAG, "Shared memory in PSRAM too small (STBI) to fit additional " +
                    std::to_string(size) + " bytes! Available: " + std::to_string(TENSOR_ARENA_SIZE + MAX_MODEL_SIZE - allocatedBytesForSTBI) + " bytes!");
            return NULL;
        }

        LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Allocating memory (" + std::to_string(size) + " bytes) for STBI (use shared memory in PSRAM)...");
        allocatedBytesForSTBI += size;
        return (uint8_t *)shared_region + allocatedBytesForSTBI - size; // Bump allocation: return the start of the newly reserved slice
    }
    else { // Normal PSRAM
        return malloc_psram_heap("STBI", size, MALLOC_CAP_SPIRAM);
    }
}

void *psram_reallocate_shared_stbi_memory(void *ptr, size_t newsize) {
    char buf[20];
    sprintf(buf, "%p", ptr);
    LogFile.WriteToFile(ESP_LOG_ERROR, TAG, "STBI requested realloc for " + std::string(buf) + " but this is currently unsupported!");
    return NULL;
}

void psram_free_shared_stbi_memory(void *p) {
    if ((p >= shared_region) && (p <= ((uint8_t *)shared_region + allocatedBytesForSTBI))) { // Was allocated inside the shared memory
        LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Part of shared memory used for STBI (PSRAM, part of shared memory) is free again");
    }
    else { // Normal PSRAM
        free_psram_heap("STBI", p);
    }
}

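/* Sketch of how these hooks could be wired into stb_image (an assumption for
 * illustration; the actual wiring in this project may differ). stb_image's documented
 * override macros STBI_MALLOC / STBI_REALLOC / STBI_FREE must all be defined before
 * stb_image.h is included in the translation unit that defines STB_IMAGE_IMPLEMENTATION:
 *
 *   #define STBI_MALLOC(sz)         psram_reserve_shared_stbi_memory(sz)
 *   #define STBI_REALLOC(p, newsz)  psram_reallocate_shared_stbi_memory(p, newsz)
 *   #define STBI_FREE(p)            psram_free_shared_stbi_memory(p)
 *   #define STB_IMAGE_IMPLEMENTATION
 *   #include "stb_image.h"
 *
 * Note that realloc is not supported by the bump allocator above;
 * psram_reallocate_shared_stbi_memory only logs an error and returns NULL.
 */
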
/*******************************************************************
* Memory used in Aligning Step
* During this step we only use the shared part of the PSRAM
* for the tmpImage.
*******************************************************************/
void *psram_reserve_shared_tmp_image_memory(void) {
    if (sharedMemoryInUseFor != "") {
        LogFile.WriteToFile(ESP_LOG_ERROR, TAG, "Shared memory in PSRAM already in use for " + sharedMemoryInUseFor + "!");
        return NULL;
    }

    LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Allocating tmpImage (" + std::to_string(IMAGE_SIZE) + " bytes, use shared memory in PSRAM)...");
    sharedMemoryInUseFor = "Aligning";
    return shared_region; // Use the first part of the shared memory for the tmpImage (only user)
}

void psram_free_shared_temp_image_memory(void) {
    LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Shared memory used for tmpImage (PSRAM, part of shared memory) is free again");
    sharedMemoryInUseFor = "";
}

/*******************************************************************
* Memory used in Digitalization Steps
* During this step we only use the shared part of the PSRAM for the
* Tensor Arena and one of the Models.
* The shared memory is large enough for the largest model and the
* Tensor Arena. Therefore we do not need to monitor the usage.
*******************************************************************/
void *psram_get_shared_tensor_arena_memory(void) {
    if ((sharedMemoryInUseFor == "") || (sharedMemoryInUseFor == "Digitalization_Model")) {
        sharedMemoryInUseFor = "Digitalization_Tensor";
        LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Allocating Tensor Arena (" + std::to_string(TENSOR_ARENA_SIZE) + " bytes, use shared memory in PSRAM)...");
        return shared_region; // Use the first part of the shared memory for the Tensor Arena
    }
    else {
        LogFile.WriteToFile(ESP_LOG_ERROR, TAG, "Shared memory in PSRAM already in use for " + sharedMemoryInUseFor + "!");
        return NULL;
    }
}

void *psram_get_shared_model_memory(void) {
    if ((sharedMemoryInUseFor == "") || (sharedMemoryInUseFor == "Digitalization_Tensor")) {
        sharedMemoryInUseFor = "Digitalization_Model";
        LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Allocating Model memory (" + std::to_string(MAX_MODEL_SIZE) + " bytes, use shared memory in PSRAM)...");
        return (uint8_t *)shared_region + TENSOR_ARENA_SIZE; // Use the 2nd part of the shared memory (after the Tensor Arena) for the model
    }
    else {
        LogFile.WriteToFile(ESP_LOG_ERROR, TAG, "Shared memory in PSRAM already in use for " + sharedMemoryInUseFor + "!");
        return NULL;
    }
}

void psram_free_shared_tensor_arena_and_model_memory(void) {
    sharedMemoryInUseFor = "";
    LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Shared memory used for Tensor Arena and model (PSRAM, part of shared memory) is free again");
}

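/* Illustrative call sequence for the digitalization step (caller names and the file
 * loading are assumptions; only the psram_* functions below exist in this file):
 *
 *   uint8_t *modelBuffer = (uint8_t *) psram_get_shared_model_memory();
 *   // ... load the .tfl model file into modelBuffer (size must not exceed MAX_MODEL_SIZE) ...
 *   uint8_t *tensorArena = (uint8_t *) psram_get_shared_tensor_arena_memory();
 *   // ... build the TFLite interpreter on modelBuffer/tensorArena and run inference ...
 *   psram_free_shared_tensor_arena_and_model_memory();
 *
 * Because sharedMemoryInUseFor accepts the paired states "Digitalization_Model" and
 * "Digitalization_Tensor", the two getters may be called in either order within one step.
 */
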
/*******************************************************************
* General
*******************************************************************/
void *malloc_psram_heap(std::string name, size_t size, uint32_t caps) {
    void *ptr;

    ptr = heap_caps_malloc(size, caps);
    if (ptr != NULL) {
        LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Allocated " + to_string(size) + " bytes in PSRAM for '" + name + "'");
    }
    else {
        LogFile.WriteToFile(ESP_LOG_ERROR, TAG, "Failed to allocate " + to_string(size) + " bytes in PSRAM for '" + name + "'!");
    }

    return ptr;
}

void *realloc_psram_heap(std::string name, void *ptr, size_t size, uint32_t caps) {
    ptr = heap_caps_realloc(ptr, size, caps);
    if (ptr != NULL) {
        LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Reallocated " + to_string(size) + " bytes in PSRAM for '" + name + "'");
    }
    else {
        LogFile.WriteToFile(ESP_LOG_ERROR, TAG, "Failed to reallocate " + to_string(size) + " bytes in PSRAM for '" + name + "'!");
    }

    return ptr;
}

void *calloc_psram_heap(std::string name, size_t n, size_t size, uint32_t caps) {
    void *ptr;

    ptr = heap_caps_calloc(n, size, caps);
    if (ptr != NULL) {
        LogFile.WriteToFile(ESP_LOG_INFO, TAG, "Allocated " + to_string(n * size) + " bytes in PSRAM for '" + name + "'");
    }
    else {
        LogFile.WriteToFile(ESP_LOG_ERROR, TAG, "Failed to allocate " + to_string(n * size) + " bytes in PSRAM for '" + name + "'!");
    }

    return ptr;
}

void free_psram_heap(std::string name, void *ptr) {
    LogFile.WriteToFile(ESP_LOG_DEBUG, TAG, "Freeing memory in PSRAM used for '" + name + "'...");
    heap_caps_free(ptr);
}
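
/* The wrappers above only add logging around the ESP-IDF heap_caps_* API. A typical
 * caller (buffer name and size are illustrative) requests byte-addressable PSRAM and
 * releases it with the matching wrapper:
 *
 *   uint8_t *scratch = (uint8_t *) malloc_psram_heap("Scratch buffer", 320 * 240,
 *                                                    MALLOC_CAP_SPIRAM | MALLOC_CAP_8BIT);
 *   if (scratch != NULL) {
 *       // ... use the buffer ...
 *       free_psram_heap("Scratch buffer", scratch);
 *   }
 */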