Like almost everything in life, this kind of thing can be nice when used properly and only when appropriate; otherwise it quickly becomes a recipe for disaster. Macros are one of those 'implicit' programming concepts, the stuff you just can't notice by simply reading code, and I've been bitten enough times by all kinds of implicit programming that I've started actively avoiding it.
If your macro has some way to let developers notice that it is a macro just by reading the code where it is used (the way you will always notice a Python decorator or a C# attribute), then that's fine, as long as the macro's name gives away what it does.
Macros that effectively redefine the entire language, like a project I saw recently that defined an int32 macro with different bit sizes depending on the platform, or macros with names that no sane human being can read (the norm with everything in C/C++, apparently), are absolute hell.